Sharma, Aditya; Heijenberg, Nico; Peter, Clement; Bolongei, Josephus; Reeder, Bruce; Alpha, Tamba; Sterk, Esther; Robert, Hugues; Kurth, Andreas; Cannas, Angela; Bocquin, Anne; Strecker, Thomas; Logue, Christopher; Di Caro, Antonino; Pottage, Thomas; Yue, Constanze; Stoecker, Kilian; Wölfel, Roman; Gabriel, Martin; Günther, Stephan; Damon, Inger
2014-11-21
Lofa County has one of the highest cumulative incidences of Ebola virus disease (Ebola) in Liberia. Recent situation reports from the Liberian Ministry of Health and Social Welfare (MoHSW) have indicated a decrease in new cases of Ebola in Lofa County. In October 2014, the Liberian MoHSW requested the assistance of CDC to further characterize recent trends in Ebola in Lofa County. Data collected during June 8-November 1, 2014 from three sources were analyzed: 1) aggregate data for newly reported cases, 2) case-based data for persons admitted to the dedicated Ebola treatment unit (ETU) for the county, and 3) test results for community decedents evaluated for Ebola. Trends from all three sources suggest that transmission of Ebola virus decreased as early as August 17, 2014, following rapid scale-up of response activities in Lofa County after a resurgence of Ebola in early June 2014. The comprehensive response strategy developed with participation from the local population in Lofa County might serve as a model to implement in other affected areas to accelerate control of Ebola.
LOFA analysis in helium and Pb-Li circuits of LLCB TBM by FE simulation
NASA Astrophysics Data System (ADS)
Chaudhuri, Paritosh; Ranjithkumar, S.; Sharma, Deepak; Danani, Chandan
2017-04-01
One of the main ITER objectives is to demonstrate the feasibility of breeding blanket concepts that would lead to tritium self-sufficiency and the extraction of high-grade heat for electricity production. India has developed the LLCB TBM to be tested in ITER for the validation of design concepts for tritium breeding blankets relevant to DEMO and future power reactors. The LLCB concept has the unique feature of combining a solid breeder (lithium titanate as a packed pebble bed) with a liquid breeder (molten lead-lithium). India-specific IN-RAFMS is the structural material for the TBM. The First Wall is actively cooled by high-pressure helium (He) gas [1]. It is important to validate the design of the TBM to withstand the various loads acting on it, including accident scenarios such as LOCA and LOFA. Detailed thermal-hydraulic simulation studies, including LOFA in the helium and Pb-Li circuits of the LLCB TBM, have been performed by finite element analysis using ANSYS. These analyses provide important information about the temperature distribution in the different materials used in the TBM during steady-state and transient conditions. A thermal-hydraulic safety requirement has also been envisaged for the initiation of the FPPS (Fusion Power Shutdown System) during a LOFA. All these analyses are presented in detail in this paper.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, Theron D.; McDonald, Jimmie M.; Cadwallader, Lee C.
2000-01-15
This paper discusses the thermal response of two prototypical International Thermonuclear Experimental Reactor (ITER) divertor channels during simulated loss-of-flow-accident (LOFA) experiments. The thermal response was characterized by the time-to-burnout (TBO), which is a figure of merit on the mockups' survivability. Data from the LOFA experiments illustrate that (a) the pre-LOFA inlet velocity does not significantly influence the TBO, (b) the incident heat flux (IHF) does influence the TBO, and (c) a swirl tape insert significantly improves the TBO and promotes the initiation of natural circulation. This natural circulation enabled the mockup to absorb steady-state IHFs after the coolant circulation pump was disabled. Several methodologies for thermal-hydraulic modeling of the LOFA were attempted.
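The reported dependence of time-to-burnout on incident heat flux can be illustrated with a minimal lumped-capacitance sketch. All geometry and material numbers below are hypothetical, chosen only to show that an adiabatic heat-up estimate of TBO scales inversely with IHF once forced cooling is lost; this is not the authors' model.

```python
# Illustrative lumped-capacitance estimate of time-to-burnout (TBO) after a
# LOFA. Mass, heated area, and allowable temperature rise are assumed values
# for a small copper mockup, not data from the experiments.

def time_to_burnout(ihf_w_m2: float,
                    mass_kg: float = 0.5,            # mockup mass (assumed)
                    cp_j_kgk: float = 385.0,         # specific heat of copper
                    heated_area_m2: float = 6.4e-4,  # heated surface (assumed)
                    delta_t_k: float = 900.0) -> float:
    """Seconds for an adiabatic lump to rise delta_t_k under a fixed IHF."""
    absorbed_power_w = ihf_w_m2 * heated_area_m2
    return mass_kg * cp_j_kgk * delta_t_k / absorbed_power_w

# Doubling the incident heat flux halves the estimated TBO:
t1 = time_to_burnout(5.0e6)   # 5 MW/m^2
t2 = time_to_burnout(1.0e7)   # 10 MW/m^2
print(round(t1 / t2, 2))  # → 2.0
```

In this simplified energy balance TBO is exactly inversely proportional to IHF; the experiments' more complex trends (e.g., the swirl-tape effect) require the full thermal-hydraulic models the abstract mentions.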
Bypass flow computations on the LOFA transient in a VHTR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tung, Yu-Hsin; Johnson, Richard W.; Ferng, Yuh-Ming
2014-01-01
Bypass flow in the prismatic gas-cooled very high temperature reactor (VHTR) is not intentionally designed to occur, but is present in the gaps between graphite blocks. Previous studies of bypass flow in the core indicated that the cooling provided by flow in the bypass gaps had a significant effect on temperature and flow distributions under normal operating conditions. However, flow and heat transport in the core change significantly after a Loss of Flow Accident (LOFA). This study examines the effect and role of the bypass flow after a LOFA, in terms of the temperature and flow distributions and the heat transport out of the core by natural convection of the coolant, for a 1/12 symmetric section of the active core composed of images and mirror images of two sub-region models. The two sub-region models, 9 x 1/12 and 15 x 1/12 symmetric sectors of the active core, are employed as the CFD flow models using computational grid systems of 70.2 million and 117 million nodes, respectively. It is concluded that the effect of bypass flow is significant for the initial conditions and the beginning of the LOFA, but the bypass flow has little effect after a long period of time in the transient computation of natural circulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamm, L.L.
1998-10-07
This report is one of a series of reports that document normal operation and accident simulations for the Accelerator Production of Tritium (APT) blanket heat removal system. These simulations were performed for the Preliminary Safety Analysis Report. This report documents the results of simulations of a Loss-of-Flow Accident (LOFA) where power is lost to all of the pumps that circulate water in the blanket region, the accelerator beam is shut off and neither the residual heat removal nor cavity flood systems operate.
Kouadio, Koffi Isidore; Clement, Peter; Bolongei, Josephus; Tamba, Alpha; Gasasira, Alex Ntale; Warsame, Abdihamid; Okeibunor, Joseph Chukwudi; Ota, Martin Okechukwu; Tamba, Boima; Gumede, Nicksy; Shaba, Keith; Poy, Alain; Salla, Mbaye; Mihigo, Richard; Nshimirimana, Deo
2015-05-06
An Ebola Virus Disease (EVD) outbreak was confirmed in Liberia on March 31st 2014. A response comprising diverse expertise was mobilized and deployed to the country to contain transmission of Ebola and give relief to a people already impoverished by protracted civil war. This paper describes the epidemiological and surveillance response to the EVD outbreak in Lofa County in Liberia from March to September 2014. Five of the 6 districts of Lofa were affected. The most affected districts were Voinjama/Guardu Gbondi and Foya. By 26th September 2014, a total of 619 cases, including 19.4% probable cases, 20.3% suspected cases and 44.2% confirmed cases, were recorded by the Ebola Emergency Response Team (EERT) of Lofa County. Adults (20-50 years) were the most affected. Overall fatality rate was 53.3%. Twenty-two (22) cases were reported among Health Care Workers, with a fatality rate of 81.8%. Seventy-eight percent (78%) of the contacts successfully completed 21 days of follow-up, while the 134 (6.15%) who developed signs and symptoms of EVD were referred to the ETU in Foya. The contributions of the weak health system as well as socio-cultural factors in fueling the epidemic are highlighted. Importantly, the lessons learnt include the positive impact of a multi-sectoral, multidisciplinary and coordinated response led by the government and community. Given that the spread of infectious disease can be considered a security threat, every effort has to be put in place to strengthen health systems in developing countries, including the International Health Regulations (IHR) core capacities. Key words: Ebola virus disease, outbreak, epidemiology and surveillance, socio-cultural factors, health system, West Africa.
Modeling Transients and Designing a Passive Safety System for a Nuclear Thermal Rocket Using Relap5
NASA Astrophysics Data System (ADS)
Khatry, Jivan
Long-term high-payload missions necessitate the need for nuclear space propulsion. Several nuclear reactor types were investigated by the Nuclear Engine for Rocket Vehicle Application (NERVA) program of the National Aeronautics and Space Administration (NASA). Study of planned and unplanned transients on nuclear thermal rockets is important due to the need for long-term missions. A NERVA design known as the Pewee I was selected for this purpose. The following transients were run: (i) modeling of corrosion-induced blockages on the peripheral fuel element coolant channels and their impact on radiation heat transfer in the core, and (ii) modeling of loss-of-flow accidents (LOFAs) and their impact on radiation heat transfer in the core. For part (i), the radiation heat transfer rate of blocked channels increases while that of their neighbors decreases. For part (ii), the core radiation heat transfer rate increases as the flow rate through the rocket system decreases; during a complete LOFA, however, the radiation heat transfer decreases. In this situation, the peripheral fuel element coolant channels handle the majority of the radiation heat transfer. Recognizing the LOFA as the most severe design basis accident, a passive safety system was designed to respond to such a transient. This design utilizes the already existing tie rod tubes and connects them to a radiator in a closed loop, effectively a secondary coolant loop. The size of the core is unchanged. During normal steady-state operation, this secondary loop keeps the moderator cool. Results show that the safety system is able to remove the decay heat and prevent the fuel elements from melting in response to a LOFA and subsequent SCRAM.
Successful Proof of Concept of Family Planning and Immunization Integration in Liberia
Cooper, Chelsea M; Fields, Rebecca; Mazzeo, Corinne I; Taylor, Nyapu; Pfitzer, Anne; Momolu, Mary; Jabbeh-Howe, Cuallau
2015-01-01
Globally, unmet need for postpartum family planning remains high, while immunization services are among the most wide-reaching and equitable interventions. Given overlapping time frames, integrating these services provides an opportunity to leverage existing health visits to offer women more comprehensive services. From March through November 2012, Liberia's government, with support from the Maternal and Child Health Integrated Program (MCHIP), piloted an integrated family planning and immunization model at 10 health facilities in Bong and Lofa counties. Vaccinators provided mothers bringing infants for routine immunization with targeted family planning and immunization messages and same-day referrals to co-located family planning services. In February 2013, we compared service statistics for family planning and immunization during the pilot against the previous year's statistics. We also conducted in-depth interviews with service providers and other personnel and focus group discussions with clients. Results showed that referral acceptance across the facilities varied from 10% to 45% per month, on average. Over 80% of referral acceptors completed the family planning visit that day, of whom over 90% accepted a contraceptive method that day. The total number of new contraceptive users at participating facilities increased by 73% in Bong and by 90% in Lofa. Women referred from immunization who accepted family planning that day accounted for 44% and 34% of total new contraceptive users in Bong and Lofa, respectively. In Lofa, pilot sites administered 35% more Penta 1 and 21% more Penta 3 doses during the pilot period compared with the same period of the previous year, while Penta 1 and Penta 3 administration decreased in non-pilot facilities. In Bong, there was little difference in the number of Penta 1 and Penta 3 doses administered between pilot and non-pilot facilities.
In both counties, Penta 1 to Penta 3 dropout rates increased at pilot sites but not in non-pilot facilities, possibly due to higher than average background dropout rates at pilot sites prior to the intervention in Lofa and the disproportionate effect of data from 1 large facility in Bong. The project provided considerable basic support to assess this proof of concept. However, results suggest that introducing a simple model that is minimally disruptive to existing immunization service delivery can facilitate integration. The model is currently being scaled-up to other counties in Liberia, which could potentially contribute to increased postpartum contraceptive uptake, leading to longer birth intervals and improved health outcomes for children and mothers. PMID:25745121
Capps, Jean Meyer; Njiru, Haron; deVries, Pieter
2017-01-01
The Ebola virus disease (EVD) epidemic entered Liberia through Lofa County in February 2014 and spread to two health districts where the nongovernmental organization Global Communities had been implementing community-led total sanitation (CLTS) since 2012. By December 2014 the county had 928 Ebola cases (422 of them confirmed) and 648 deaths. Before the epidemic, CLTS was triggered in 155 communities, and 98 communities were certified as Open Defecation Free (ODF). Using mixed quantitative and qualitative methods, we determined that no cases of EVD were found in ODF communities, and EVD occurred in only one CLTS community that had not reached ODF status. No differences were found between EVD and non-EVD communities in tribe, religion, ethnic group, or major sources of Ebola information. Radio was the most common source of information for all communities, but health workers were the most trusted information sources. CLTS ODF communities attributed their avoidance of EVD to water, sanitation, and hygiene behaviors, especially hand washing with soap and disposal of feces, that they learned from CLTS prior to the epidemic. Communities that experienced EVD attributed their strong initial resistance to Ebola response messages to their distrust that Ebola was real and their reliance on friends and family for advice. A strong inverse correlation between EVD and CLTS with or without ODF emerged from the regression analysis (R = -0.6).
Mortality, nutrition and health in Lofa County Liberia five years post-conflict.
Doocy, Shannon; Lewy, Daniela; Guenther, Tanya; Larrance, Ryan
2010-01-01
Liberia remains in transition from a state of humanitarian emergency to development, and Lofa County was the epicentre of recent conflict. This study aimed to estimate mortality and malnutrition and evaluate access to health services, water and sanitation. The survey was conducted in April 2009 and employed a 46 cluster×20 design (n=920 households) with probability proportional to size sampling. The crude mortality rate was 24.3/1000/year (CI: 19.0 to 29.6) or 0.67/10,000/day (CI: 0.52 to 0.81). The global acute malnutrition rate was 7.9% (CI: 5.4 to 8.9), and the severe acute malnutrition rate was 4.5% (CI: 2.9 to 6.7). Access to basic health services was relatively good according to a variety of indicators; however, access to sanitation was low, with 39.5% of households reporting access to toilets or latrines. Despite high rates of displacement and infrastructure destruction, population health appears to be relatively stable 5 years post-conflict, though a continued focus on reconstruction and development is needed.
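The two crude mortality rate figures reported above are the same quantity in different units, and the conversion is simple arithmetic worth making explicit:

```python
# Unit-conversion check for the survey's crude mortality rate (CMR).
# The abstract reports 24.3 deaths/1,000 persons/year and, equivalently,
# 0.67 deaths/10,000 persons/day.

def per_year_to_per_day(cmr_per_1000_year: float) -> float:
    """Convert deaths/1,000 persons/year to deaths/10,000 persons/day."""
    per_person_per_day = cmr_per_1000_year / 1000.0 / 365.0
    return per_person_per_day * 10000.0

daily = per_year_to_per_day(24.3)
print(f"{daily:.2f} deaths/10,000/day")  # → 0.67
```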
Bogus, Joshua; Gankpala, Lincoln; Fischer, Kerstin; Krentel, Alison; Weil, Gary J.; Fischer, Peter U.; Kollie, Karsor; Bolay, Fatorma K.
2016-01-01
The recent outbreak of Ebola virus disease (EVD) interrupted mass drug administration (MDA) programs to control and eliminate neglected tropical diseases in Liberia. MDA programs treat entire communities with medication regardless of infection status to interrupt transmission and eliminate lymphatic filariasis and onchocerciasis. Following reports of hostilities toward health workers and fear that they might be spreading EVD, it was important to determine whether attitudes toward MDA might have changed after the outbreak. We surveyed 140 community leaders from 32 villages in Lofa County, Liberia, that had previously participated in MDA and are located in an area that was an early epicenter of the EVD outbreak. Survey respondents reported a high degree of community trust in the MDA program, and 97% thought their communities were ready to resume MDA. However, respondents predicted that fewer people would comply with MDA after the EVD epidemic than before. The survey also uncovered fears in the community that EVD and MDA might be linked. Respondents suggested that MDA programs emphasize to people that the medications are identical to those previously distributed and that MDA programs have nothing to do with EVD. PMID:26666700
Funk, Sebastian; Ciglenecki, Iza; Tiffany, Amanda; Gignoux, Etienne; Camacho, Anton; Eggo, Rosalind M; Kucharski, Adam J; Edmunds, W John; Bolongei, Josephus; Azuma, Phillip; Clement, Peter; Alpha, Tamba S; Sterk, Esther; Telfer, Barbara; Engel, Gregory; Parker, Lucy Anne; Suzuki, Motoi; Heijenberg, Nico; Reeder, Bruce
2017-05-26
The Ebola epidemic in West Africa was stopped by an enormous concerted effort of local communities and national and international organizations. It is not clear, however, how much the public health response and behavioural changes in affected communities, respectively, contributed to ending the outbreak. Here, we analyse the epidemic in Lofa County, Liberia, lasting from March to November 2014, by reporting a comprehensive time line of events and estimating the time-varying transmission intensity using a mathematical model of Ebola transmission. Model fits to the epidemic show an alternation of peaks and troughs in transmission, consistent with highly heterogeneous spread. This is combined with an overall decline in the reproduction number of Ebola transmission from early August, coinciding with an expansion of the local Ebola treatment centre. We estimate that healthcare seeking approximately doubled over the course of the outbreak, and that isolation of those seeking healthcare reduced their reproduction number by 62% (mean estimate, 95% credible interval (CI) 59-66). Both expansion of bed availability and improved healthcare seeking contributed to ending the epidemic, highlighting the importance of community engagement alongside clinical intervention.This article is part of the themed issue 'The 2013-2016 West African Ebola epidemic: data, decision-making and disease control'. © 2017 The Authors.
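The abstract's two headline estimates, a 62% reduction in the reproduction number for isolated cases and a roughly doubled healthcare-seeking fraction, can be combined in a back-of-envelope mixture calculation. The baseline R and the healthcare-seeking fractions below are hypothetical illustration values, not estimates from the paper.

```python
# Sketch of the population-average reproduction number when a fraction of
# cases seek care and are isolated, using the 62% per-case reduction from
# the abstract. r0 and seek_fraction values are assumed for illustration.

def effective_r(r0: float, seek_fraction: float,
                isolation_reduction: float = 0.62) -> float:
    """Average R: non-seekers transmit at r0, isolated seekers at reduced R."""
    return (r0 * (1 - seek_fraction)
            + r0 * seek_fraction * (1 - isolation_reduction))

r_early = effective_r(r0=2.0, seek_fraction=0.3)  # early outbreak (assumed)
r_late = effective_r(r0=2.0, seek_fraction=0.6)   # seeking roughly doubled
print(round(r_early, 3), round(r_late, 3))  # → 1.628 1.256
```

The sketch shows why both levers mattered: isolation lowers the per-case R, while improved healthcare seeking increases the fraction of cases to which that reduction applies.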
Challenges in responding to the Ebola epidemic - four rural counties, Liberia, August-November 2014.
Summers, Aimee; Nyenswah, Tolbert G; Montgomery, Joel M; Neatherlin, John; Tappero, Jordan W; Fahnbulleh, M; Massaquoi, M
2014-12-19
The first cases of Ebola virus disease (Ebola) in West Africa were identified in Guinea on March 22, 2014. On March 30, the first Liberian case was identified in Foya Town, Lofa County, near the Guinean border. Because the majority of early cases occurred in Lofa and Montserrado counties, resources were concentrated in these counties during the first several months of the response, and these counties have seen signs of successful disease control. By October 2014, the epidemic had reached all 15 counties of Liberia. During August 27-September 10, 2014, CDC in collaboration with the Liberian Ministry of Health and Social Welfare assessed county Ebola response plans in four rural counties (Grand Cape Mount, Grand Bassa, Rivercess, and Sinoe) to identify county-specific challenges in executing their Ebola response plans, and to provide recommendations and training to enhance control efforts. Assessments were conducted through interviews with county health teams and health care providers and visits to health care facilities. At the time of assessment, county health teams reported lacking adequate training in core Ebola response strategies and reported facing many challenges because of poor transportation and communication networks. Development of communication and transportation strategies for communities with limited road access and limited means of communication, in addition to adequate training in Ebola response strategies, is critical for successful management of Ebola in remote areas.
NASA Astrophysics Data System (ADS)
Fradeneck, Austen; Kimber, Mark
2017-11-01
The present study evaluates the effectiveness of current RANS and LES models in simulating natural convection in high-aspect-ratio parallel-plate channels. The geometry under consideration is based on a simplification of the coolant and bypass channels in the very high-temperature gas reactor (VHTR). Two thermal conditions are considered, asymmetric and symmetric wall heating, with an applied heat flux chosen to match Rayleigh numbers experienced in the VHTR during a loss of flow accident (LOFA). RANS models are compared to analogous high-fidelity LES simulations. Preliminary results demonstrate the efficacy of the low-Reynolds-number k-ε formulations and their enhancements over the standard formulation and the Reynolds stress transport model in terms of calculating the turbulence production due to buoyancy and the overall mean flow variables.
Time-to-burnout data for a prototypical ITER divertor tube during a simulated loss of flow accident
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, T.D.; Watson, R.D.; McDonald, J.M.
The Loss of Flow Accident (LOFA) is a serious safety concern for the International Thermonuclear Experimental Reactor (ITER), as it has been suggested that more than 100 seconds are necessary to safely shut down the plasma when ITER is operating at full power. In this experiment, the thermal response of a prototypical ITER divertor tube during a simulated LOFA was studied. The divertor tube was fabricated from oxygen-free high-conductivity copper with a square geometry and a circular coolant channel. The coolant channel inner diameter was 0.77 cm, the heated length was 4.0 cm, and the heated width was 1.6 cm. The mockup did not feature any flow enhancement techniques, i.e., swirl tape, helical coils, or internal fins. One-sided surface heating of the mockup was accomplished using the 30 kW Sandia Electron Beam Test System. After reaching steady-state temperatures in the mockup, as determined by two Type-K thermocouples installed 0.5 mm beneath the heated surface, the coolant pump was manually tripped off and the coolant flow allowed to naturally coast down. Electron beam heating continued after the pump trip until the divertor tube's heated surface exhibited the high-temperature transient normally indicative of rapidly approaching burnout. Experimental data showed that time-to-burnout increases proportionally with increasing inlet velocity and decreases proportionally with increasing incident heat flux.
van der Walle, G A; Buisman, G J; Weusthuis, R A; Eggink, G
1999-01-01
Unsaturated medium-chain-length poly(3-hydroxyalkanoates) (mcl-PHAs) produced by Pseudomonas putida from linseed oil fatty acids (LOFA) and tall oil fatty acids (TOFA) were used as the polymer binder in the formulation of high-solid alkyd-like paints. The relatively high concentration of unsaturated alkyl side chains incorporated into the PHA resins resulted in oxidatively drying PHA paints with excellent coating properties. The homogeneously pigmented PHA coatings yielded high-gloss, smooth and strong films upon curing and showed excellent flexibility, good adhesion to different substrates, cohesive film properties and resistance to chipping.
NASA Astrophysics Data System (ADS)
Clifford, Corey; Kimber, Mark
2017-11-01
Over the last 30 years, an industry-wide shift within the nuclear community has led to increased utilization of computational fluid dynamics (CFD) to supplement nuclear reactor safety analyses. One such area that is of particular interest to the nuclear community, specifically to those performing loss-of-flow accident (LOFA) analyses for next-generation very-high-temperature reactors (VHTR), is the capacity of current computational models to predict heat transfer across a wide range of buoyancy conditions. In the present investigation, a critical evaluation of Reynolds-averaged Navier-Stokes (RANS) and large-eddy simulation (LES) turbulence modeling techniques is conducted based on CFD validation data collected from the Rotatable Buoyancy Tunnel (RoBuT) at Utah State University. Four different experimental flow conditions are investigated: (1) buoyancy-aided forced convection; (2) buoyancy-opposed forced convection; (3) buoyancy-aided mixed convection; (4) buoyancy-opposed mixed convection. Overall, good agreement is found for both forced convection-dominated scenarios, but an overly diffusive prediction of the normal Reynolds stress is observed for the RANS-based turbulence models. Low-Reynolds-number RANS models perform adequately for mixed convection, while higher-order RANS approaches underestimate the influence of buoyancy on the production of turbulence.
Forrester, Joseph D; Pillai, Satish K; Beer, Karlyn D; Neatherlin, John; Massaquoi, Moses; Nyenswah, Tolbert G; Montgomery, Joel M; De Cock, Kevin
2014-10-10
Ebola virus disease (Ebola) is a multisystem disease caused by a virus of the genus Ebolavirus. In late March 2014, Ebola cases were described in Liberia, with epicenters in Lofa County and later in Montserrado County. While information about case burden and health care infrastructure was available for the two epicenters, little information was available about remote counties in southeastern Liberia. Over 9 days, August 6-14, 2014, Ebola case burden, health care infrastructure, and emergency preparedness were assessed in collaboration with the Liberian Ministry of Health and Social Welfare in four counties in southeastern Liberia: Grand Gedeh, Grand Kru, River Gee, and Maryland. Data were collected by health care facility visits to three of the four county referral hospitals and by unstructured interviews with county and district health officials, hospital administrators, physicians, nurses, physician assistants, and health educators in all four counties. Local burial practices were discussed with county officials, but no direct observation of burial practices was conducted. Basic information about Ebola surveillance and epidemiology, case investigation, contact tracing, case management, and infection control was provided to local officials.
A passively-safe fusion reactor blanket with helium coolant and steel structure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crosswait, Kenneth Mitchell
1994-04-01
Helium is attractive for use as a fusion blanket coolant for a number of reasons. It is neutronically and chemically inert, nonmagnetic, and will not change phase during any off-normal or accident condition. A significant disadvantage of helium, however, is its low density and volumetric heat capacity. This disadvantage manifests itself most clearly during undercooling accident conditions such as a loss of coolant accident (LOCA) or a loss of flow accident (LOFA). This thesis describes a new helium-cooled tritium breeding blanket concept which performs significantly better during such accidents than current designs. The proposed blanket uses reduced-activation ferritic steel as a structural material and is designed for neutron wall loads exceeding 4 MW/m². The proposed geometry is based on the nested-shell concept developed by Wong, but some novel features are used to reduce the severity of the first wall temperature excursion. These features include the following: (1) A "beryllium joint" concept is introduced, which allows solid beryllium slabs to be used as a thermal conduction path from the first wall to the cooler portions of the blanket. The joint concept allows for significant swelling of the beryllium (10 percent or more) without developing large stresses in the blanket structure. (2) Natural circulation of the coolant in the water-cooled shield is used to maintain shield temperatures below 100 degrees C, thus maintaining a heat sink close to the blanket during the accident. This ensures the long-term passive safety of the blanket.
Social Vulnerability and Ebola Virus Disease in Rural Liberia.
Stanturf, John A; Goodrick, Scott L; Warren, Melvin L; Charnley, Susan; Stegall, Christie M
2015-01-01
The Ebola virus disease (EVD) epidemic that has stricken thousands of people in the three West African countries of Liberia, Sierra Leone, and Guinea highlights the lack of adaptive capacity in post-conflict countries. The scarcity of health services in particular renders these populations vulnerable to multiple interacting stressors including food insecurity, climate change, and the cascading effects of disease epidemics such as EVD. However, the spatial distribution of vulnerable rural populations and the individual stressors contributing to their vulnerability are unknown. We developed a Social Vulnerability Classification using census indicators and mapped it at the district scale for Liberia. According to the Classification, we estimate that districts having the highest social vulnerability lie in the north and west of Liberia in Lofa, Bong, Grand Cape Mount, and Bomi Counties. Three of these counties together with the capital Monrovia and surrounding Montserrado and Margibi counties experienced the highest levels of EVD infections in Liberia. Vulnerability has multiple dimensions and a classification developed from multiple variables provides a more holistic view of vulnerability than single indicators such as food insecurity or scarcity of health care facilities. Few rural Liberians are food secure and many cannot reach a medical clinic in <80 minutes. Our results illustrate how census and household survey data, when displayed spatially at a sub-county level, may help highlight the location of the most vulnerable households and populations. Our results can be used to identify vulnerability hotspots where development strategies and allocation of resources to address the underlying causes of vulnerability in Liberia may be warranted. We demonstrate how social vulnerability index approaches can be applied in the context of disease outbreaks, and our methods are relevant elsewhere.
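A composite classification of the kind the study builds from census indicators can be sketched as a sum of standardized indicator scores. The district names, indicators, and values below are hypothetical illustration data; the paper's actual classification method and indicator set are not reproduced here.

```python
# Minimal sketch of a composite social-vulnerability score: each district
# gets the sum of its z-scores across indicators, so higher = more
# vulnerable. Districts and indicator values are hypothetical.

from statistics import mean, stdev

def vulnerability_scores(districts: dict[str, list[float]]) -> dict[str, float]:
    """Sum of per-indicator z-scores; all indicators oriented so that
    larger values mean greater vulnerability."""
    columns = list(zip(*districts.values()))  # one tuple per indicator
    stats = [(mean(col), stdev(col)) for col in columns]
    return {
        name: sum((v - m) / s for v, (m, s) in zip(vals, stats))
        for name, vals in districts.items()
    }

# Hypothetical districts with [food insecurity %, minutes to nearest clinic]:
scores = vulnerability_scores({
    "District A": [60.0, 95.0],
    "District B": [35.0, 40.0],
    "District C": [50.0, 70.0],
})
most_vulnerable = max(scores, key=scores.get)
print(most_vulnerable)  # → District A
```

Standardizing before summing is one common design choice for such indices; it keeps indicators with large numeric ranges (travel time in minutes) from dominating indicators expressed as percentages.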
Control of Ebola virus disease - firestone district, liberia, 2014.
Reaves, Erik J; Mabande, Lyndon G; Thoroughman, Douglas A; Arwady, M Allison; Montgomery, Joel M
2014-10-24
On March 30, 2014, the Ministry of Health and Social Welfare (MOHSW) of Liberia alerted health officials at Firestone Liberia, Inc. (Firestone) of the first known case of Ebola virus disease (Ebola) inside the Firestone rubber tree plantation of Liberia. The patient, who was the wife of a Firestone employee, had cared for a family member with confirmed Ebola in Lofa County, the epicenter of the Ebola outbreak in Liberia during March-April 2014. To prevent a large outbreak among Firestone's 8,500 employees, their dependents, and the surrounding population, the company responded by 1) establishing an incident management system, 2) instituting procedures for the early recognition and isolation of Ebola patients, 3) enforcing adherence to standard Ebola infection control guidelines, and 4) providing differing levels of management for contacts depending on their exposure, including options for voluntary quarantine in the home or in dedicated facilities. In addition, Firestone created multidisciplinary teams to oversee the outbreak response, address case detection, manage cases in a dedicated unit, and reintegrate convalescent patients into the community. The company also created a robust risk communication, prevention, and social mobilization campaign to boost community awareness of Ebola and how to prevent transmission. During August 1-September 23, a period of intense Ebola transmission in the surrounding areas, 71 cases of Ebola were diagnosed among the approximately 80,000 Liberians for whom Firestone provides health care (cumulative incidence = 0.09%). Fifty-seven (80%) of the cases were laboratory confirmed; 39 (68%) of these cases were fatal. Aspects of Firestone's response appear to have minimized the spread of Ebola in the local population and might be successfully implemented elsewhere to limit the spread of Ebola and prevent transmission to health care workers (HCWs).
Soka, Moses J; Choi, Mary J; Baller, April; White, Stephen; Rogers, Emerson; Purpura, Lawrence J; Mahmoud, Nuha; Wasunna, Christine; Massaquoi, Moses; Abad, Neetu; Kollie, Jomah; Dweh, Straker; Bemah, Philip K; Christie, Athalia; Ladele, Victor; Subah, Oneykachi C; Pillai, Satish; Mugisha, Margaret; Kpaka, Jonathan; Kowalewski, Stephen; German, Emilio; Stenger, Mark; Nichol, Stuart; Ströher, Ute; Vanderende, Kristin E; Zarecki, Shauna Mettee; Green, Hugh Henry W; Bailey, Jeffrey A; Rollin, Pierre; Marston, Barbara; Nyenswah, Tolbert G; Gasasira, Alex; Knust, Barbara; Williams, Desmond
2016-10-01
Ebola virus has been detected in semen of Ebola virus disease survivors after recovery. Liberia's Men's Health Screening Program (MHSP) offers Ebola virus disease survivors semen testing for Ebola virus. We present preliminary results and behavioural outcomes from the first national semen testing programme for Ebola virus. The MHSP operates out of three locations in Liberia: Redemption Hospital in Montserrado County, Phebe Hospital in Bong County, and Tellewoyan Hospital in Lofa County. Men aged 15 years and older who had an Ebola treatment unit discharge certificate are eligible for inclusion. Participants' semen samples were tested for Ebola virus RNA by real-time RT-PCR and participants received counselling on safe sexual practices. Participants graduated after receiving two consecutive negative semen tests. Counsellors collected information on sociodemographics and sexual behaviours using questionnaires administered at enrolment, follow up, and graduation visits. Because the programme is ongoing, data analysis was restricted to data obtained from July 7, 2015, to May 6, 2016. As of May 6, 2016, 466 Ebola virus disease survivors had enrolled in the programme; real-time RT-PCR results were available from 429 participants. 38 participants (9%) produced at least one semen specimen that tested positive for Ebola virus RNA. Of these, 24 (63%) provided semen specimens that tested positive 12 months or longer after Ebola virus disease recovery. The longest interval between discharge from an Ebola treatment unit and collection of a positive semen sample was 565 days. Among participants who enrolled and provided specimens more than 90 days since their Ebola treatment unit discharge, men older than 40 years were more likely to have a semen sample test positive than were men aged 40 years or younger (p=0·0004). 84 (74%) of 113 participants who reported not using a condom at enrolment reported using condoms at their first follow-up visit (p<0·0001). 
176 (46%) of 385 participants who reported being sexually active at enrolment reported abstinence at their follow-up visit (p<0·0001). Duration of detection of Ebola virus RNA by real-time RT-PCR varies by individual and might be associated with age. By combining behavioural counselling and laboratory testing, the Men's Health Screening Program helps male Ebola virus disease survivors understand their individual risk and take appropriate measures to protect their sexual partners. World Health Organization and the US Centers for Disease Control and Prevention. ©2016 World Health Organization; licensee Elsevier. This is an Open Access article published under the CC BY 3.0 IGO license which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. In any use of this article, there should be no suggestion that WHO endorses any specific organisation, products or services. The use of the WHO logo is not permitted. This notice should be preserved along with the article's original URL.
Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier
2014-05-01
Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present five variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-19
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Analysis and Risk-Based Preventive Controls for Human Food.'' FOR FURTHER INFORMATION CONTACT: Domini Bean... Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day comment...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-20
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Hazard Analysis and Risk- Based Preventive Controls for Human Food'' and its information collection... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food.'' IV. How To...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-26
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' that appeared in... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day...
2001-10-25
Image Analysis aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the Dynamic Pulmonary Imaging technique [18,5,17,6]. We have proposed and evaluated a multiresolutional method with an explicit ventilation model based on pyramid images for ventilation analysis. We have further extended the method for ventilation analysis to pulmonary perfusion. This paper focuses on the clinical evaluation of our method for ...
32 CFR 989.25 - Base closure and realignment.
Code of Federal Regulations, 2010 CFR
2010-07-01
... PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.25 Base closure and realignment. Base closure or realignment may entail special requirements for environmental analysis. The permanent base closure... frequently contain provisions limiting the extent of environmental analysis required for actions taken under...
32 CFR 989.25 - Base closure and realignment.
Code of Federal Regulations, 2011 CFR
2011-07-01
... PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.25 Base closure and realignment. Base closure or realignment may entail special requirements for environmental analysis. The permanent base closure... frequently contain provisions limiting the extent of environmental analysis required for actions taken under...
Analysis of DNA Sequences by an Optical Time-Integrating Correlator: Proof-of-Concept Experiments.
1992-05-01
DNA ANALYSIS STRATEGY; 2.1 Representation of DNA Bases; 2.2 DNA Analysis Strategy; 3.0 CUSTOM GENERATORS FOR DNA SEQUENCES; 3.1 Hardware Design ... of the DNA bases, where each base is represented by a 7-bit-long pseudorandom sequence. Figure 4: Coarse analysis of a DNA sequence. Figure 5: Fine ... a 20-base-long database ... Table 1: Short representations of the DNA bases, where each base is represented by 7-bit-long
Traditional environmental mold analysis is based on microscopic observation and counting of mold structures collected from the air on a sticky surface, or on culturing of molds on growth media for identification and quantification. A DNA-based method of mold analysis called mol...
NASA Astrophysics Data System (ADS)
Maharani, S.; Suprapto, E.
2018-03-01
Critical thinking is very important in mathematics: it helps students better understand mathematical concepts, and it is also needed in numerical analysis. Existing numerical analysis textbooks, however, do not yet incorporate critical thinking. This research aims to develop a group-investigation-based book on numerical analysis that increases students' critical-thinking ability, and to determine whether the book is valid, practical, and effective. The research method is Research and Development (R&D); the subjects were 30 students of the Department of Mathematics Education at Universitas PGRI Madiun. The development model used is the 4-D model, modified to 3-D up to the development stage. The data are descriptive and qualitative; the instruments were validation sheets, tests, and questionnaires. The development results indicate that the group-investigation-based book on numerical analysis is valid, with a score of 84.25%. Student response to the book was very positive, so the book also falls in the practical category (86.00%). Use of the book met the classical learning completeness criterion at 84.32%. Based on these results, it is concluded that the group-investigation-based book on numerical analysis is feasible because it meets the criteria of validity, practicality, and effectiveness, and can therefore be used by mathematics academics. Future research could examine group-investigation-based books in other subjects.
Nakai, Yasushi; Takiguchi, Tetsuya; Matsui, Gakuyo; Yamaoka, Noriko; Takada, Satoshi
2017-10-01
Abnormal prosody is often evident in the voice intonations of individuals with autism spectrum disorders. We compared a machine-learning-based voice analysis with human hearing judgments made by 10 speech therapists for classifying children with autism spectrum disorders ( n = 30) and typical development ( n = 51). Using stimuli limited to single-word utterances, machine-learning-based voice analysis was superior to speech therapist judgments. There was a significantly higher true-positive than false-negative rate for machine-learning-based voice analysis but not for speech therapists. Results are discussed in terms of some artificiality of clinician judgments based on single-word utterances, and the objectivity machine-learning-based voice analysis adds to judging abnormal prosody.
Novel Image Encryption Scheme Based on Chebyshev Polynomial and Duffing Map
2014-01-01
We present a novel image encryption algorithm that uses a Chebyshev polynomial for permutation and substitution and a Duffing map for substitution. Comprehensive security analysis was performed on the designed scheme using key-space analysis, visual testing, histogram analysis, information-entropy calculation, correlation-coefficient analysis, differential analysis, key-sensitivity testing, and speed testing. The study demonstrates that the proposed image encryption algorithm offers a key space of more than 10^113 and a desirable level of security based on the good statistical results and theoretical arguments. PMID:25143970
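The abstract above describes a two-stage chaotic cipher: a permutation stage driven by the Chebyshev map and a substitution stage driven by the Duffing map. A minimal sketch of that general pattern follows; the map parameters, key values, and byte quantisation here are invented for illustration and are not taken from the paper:

```python
import math

def chebyshev_sequence(x0, k, n):
    """Iterate the Chebyshev map x -> cos(k * arccos(x)) on [-1, 1]."""
    seq, x = [], x0
    for _ in range(n):
        x = math.cos(k * math.acos(x))
        seq.append(x)
    return seq

def permutation_from_chaos(seq):
    """Rank the chaotic values to obtain a pixel-index permutation."""
    return sorted(range(len(seq)), key=lambda i: seq[i])

def duffing_keystream(x0, y0, n, a=2.75, b=0.2):
    """Duffing-map iterates quantised to bytes for the XOR substitution."""
    ks, x, y = [], x0, y0
    for _ in range(n):
        x, y = y, -b * x + a * y - y ** 3
        ks.append(int(abs(y) * 1e6) % 256)
    return ks

def encrypt(pixels, key=(0.3, 3.99, 0.1, 0.1)):
    x0, k, dx, dy = key
    perm = permutation_from_chaos(chebyshev_sequence(x0, k, len(pixels)))
    shuffled = [pixels[i] for i in perm]            # permutation stage
    ks = duffing_keystream(dx, dy, len(pixels))
    return [p ^ s for p, s in zip(shuffled, ks)]    # substitution stage

def decrypt(cipher, key=(0.3, 3.99, 0.1, 0.1)):
    x0, k, dx, dy = key
    ks = duffing_keystream(dx, dy, len(cipher))
    shuffled = [c ^ s for c, s in zip(cipher, ks)]  # undo substitution
    perm = permutation_from_chaos(chebyshev_sequence(x0, k, len(cipher)))
    out = [0] * len(cipher)
    for j, i in enumerate(perm):                    # undo permutation
        out[i] = shuffled[j]
    return out

pixels = [7, 200, 13, 255, 0, 42]                   # a toy "image"
assert decrypt(encrypt(pixels)) == pixels
```

The round-trip assertion only shows the structure is invertible; the paper's security claims (key space, entropy, differential resistance) depend on its specific construction, not on this sketch.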
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-22
... Staff Guidance on Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic... Seismic Margin Analysis for New Reactors Based on Probabilistic Risk Assessment,'' (Agencywide Documents.../COL-ISG-020 ``Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic Risk...
ERIC Educational Resources Information Center
Tutlys, Vidmantas; Spöttl, Georg
2017-01-01
Purpose: This paper aims to explore methodological and institutional challenges on application of the work-process analysis approach in the design and development of competence-based occupational standards for Lithuania. Design/methodology/approach: The theoretical analysis is based on the review of scientific literature and the analysis of…
1989-08-01
Automatic Line Network Extraction from Aerial Imagery of Urban Areas through Knowledge-Based Image Analysis. Final Technical Report, December ... Keywords: pattern recognition, blackboard-oriented symbolic processing, knowledge-based image analysis, image understanding, aerial imagery, urban areas.
Expert system for web based collaborative CAE
NASA Astrophysics Data System (ADS)
Hou, Liang; Lin, Zusheng
2006-11-01
An expert system for web-based collaborative CAE was developed based on knowledge engineering, a relational database, and commercial FEA (finite element analysis) software. The architecture of the system is illustrated. In this system, the experts' experiences, theories, typical examples, and other related knowledge used in the FEA pre-processing stage were categorized into analysis-process knowledge and object knowledge. An integrated knowledge model based on object-oriented and rule-based methods is then described, followed by an integrated reasoning process based on CBR (case-based reasoning) and rule-based reasoning. Finally, the analysis process of this expert system in a web-based CAE application is illustrated, and the analysis of a machine tool's column is used to demonstrate the validity of the system.
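The combination of rule-based reasoning with case-based retrieval described in this abstract can be sketched as follows. Every name here (the rules, the features, the case base, the mesh settings) is an invented placeholder, not the system's actual knowledge base:

```python
# Hypothetical sketch: rules fire on the part description, and the most
# similar stored case is adapted, mirroring the rule-based + CBR split.

RULES = [
    # (condition on the part description, recommended setting)
    (lambda p: p["thin_walled"], {"element": "shell"}),
    (lambda p: not p["thin_walled"], {"element": "solid"}),
]

CASE_BASE = [
    {"features": {"length_mm": 2000, "thin_walled": False}, "mesh_size": 25},
    {"features": {"length_mm": 300,  "thin_walled": True},  "mesh_size": 5},
]

def similarity(a, b):
    """Crude inverse-distance similarity on the numeric feature."""
    return 1.0 / (1.0 + abs(a["length_mm"] - b["length_mm"]))

def recommend(part):
    # Rule-based step: fire every rule whose condition holds.
    advice = {}
    for cond, rec in RULES:
        if cond(part):
            advice.update(rec)
    # Case-based step: reuse the most similar stored case.
    best = max(CASE_BASE, key=lambda c: similarity(c["features"], part))
    advice["mesh_size"] = best["mesh_size"]
    return advice

print(recommend({"length_mm": 1800, "thin_walled": False}))
# prints: {'element': 'solid', 'mesh_size': 25}
```

In the real system the knowledge would live in the relational database and the recommendation would feed the FEA pre-processor; the two-step reasoning shape is the point here.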
On-Line Analysis of Southern FIA Data
Michael P. Spinney; Paul C. Van Deusen; Francis A. Roesch
2006-01-01
The Southern On-Line Estimator (SOLE) is a web-based FIA database analysis tool designed with an emphasis on modularity. The Java-based user interface is simple and intuitive to use and the R-based analysis engine is fast and stable. Each component of the program (data retrieval, statistical analysis and output) can be individually modified to accommodate major...
2016-09-01
SECURING HEALTHCARE’S QUANTIFIED-SELF DATA: A COMPARATIVE ANALYSIS VERSUS PERSONAL FINANCIAL ACCOUNT AGGREGATORS BASED ON PORTER’S FIVE FORCES FRAMEWORK FOR ...
2012-12-01
... IN U.S. PACIFIC COMMAND: A COST-BASED ANALYSIS AND COMPARATIVE ADVANTAGE TO COMMERCIAL SHIPMENT, by Tod B. Diffey and Matthew J. Beck, December ... This study will provide a cost-based analysis and qualitative evaluation regarding the use of commercial agencies and/or United States Marine Corps
ERIC Educational Resources Information Center
Shintani, Natsuko; Li, Shaofeng; Ellis, Rod
2013-01-01
This article reports a meta-analysis of studies that investigated the relative effectiveness of comprehension-based instruction (CBI) and production-based instruction (PBI). The meta-analysis only included studies that featured a direct comparison of CBI and PBI in order to ensure methodological and statistical robustness. A total of 35 research…
Using Willie's Acid-Base Box for Blood Gas Analysis
ERIC Educational Resources Information Center
Dietz, John R.
2011-01-01
In this article, the author describes a method developed by Dr. William T. Lipscomb for teaching blood gas analysis of acid-base status and provides three examples using Willie's acid-base box. Willie's acid-base box is constructed using three of the parameters of standard arterial blood gas analysis: (1) pH; (2) bicarbonate; and (3) CO2.
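For readers unfamiliar with the underlying arithmetic, the three parameters named in this abstract are linked by the Henderson-Hasselbalch equation for the bicarbonate buffer system. A small illustrative calculation (the numbers are standard textbook values, not taken from the article):

```python
import math

def blood_ph(hco3_mmol_l, pco2_mmhg):
    """Henderson-Hasselbalch for the bicarbonate buffer:
    pH = 6.1 + log10([HCO3-] / (0.03 * pCO2))."""
    return 6.1 + math.log10(hco3_mmol_l / (0.03 * pco2_mmhg))

def classify(ph):
    """Label a pH against the normal arterial range 7.35-7.45."""
    if ph < 7.35:
        return "acidemia"
    if ph > 7.45:
        return "alkalemia"
    return "normal"

# Normal arterial values: HCO3- = 24 mmol/L, pCO2 = 40 mmHg
ph = blood_ph(24, 40)
print(round(ph, 2), classify(ph))   # prints: 7.4 normal
```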
Precursor Analysis for Flight- and Ground-Based Anomaly Risk Significance Determination
NASA Technical Reports Server (NTRS)
Groen, Frank
2010-01-01
This slide presentation reviews the precursor analysis for flight and ground based anomaly risk significance. It includes information on accident precursor analysis, real models vs. models, and probabilistic analysis.
49 CFR 236.923 - Task analysis and basic requirements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... classroom, simulator, computer-based, hands-on, or other formally structured training and testing, except... for Processor-Based Signal and Train Control Systems § 236.923 Task analysis and basic requirements...) Based on a formal task analysis, identify the installation, maintenance, repair, modification...
Analysis of DNA Sequences by An Optical Time-Integrating Correlator: Proof-Of-Concept Experiments.
1992-05-01
TABLES; LIST OF ABBREVIATIONS; 1.0 INTRODUCTION; 2.0 DNA ANALYSIS STRATEGY; 2.1 Representation of DNA Bases; 2.2 DNA Analysis Strategy; 3.0 ... Zehnder architecture. Figure 3: Short representations of the DNA bases, where each base is represented by a 7-bit-long pseudorandom sequence ... DNA bases, where each base is represented by 7-bit-long pseudorandom sequences. Table 2: Long representations of the DNA bases with a 255-bit maximum
Choice-Based Conjoint Analysis: Classification vs. Discrete Choice Models
NASA Astrophysics Data System (ADS)
Giesen, Joachim; Mueller, Klaus; Taneva, Bilyana; Zolliker, Peter
Conjoint analysis is a family of techniques that originated in psychology and later became popular in market research. The main objective of conjoint analysis is to measure an individual's or a population's preferences on a class of options that can be described by parameters and their levels. We consider preference data obtained in choice-based conjoint analysis studies, where one observes test persons' choices on small subsets of the options. There are many ways to analyze choice-based conjoint analysis data. Here we discuss the intuition behind a classification based approach, and compare this approach to one based on statistical assumptions (discrete choice models) and to a regression approach. Our comparison on real and synthetic data indicates that the classification approach outperforms the discrete choice models.
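The classification intuition in this abstract, that each observed choice imposes a pairwise ranking constraint on a linear utility function, can be sketched with a perceptron trained on difference vectors. The attribute coding and choice data below are invented for illustration, not the authors' study data:

```python
# Each observation is (chosen option, rejected option); a linear utility
# u(x) = w . x should rank the chosen one higher, so we train on the
# difference vector (chosen - rejected) as a positive example.

def train_utilities(choices, n_features, epochs=50, lr=0.1):
    w = [0.0] * n_features
    for _ in range(epochs):
        for chosen, rejected in choices:
            diff = [a - b for a, b in zip(chosen, rejected)]
            margin = sum(wi * di for wi, di in zip(w, diff))
            if margin <= 0:  # misranked pair: nudge w toward the choice
                w = [wi + lr * di for wi, di in zip(w, diff)]
    return w

# Options as dummy feature vectors, e.g. (price_low, big_screen)
choices = [((1, 0), (0, 1)),   # respondent preferred low price
           ((1, 1), (0, 0)),
           ((1, 0), (0, 0))]
w = train_utilities(choices, 2)
u = lambda x: sum(wi * xi for wi, xi in zip(w, x))
assert u((1, 0)) > u((0, 1))   # learned: low price outranks big screen
```

The learned weights play the role of part-worth utilities; the paper's comparison is between this kind of learned separator, discrete choice models, and regression.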
ERIC Educational Resources Information Center
Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse
2015-01-01
The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…
YersiniaBase: a genomic resource and analysis platform for comparative analysis of Yersinia.
Tan, Shi Yang; Dutta, Avirup; Jakubovics, Nicholas S; Ang, Mia Yang; Siow, Cheuk Chuen; Mutha, Naresh Vr; Heydari, Hamed; Wee, Wei Yee; Wong, Guat Jah; Choo, Siew Woh
2015-01-16
Yersinia is a genus of Gram-negative bacteria that includes serious pathogens such as Yersinia pestis, which causes plague, Yersinia pseudotuberculosis, and Yersinia enterocolitica. The remaining species are generally considered non-pathogenic to humans, although there is evidence that at least some of them can cause occasional infections using mechanisms distinct from those of the more pathogenic species. With the advances in sequencing technologies, many genomes of Yersinia have been sequenced. However, there is currently no specialized platform to hold the rapidly growing Yersinia genomic data and to provide analysis tools, particularly for the comparative analyses required to provide improved insights into their biology, evolution and pathogenicity. To facilitate ongoing and future research on Yersinia, especially the species generally considered non-pathogenic, a well-defined repository and analysis platform is needed to hold the Yersinia genomic data and analysis tools for the Yersinia research community. Hence, we have developed YersiniaBase, a robust and user-friendly Yersinia resource and analysis platform for the analysis of Yersinia genomic data. YersiniaBase has a total of twelve species and 232 genome sequences, the majority of which are Yersinia pestis. To smooth the process of searching genomic data in a large database, we implemented an Asynchronous JavaScript and XML (AJAX)-based real-time searching system in YersiniaBase. Besides incorporating existing tools, which include a JavaScript-based genome browser (JBrowse) and the Basic Local Alignment Search Tool (BLAST), YersiniaBase also has in-house developed tools: (1) the Pairwise Genome Comparison tool (PGC) for comparing two user-selected genomes; (2) the Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomics analysis of Yersinia genomes; and (3) YersiniaTree for constructing phylogenetic trees of Yersinia.
We ran analyses based on the tools and genomic data in YersiniaBase, and the preliminary results showed differences in the virulence genes found in Yersinia pestis and Yersinia pseudotuberculosis compared with other Yersinia species, and differences between Yersinia enterocolitica subsp. enterocolitica and Yersinia enterocolitica subsp. palearctica. YersiniaBase offers free access to a wide range of genomic data and analysis tools for the analysis of Yersinia. YersiniaBase can be accessed at http://yersinia.um.edu.my .
Analysis of space shuttle main engine data using Beacon-based exception analysis for multi-missions
NASA Technical Reports Server (NTRS)
Park, H.; Mackey, R.; James, M.; Zak, M.; Kynard, M.; Sebghati, J.; Greene, W.
2002-01-01
This paper describes analysis of the Space Shuttle Main Engine (SSME) sensor data using Beacon-based exception analysis for multimissions (BEAM), a new technology developed for sensor analysis and diagnostics in autonomous space systems by the Jet Propulsion Laboratory (JPL).
13C-based metabolic flux analysis: fundamentals and practice.
Yang, Tae Hoon
2013-01-01
Isotope-based metabolic flux analysis is one of the emerging technologies applied to system level metabolic phenotype characterization in metabolic engineering. Among the developed approaches, (13)C-based metabolic flux analysis has been established as a standard tool and has been widely applied to quantitative pathway characterization of diverse biological systems. To implement (13)C-based metabolic flux analysis in practice, comprehending the underlying mathematical and computational modeling fundamentals is of importance along with carefully conducted experiments and analytical measurements. Such knowledge is also crucial when designing (13)C-labeling experiments and properly acquiring key data sets essential for in vivo flux analysis implementation. In this regard, the modeling fundamentals of (13)C-labeling systems and analytical data processing are the main topics we will deal with in this chapter. Along with this, the relevant numerical optimization techniques are addressed to help implementation of the entire computational procedures aiming at (13)C-based metabolic flux analysis in vivo.
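The core idea behind the modeling fundamentals this abstract refers to is that metabolite balances give linear equations in the unknown fluxes, and 13C labeling measurements add the constraints that resolve otherwise unidentifiable branch points. A deliberately tiny illustration (the network and numbers are invented, far simpler than real isotopomer models):

```python
def solve_branch(v1, v3):
    """Steady-state balance around B in the toy network
    A -v1-> B, B -v2-> C, B -v3-> D:  v1 = v2 + v3."""
    return v1 - v3                      # the unmeasured flux v2

def flux_split_from_labeling(x_product, x_path1, x_path2):
    """At isotopic steady state the product labeling is the flux-weighted
    mixture x_p = f*x1 + (1 - f)*x2, so the fraction of flux routed
    through pathway 1 is f = (x_p - x2) / (x1 - x2)."""
    return (x_product - x_path2) / (x_path1 - x_path2)

# Measured uptake v1 = 10 and excretion v3 = 3 fix the internal flux v2.
assert solve_branch(10.0, 3.0) == 7.0

# Pathway 1 yields fully labeled product, pathway 2 unlabeled; a measured
# product labeling of 0.3 implies a 30/70 flux split between them.
assert abs(flux_split_from_labeling(0.3, 1.0, 0.0) - 0.3) < 1e-12
```

Real 13C-MFA fits hundreds of such balances simultaneously by numerical optimization, which is why the chapter emphasizes the mathematical and computational fundamentals.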
2007-05-01
BASED ENVIRONMENTAL IMPACT ANALYSIS PROCESS, LAUGHLIN AIR FORCE BASE, TEXAS. AGENCY: 47th Flying Training Wing (FTW), Laughlin Air Force Base (AFB), Texas ... LAUGHLIN AIR FORCE BASE ENVIRONMENTAL RESTORATION
Evidence based practice readiness: A concept analysis.
Schaefer, Jessica D; Welton, John M
2018-01-15
To analyse and define the concept "evidence based practice readiness" in nurses. Evidence based practice readiness is a term commonly used in the health literature, but without a clear understanding of what readiness means. Concept analysis is needed to define the meaning of evidence based practice readiness. A concept analysis was conducted using Walker and Avant's method to clarify the defining attributes of evidence based practice readiness as well as its antecedents and consequences. A Boolean search of PubMed and the Cumulative Index for Nursing and Allied Health Literature was conducted and limited to articles published after the year 2000. Eleven articles met the inclusion criteria for this analysis. Evidence based practice readiness incorporates personal and organisational readiness. Antecedents include the ability to recognize the need for evidence based practice, the ability to access and interpret evidence based practice, and a supportive environment. The concept analysis demonstrates the complexity of the concept and its implications for nursing practice. The four pillars of evidence based practice readiness (nursing, training, equipping and leadership support) are necessary to achieve evidence based practice readiness. Nurse managers are in a position to address all elements of evidence based practice readiness. Creating an environment that fosters evidence based practice can improve patient outcomes, decrease health care costs, increase nurses' job satisfaction and decrease nursing turnover. © 2018 John Wiley & Sons Ltd.
Engineering Analysis Using a Web-based Protocol
NASA Technical Reports Server (NTRS)
Schoeffler, James D.; Claus, Russell W.
2002-01-01
This paper reviews the development of a web-based framework for engineering analysis. A one-dimensional, high-speed analysis code called LAPIN was used in this study, but the approach can be generalized to any engineering analysis tool. The web-based framework enables users to store, retrieve, and execute an engineering analysis from a standard web-browser. We review the encapsulation of the engineering data into the eXtensible Markup Language (XML) and various design considerations in the storage and retrieval of application data.
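The XML encapsulation of engineering data described above can be sketched with Python's standard library. The element and attribute names below are invented for illustration; they are not LAPIN's actual schema:

```python
import xml.etree.ElementTree as ET

def to_xml(case):
    """Serialize an analysis case (name + input parameters with units)
    into an XML document suitable for storage and later retrieval."""
    root = ET.Element("AnalysisCase", name=case["name"])
    inputs = ET.SubElement(root, "Inputs")
    for key, (value, unit) in case["inputs"].items():
        p = ET.SubElement(inputs, "Parameter", name=key, unit=unit)
        p.text = str(value)
    return ET.tostring(root, encoding="unicode")

def from_xml(text):
    """Recover the input parameters from a stored XML document."""
    root = ET.fromstring(text)
    return {p.get("name"): float(p.text)
            for p in root.find("Inputs").findall("Parameter")}

doc = to_xml({"name": "inlet-duct",
              "inputs": {"mach": (2.5, "-"), "altitude": (18000.0, "m")}})
assert from_xml(doc)["mach"] == 2.5
```

Storing cases this way is what lets a browser-based front end submit, retrieve, and re-execute analyses without knowing the analysis code's internal data formats.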
Multidisciplinary analysis and design of printed wiring boards
NASA Astrophysics Data System (ADS)
Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin
1991-04-01
Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment, denoted Thermal Structural Electromagnetic Testability (TSET), being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development rests on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design developed with a systematic, structured methodology.
Analysis of DNA Sequences by an Optical Time-Integrating Correlator: Proposal
1991-11-01
OF THE PROBLEM AND CURRENT TECHNOLOGY; 3.0 TIME-INTEGRATING CORRELATOR; 4.0 REPRESENTATIONS OF THE DNA BASES; 5.0 DNA ANALYSIS STRATEGY; 6.0 ... DNA bases, where each base is represented by a 7-bit-long pseudorandom sequence. Figure 5: The flow of data in a DNA analysis system based on an ... logarithmic scale and a linear scale ... Table 1: Short representations of the DNA bases, where each base is represented by 7-bit
NASA Technical Reports Server (NTRS)
Park, Han G.; Cannon, Howard; Bajwa, Anupa; Mackey, Ryan; James, Mark; Maul, William
2004-01-01
This paper describes the initial integration of a hybrid reasoning system utilizing a continuous domain feature-based detector, Beacon-based Exceptions Analysis for Multimissions (BEAM), and a discrete domain model-based reasoner, Livingstone.
Practical Application of Model-based Programming and State-based Architecture to Space Missions
NASA Technical Reports Server (NTRS)
Horvath, Gregory; Ingham, Michel; Chung, Seung; Martin, Oliver; Williams, Brian
2006-01-01
A viewgraph presentation on developing systems engineering models that accomplish mission objectives and manage the health of the system is shown. The topics include: 1) Overview; 2) Motivation; 3) Objective/Vision; 4) Approach; 5) Background: The Mission Data System; 6) Background: State-based Control Architecture System; 7) Background: State Analysis; 8) Overview of State Analysis; 9) Background: MDS Software Frameworks; 10) Background: Model-based Programming; 11) Background: Titan Model-based Executive; 12) Model-based Execution Architecture; 13) Compatibility Analysis of MDS and Titan Architectures; 14) Integrating Model-based Programming and Execution into the Architecture; 15) State Analysis and Modeling; 16) IMU Subsystem State Effects Diagram; 17) Titan Subsystem Model: IMU Health; 18) Integrating Model-based Programming and Execution into the Software IMU; 19) Testing Program; 20) Computationally Tractable State Estimation & Fault Diagnosis; 21) Diagnostic Algorithm Performance; 22) Integration and Test Issues; 23) Demonstrated Benefits; and 24) Next Steps
Heading in the right direction: thermodynamics-based network analysis and pathway engineering.
Ataman, Meric; Hatzimanikatis, Vassily
2015-12-01
Thermodynamics-based network analysis, through the introduction of thermodynamic constraints in metabolic models, allows a deeper analysis of metabolism and guides pathway engineering. The number and the areas of application of thermodynamics-based network analysis methods have been increasing over the last ten years. We review recent applications of these methods, identify the areas in which such analysis can contribute significantly, and outline the needs for future developments. We find that organisms with multiple compartments and extremophiles present challenges for modeling and thermodynamics-based flux analysis. The evolution of current and new methods must also address the issues of multiple alternative flux directionalities and the uncertainties and partial information from analytical methods. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
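The basic constraint these methods impose is that a reaction may only carry flux in the direction of negative Gibbs energy, computed from the standard reaction energy plus a concentration-dependent term. A minimal sketch (the standard energy and concentration ratio below are invented illustrative numbers):

```python
import math

R = 8.314e-3  # gas constant, kJ/(mol*K)

def delta_g(dg0_prime, q_ratio, temp_k=298.15):
    """Transformed reaction Gibbs energy: dG = dG'0 + R*T*ln(Q),
    where Q is the product/substrate concentration ratio."""
    return dg0_prime + R * temp_k * math.log(q_ratio)

def allowed_direction(dg):
    """Thermodynamic constraint on flux directionality."""
    if dg < 0:
        return "forward"
    if dg > 0:
        return "reverse"
    return "equilibrium"

# A reaction unfavorable under standard conditions (dG'0 = +5 kJ/mol)
# is still driven forward by substrate excess (Q = 0.01).
dg = delta_g(5.0, 0.01)
assert allowed_direction(dg) == "forward"
```

Applied across a whole metabolic model, constraints of this form prune flux directionalities that stoichiometry alone would allow, which is the pruning the review credits with deepening the analysis.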
32 CFR 989.25 - Base closure and realignment.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 6 2013-07-01 2013-07-01 false Base closure and realignment. 989.25 Section 989... PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.25 Base closure and realignment. Base closure or realignment may entail special requirements for environmental analysis. The permanent base closure...
32 CFR 989.25 - Base closure and realignment.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 6 2012-07-01 2012-07-01 false Base closure and realignment. 989.25 Section 989... PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.25 Base closure and realignment. Base closure or realignment may entail special requirements for environmental analysis. The permanent base closure...
Space shuttle booster multi-engine base flow analysis
NASA Technical Reports Server (NTRS)
Tang, H. H.; Gardiner, C. R.; Anderson, W. A.; Navickas, J.
1972-01-01
A comprehensive review of currently available techniques pertinent to several prominent aspects of the base thermal problem of the space shuttle booster is given along with a brief review of experimental results. A tractable engineering analysis, capable of predicting the power-on base pressure, base heating, and other base thermal environmental conditions, such as base gas temperature, is presented and used for an analysis of various space shuttle booster configurations. The analysis consists of a rational combination of theoretical treatments of the prominent flow interaction phenomena in the base region. These theories consider jet mixing, plume flow, axisymmetric flow effects, base injection, recirculating flow dynamics, and various modes of heat transfer. Such effects as initial boundary layer expansion at the nozzle lip, reattachment, recompression, choked vent flow, and nonisoenergetic mixing processes are included in the analysis. A unified method was developed and programmed to numerically obtain compatible solutions for the various flow field components in both flight and ground test conditions. Preliminary prediction for a 12-engine space shuttle booster base thermal environment was obtained for a typical trajectory history. Theoretical predictions were also obtained for some clustered-engine experimental conditions. Results indicate good agreement between the data and theoretical predictions.
Team-Based Care: A Concept Analysis.
Baik, Dawon
2017-10-01
The purpose of this concept analysis is to clarify and analyze the concept of team-based care in clinical practice. Team-based care has garnered attention as a way to enhance healthcare delivery and patient care related to quality and safety. However, there is no consensus on the concept of team-based care; as a result, the lack of a common definition impedes further studies on team-based care. This analysis was conducted using Walker and Avant's strategy. Literature searches were conducted using PubMed, Cumulative Index to Nursing and Allied Health Literature (CINAHL), and PsycINFO, with a timeline from January 1985 to December 2015. The analysis demonstrates that the concept of team-based care has three core attributes: (a) interprofessional collaboration, (b) patient-centered approach, and (c) integrated care process. This is accomplished through understanding other team members' roles and responsibilities, a climate of mutual respect, and organizational support. Consequences of team-based care are identified with three aspects: (a) patient, (b) healthcare professional, and (c) healthcare organization. This concept analysis helps better understand the characteristics of team-based care in clinical practice as well as promote the development of a theoretical definition of team-based care. © 2016 Wiley Periodicals, Inc.
Gender-Based Analysis On-Line Dialogue. Final Report.
ERIC Educational Resources Information Center
2001
An online dialogue on gender-based analysis (GBA) was held from February 15 to March 7, 2001. Invitations and a background paper titled "Why Gender-Based Analysis?" were sent to 350 women's organizations and individuals throughout Canada. Efforts were made to ensure that aboriginal and Metis women, visible minority women, and women with…
ERIC Educational Resources Information Center
Larkin, Wallace; Hawkins, Renee O.; Collins, Tai
2016-01-01
Functional behavior assessments and function-based interventions are effective methods for addressing the challenging behaviors of children; however, traditional functional analysis has limitations that impact usability in applied settings. Trial-based functional analysis addresses concerns relating to the length of time, level of expertise…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-24
... 0584-AD65 School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles... Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP) was published on... of Management and Budget (OMB) cleared the associated information collection requirements (ICR) on...
Knowledge-Based Image Analysis.
1981-04-01
ETL-0258. Knowledge-Based Image Analysis. George C. Stockman, Barbara A. Lambird, David Lavine, Laveen N. Kanal. Keywords: extraction, verification, region classification, pattern recognition, image analysis.
CrossTalk: The Journal of Defense Software Engineering. Volume 27, Number 1, January/February 2014
2014-02-01
deficit in trustworthiness and will permit analysis on how this deficit needs to be overcome. This analysis will help identify adaptations that are... approaches to trustworthy analysis split into two categories: product-based and process-based. Product-based techniques [9] identify factors that... Criticalities may also be assigned to decompositions and contributions. 5. Evaluation and analysis: in this task the propagation rules of the NFR
Exploring JavaScript and ROOT technologies to create Web-based ATLAS analysis and monitoring tools
NASA Astrophysics Data System (ADS)
Sánchez Pineda, A.
2015-12-01
We explore the potential of current web applications to create online interfaces that allow the visualization, interaction and real cut-based physics analysis and monitoring of processes through a web browser. The project consists of the initial development of web-based and cloud computing services to allow students and researchers to perform fast and very useful cut-based analysis in a browser, reading and using real data and official Monte Carlo simulations stored in ATLAS computing facilities. Several tools are considered: ROOT, JavaScript and HTML. Our study case is the current cut-based H → ZZ → llqq analysis of the ATLAS experiment. Preliminary but satisfactory results have been obtained online.
Peterson, Leif E
2002-01-01
CLUSFAVOR (CLUSter and Factor Analysis with Varimax Orthogonal Rotation) 5.0 is a Windows-based computer program for hierarchical cluster and principal-component analysis of microarray-based transcriptional profiles. CLUSFAVOR 5.0 standardizes input data; sorts data according to gene-specific coefficient of variation, standard deviation, average and total expression, and Shannon entropy; performs hierarchical cluster analysis using nearest-neighbor, unweighted pair-group method using arithmetic averages (UPGMA), or furthest-neighbor joining methods, and Euclidean, correlation, or jack-knife distances; and performs principal-component analysis. PMID:12184816
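The UPGMA joining method that CLUSFAVOR offers can be sketched with a generic average-linkage implementation; this is an illustration of the algorithm on toy expression vectors with hypothetical gene names, not CLUSFAVOR's actual code.

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def upgma(profiles):
    """Average-linkage (UPGMA) agglomerative clustering.

    profiles: dict mapping name -> expression vector.
    Returns the list of merges, closest pair first.
    """
    clusters = {n: [n] for n in profiles}          # cluster id -> members
    d = {}
    names = list(profiles)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            d[(a, b)] = d[(b, a)] = euclidean(profiles[a], profiles[b])
    merges = []
    while len(clusters) > 1:
        a, b = min(((p, q) for p in clusters for q in clusters if p < q),
                   key=lambda pair: d[pair])
        new = a + "+" + b
        size = len(clusters[a]) + len(clusters[b])
        for c in clusters:                          # UPGMA update rule:
            if c not in (a, b):                     # size-weighted average
                d[(new, c)] = d[(c, new)] = (
                    len(clusters[a]) * d[(a, c)]
                    + len(clusters[b]) * d[(b, c)]) / size
        clusters[new] = clusters[a] + clusters[b]
        del clusters[a], clusters[b]
        merges.append((a, b))
    return merges

# toy profiles: g1 and g2 are similar, g3 is far away
profiles = {"g1": (0.0, 0.0), "g2": (0.0, 1.0), "g3": (5.0, 5.0)}
merges = upgma(profiles)   # the two similar profiles merge first
```

Swapping `euclidean` for a correlation- or jackknife-based distance changes the tree shape but not the joining procedure itself.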
77 FR 55371 - System Safety Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-07
...-based rule and FRA seeks comments on all aspects of the proposed rule. An SSP would be implemented by a... SSP would be the risk-based hazard management program and risk-based hazard analysis. A properly implemented risk-based hazard management program and risk-based hazard analysis would identify the hazards and...
NASA Technical Reports Server (NTRS)
Hou, Gene J.-W; Newman, Perry A. (Technical Monitor)
2004-01-01
A major step in a most probable point (MPP)-based method for reliability analysis is to determine the MPP. This is usually accomplished by using an optimization search algorithm. The minimum distance associated with the MPP provides a measurement of safety probability, which can be obtained by approximate probability integration methods such as FORM or SORM. The reliability sensitivity equations are derived first in this paper, based on the derivatives of the optimal solution. Examples are provided later to demonstrate the use of these derivatives for better reliability analysis and reliability-based design optimization (RBDO).
Market-Based Multirobot Coordination: A Survey and Analysis
2005-04-01
observe new information about their surroundings. Market-based approaches can often seamlessly incorporate online tasks by auctioning new tasks as they... Market-Based Multirobot Coordination: A Survey and Analysis. M. Bernardine Dias, Robert Zlot, Nidhi Kalra, and Anthony Stentz. CMU-RI-TR-05-13, April 2005.
2012-11-20
Aviation Logistics in U.S. Pacific Command: A Cost-Based Analysis and Comparative Advantage to Commercial Shipment. 20 November 2012... asset throughput in the customs departments of our allied nations. In considering and analyzing these dynamics, this study provides a cost-based
McLean, Kendra; Day, Lesley; Dalton, Andrew
2015-03-26
Falls among older people are of growing concern globally. Implementing cost-effective strategies for their prevention is of utmost importance given the ageing population and associated potential for increased costs of fall-related injury over the next decades. The purpose of this study was to undertake a cost-utility analysis and secondary cost-effectiveness analysis from a healthcare system perspective, of a group-based exercise program compared to routine care for falls prevention in an older community-dwelling population. A decision analysis using a decision tree model was based on the results of a previously published randomised controlled trial with a community-dwelling population aged over 70. Measures of falls, fall-related injuries and resource use were directly obtained from trial data and supplemented by literature-based utility measures. A sub-group analysis was performed of women only. Cost estimates are reported in 2010 British Pound Sterling (GBP). The ICER of GBP£51,483 per QALY for the base case analysis was well above the accepted cost-effectiveness threshold of GBP£20,000 to £30,000 per QALY, but in a sensitivity analysis with minimised program implementation the incremental cost reached GBP£25,678 per QALY. The ICER value at 95% confidence in the base case analysis was GBP£99,664 per QALY and GBP£50,549 per QALY in the lower cost analysis. Males had a 44% lower injury rate if they fell, compared to females resulting in a more favourable ICER for the women only analysis. For women only the ICER was GBP£22,986 per QALY in the base case and was below the cost-effectiveness threshold for all other variations of program implementation. The ICER value at 95% confidence was GBP£48,212 in the women only base case analysis and GBP£23,645 in the lower cost analysis. The base case incremental cost per fall averted was GBP£652 (GBP£616 for women only). A threshold analysis indicates that this exercise program cannot realistically break even. 
The results suggest that this exercise program is cost-effective for women only. There is no evidence to support its cost-effectiveness in a group of mixed gender unless the costs of program implementation are minimal. Conservative assumptions may have underestimated the true cost-effectiveness of the program.
Nonlinear multi-analysis of agent-based financial market dynamics by epidemic system
NASA Astrophysics Data System (ADS)
Lu, Yunfan; Wang, Jun; Niu, Hongli
2015-10-01
Based on an epidemic dynamical system, we construct a new agent-based financial time series model. To check and test its rationality, we compare the statistical properties of the time series model with those of real stock market indices, the Shanghai Stock Exchange Composite Index and the Shenzhen Stock Exchange Component Index. For analyzing the statistical properties, we combine multi-parameter analysis with tail distribution analysis, modified rescaled range analysis, and multifractal detrended fluctuation analysis. For a better perspective, three-dimensional diagrams are used to present the analysis results. The empirical research in this paper indicates that the long-range dependence property and the multifractal phenomenon exist in both the real returns and the proposed model. Therefore, the new agent-based financial model can reproduce some important features of real stock markets.
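One ingredient of this toolbox, rescaled range analysis, can be sketched as follows. This is the classical (unmodified) R/S construction with a log-log slope as the Hurst-exponent estimate, run on synthetic white noise rather than the paper's market data; the window sizes are illustrative.

```python
import math
import random

def rescaled_range(window):
    """R/S statistic of one window: range of the cumulative
    mean-adjusted sums divided by the standard deviation."""
    n = len(window)
    mean = sum(window) / n
    cum = cmin = cmax = 0.0
    for x in window:
        cum += x - mean
        cmin = min(cmin, cum)
        cmax = max(cmax, cum)
    s = math.sqrt(sum((x - mean) ** 2 for x in window) / n)
    return (cmax - cmin) / s

def hurst(series, sizes=(8, 16, 32, 64)):
    """Least-squares slope of log(R/S) vs log(n): ~0.5 for i.i.d. noise,
    >0.5 for persistent (long-range dependent) series."""
    xs, ys = [], []
    for n in sizes:
        chunks = [series[i:i + n] for i in range(0, len(series) - n + 1, n)]
        avg_rs = sum(rescaled_range(c) for c in chunks) / len(chunks)
        xs.append(math.log(n))
        ys.append(math.log(avg_rs))
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

random.seed(0)
h = hurst([random.gauss(0.0, 1.0) for _ in range(1024)])
```

On white noise the estimate lands near 0.5 (somewhat above, due to small-sample bias); persistent return series push it higher.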
A Method for Cognitive Task Analysis
1992-07-01
A method for cognitive task analysis is described based on the notion of 'generic tasks'. The method distinguishes three layers of analysis. At the... model for applied areas such as the development of knowledge-based systems and training, are discussed. Problem solving, Cognitive Task Analysis, Knowledge, Strategies.
Aoki, Shuichiro; Murata, Hiroshi; Fujino, Yuri; Matsuura, Masato; Miki, Atsuya; Tanito, Masaki; Mizoue, Shiro; Mori, Kazuhiko; Suzuki, Katsuyoshi; Yamashita, Takehiro; Kashiwagi, Kenji; Hirasawa, Kazunori; Shoji, Nobuyuki; Asaoka, Ryo
2017-12-01
To investigate the usefulness of the Octopus (Haag-Streit) EyeSuite's cluster trend analysis in glaucoma. Ten visual fields (VFs) with the Humphrey Field Analyzer (Carl Zeiss Meditec), spanning 7.7 years on average, were obtained from 728 eyes of 475 primary open angle glaucoma patients. Mean total deviation (mTD) trend analysis and EyeSuite's cluster trend analysis were performed on various series of VFs (from 1st to 10th: VF1-10 to 6th to 10th: VF6-10). The results of the cluster-based trend analysis, based on different lengths of VF series, were compared against mTD trend analysis. Cluster-based trend analysis and mTD trend analysis results were significantly associated in all clusters and with all lengths of VF series. Between 21.2% and 45.9% (depending on VF series length and location) of clusters were deemed to progress when the mTD trend analysis suggested no progression. On the other hand, 4.8% of eyes were observed to progress using the mTD trend analysis when cluster trend analysis suggested no progression in any two (or more) clusters. Whole field trend analysis can miss local VF progression. Cluster trend analysis appears as robust as mTD trend analysis and useful to assess both sectorial and whole field progression. Cluster-based trend analyses, in particular the definition of two or more progressing clusters, may help clinicians to detect glaucomatous progression in a more timely manner than using a whole field trend analysis, without significantly compromising specificity. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-26
... Analysis and Risk-Based Preventive Controls for Human Food'' (the proposed preventive controls rule) and... Farm.'' The purpose of the draft RA is to provide a science-based risk analysis of those activity/food... Food, Drug, and Cosmetic Act for hazard analysis and risk-based preventive controls (the proposed...
2013-03-31
certainly remain commingled with other solid waste. For example, some bases provided containers for segregation of recyclables including plastic and... prevalent types of solid waste are food (19.1% by average sample weight), wood (18.9%), and plastics (16.0%) based on analysis of bases in... within the interval shown. Food and wood wastes are the largest components of the average waste stream (both at ~19% by weight), followed by plastic
A Risk-Analysis Approach to Implementing Web-Based Assessment
ERIC Educational Resources Information Center
Ricketts, Chris; Zakrzewski, Stan
2005-01-01
Computer-Based Assessment is a risky business. This paper proposes the use of a model for web-based assessment systems that identifies pedagogic, operational, technical (non web-based), web-based and financial risks. The strategies and procedures for risk elimination or reduction arise from risk analysis and management and are the means by which…
CognitionMaster: an object-based image analysis framework
2013-01-01
Background Automated image analysis methods are becoming more and more important to extract and quantify image features in microscopy-based biomedical studies and several commercial or open-source tools are available. However, most of the approaches rely on pixel-wise operations, a concept that has limitations when high-level object features and relationships between objects are studied and if user-interactivity on the object-level is desired. Results In this paper we present an open-source software that facilitates the analysis of content features and object relationships by using objects as the basic processing unit instead of individual pixels. Our approach also enables users without programming knowledge to compose "analysis pipelines" that exploit the object-level approach. We demonstrate the design and use of example pipelines for the immunohistochemistry-based cell proliferation quantification in breast cancer and two-photon fluorescence microscopy data about bone-osteoclast interaction, which underline the advantages of the object-based concept. Conclusions We introduce an open source software system that offers object-based image analysis. The object-based concept allows for a straightforward development of object-related interactive or fully automated image analysis solutions. The presented software may therefore serve as a basis for various applications in the field of digital image analysis. PMID:23445542
Chappell, Michael A; Woolrich, Mark W; Petersen, Esben T; Golay, Xavier; Payne, Stephen J
2013-05-01
Amongst the various implementations of arterial spin labeling MRI methods for quantifying cerebral perfusion, the QUASAR method is unique. By using a combination of labeling with and without flow suppression gradients, the QUASAR method offers the separation of macrovascular and tissue signals. This permits local arterial input functions to be defined and "model-free" analysis, using numerical deconvolution, to be used. However, it remains unclear whether arterial spin labeling data are best treated using model-free or model-based analysis. This work provides a critical comparison of these two approaches for QUASAR arterial spin labeling in the healthy brain. An existing two-component (arterial and tissue) model was extended to the mixed flow suppression scheme of QUASAR to provide an optimal model-based analysis. The model-based analysis was extended to incorporate dispersion of the labeled bolus, generally regarded as the major source of discrepancy between the two analysis approaches. Model-free and model-based analyses were compared for perfusion quantification including absolute measurements, uncertainty estimation, and spatial variation in cerebral blood flow estimates. Major sources of discrepancies between model-free and model-based analysis were attributed to the effects of dispersion and the degree to which the two methods can separate macrovascular and tissue signal. Copyright © 2012 Wiley Periodicals, Inc.
A new metaphor for projection-based visual analysis and data exploration
NASA Astrophysics Data System (ADS)
Schreck, Tobias; Panse, Christian
2007-01-01
In many important application domains such as Business and Finance, Process Monitoring, and Security, huge and quickly increasing volumes of complex data are collected. Strong efforts are underway developing automatic and interactive analysis tools for mining useful information from these data repositories. Many data analysis algorithms require an appropriate definition of similarity (or distance) between data instances to allow meaningful clustering, classification, and retrieval, among other analysis tasks. Projection-based data visualization is highly interesting (a) for visual discrimination analysis of a data set within a given similarity definition, and (b) for comparative analysis of similarity characteristics of a given data set represented by different similarity definitions. We introduce an intuitive and effective novel approach for projection-based similarity visualization for interactive discrimination analysis, data exploration, and visual evaluation of metric space effectiveness. The approach is based on the convex hull metaphor for visually aggregating sets of points in projected space, and it can be used with a variety of different projection techniques. The effectiveness of the approach is demonstrated by application on two well-known data sets. Statistical evidence supporting the validity of the hull metaphor is presented. We advocate the hull-based approach over the standard symbol-based approach to projection visualization, as it allows a more effective perception of similarity relationships and class distribution characteristics.
Fruehwald-Pallamar, J; Hesselink, J R; Mafee, M F; Holzer-Fruehwald, L; Czerny, C; Mayerhoefer, M E
2016-02-01
To evaluate whether texture-based analysis of standard MRI sequences can help in the discrimination between benign and malignant head and neck tumors. The MR images of 100 patients with a histologically clarified head or neck mass, from two different institutions, were analyzed. Texture-based analysis was performed using texture analysis software, with region of interest measurements for 2D and 3D evaluation independently for all axial sequences. COC, RUN, GRA, ARM, and WAV features were calculated for all ROIs. Ten texture feature subsets were used for a linear discriminant analysis, in combination with k-nearest-neighbor classification. Benign and malignant tumors were compared with regard to texture-based values. There were differences in the images from different field-strength scanners, as well as from different vendors. For the differentiation of benign and malignant tumors, we found differences on STIR and T2-weighted images for 2D, and on contrast-enhanced T1-TSE with fat saturation for 3D evaluation. In a separate analysis of the 1.5 and 3 Tesla subgroups, more discriminating features were found. Texture-based analysis is a useful tool in the discrimination of benign and malignant tumors when performed on one scanner with the same protocol. We cannot recommend this technique for use in multicenter studies with clinical data. 2D/3D texture-based analysis can be performed in head and neck tumors. Texture-based analysis can differentiate between benign and malignant masses. Analyzed MR images should originate from one scanner with an identical protocol. © Georg Thieme Verlag KG Stuttgart · New York.
The Potential of Multivariate Analysis in Assessing Students' Attitude to Curriculum Subjects
ERIC Educational Resources Information Center
Gaotlhobogwe, Michael; Laugharne, Janet; Durance, Isabelle
2011-01-01
Background: Understanding student attitudes to curriculum subjects is central to providing evidence-based options to policy makers in education. Purpose: We illustrate how quantitative approaches used in the social sciences and based on multivariate analysis (categorical Principal Components Analysis, Clustering Analysis and General Linear…
ERIC Educational Resources Information Center
Burk, Erlan
2012-01-01
Aerospace companies needed additional research on technology-based training to verify expectations when enhancing human capital through online systems analysis training. The research for online systems analysis training provided aerospace companies a means to verify expectations for systems analysis technology-based training on business…
Improving the Accuracy of Software-Based Energy Analysis for Residential Buildings (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polly, B.
2011-09-01
This presentation describes the basic components of software-based energy analysis for residential buildings, explores the concepts of 'error' and 'accuracy' when analysis predictions are compared to measured data, and explains how NREL is working to continuously improve the accuracy of energy analysis methods.
Lorenzen, Nina Dyrberg; Stilling, Maiken; Jakobsen, Stig Storgaard; Gustafson, Klas; Søballe, Kjeld; Baad-Hansen, Thomas
2013-11-01
The stability of implants is vital to ensure long-term survival. RSA determines micro-motions of implants as a predictor of early implant failure. RSA can be performed as a marker- or model-based analysis. So far, CAD and RE model-based RSA have not been validated for use in hip resurfacing arthroplasty (HRA). A phantom study determined the precision of marker-based and of CAD and RE model-based RSA on an HRA implant. In a clinical study, 19 patients were followed with stereoradiographs until 5 years after surgery. Analysis of double-examination migration results determined the clinical precision of marker-based and CAD model-based RSA, and at the 5-year follow-up, results for total translation (TT) and total rotation (TR) from marker- and CAD model-based RSA were compared. The phantom study showed that, comparing precision (SDdiff), marker-based RSA analysis was more precise than model-based RSA analysis in TT (pCAD < 0.001; pRE = 0.04) and TR (pCAD = 0.01; pRE < 0.001). In the clinical precision analysis (double examination in 8 patients), comparing SDdiff, marker-based RSA was more precise for TT (p = 0.002), but there was no difference between marker- and CAD model-based RSA for TR (p = 0.91). Comparing the mean signed values for TT and TR at the 5-year follow-up in 13 patients, TT was lower (p = 0.03) and TR higher (p = 0.04) in marker-based RSA compared to CAD model-based RSA. The precision of marker-based RSA was significantly better than that of model-based RSA. However, problems with occluded markers led to the exclusion of many patients, which was not a problem with model-based RSA. HRA implants were stable at the 5-year follow-up. The detection limit was 0.2 mm TT and 1° TR for marker-based RSA and 0.5 mm TT and 1° TR for CAD model-based RSA for HRA.
Trial-Based Functional Analysis Informs Treatment for Vocal Scripting.
Rispoli, Mandy; Brodhead, Matthew; Wolfe, Katie; Gregori, Emily
2018-05-01
Research on trial-based functional analysis has primarily focused on socially maintained challenging behaviors. However, procedural modifications may be necessary to clarify ambiguous assessment results. The purposes of this study were to evaluate the utility of iterative modifications to trial-based functional analysis on the identification of putative reinforcement and subsequent treatment for vocal scripting. For all participants, modifications to the trial-based functional analysis identified a primary function of automatic reinforcement. The structure of the trial-based format led to identification of social attention as an abolishing operation for vocal scripting. A noncontingent attention treatment was evaluated using withdrawal designs for each participant. This noncontingent attention treatment resulted in near zero levels of vocal scripting for all participants. Implications for research and practice are presented.
Imperial College near infrared spectroscopy neuroimaging analysis framework.
Orihuela-Espina, Felipe; Leff, Daniel R; James, David R C; Darzi, Ara W; Yang, Guang-Zhong
2018-01-01
This paper describes the Imperial College near infrared spectroscopy neuroimaging analysis (ICNNA) software tool for functional near infrared spectroscopy neuroimaging data. ICNNA is a MATLAB-based object-oriented framework encompassing an application programming interface and a graphical user interface. ICNNA incorporates reconstruction based on the modified Beer-Lambert law and basic processing and data validation capabilities. Emphasis is placed on the full experiment rather than individual neuroimages as the central element of analysis. The software offers three types of analyses including classical statistical methods based on comparison of changes in relative concentrations of hemoglobin between the task and baseline periods, graph theory-based metrics of connectivity and, distinctively, an analysis approach based on manifold embedding. This paper presents the different capabilities of ICNNA in its current version.
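The reconstruction step ICNNA performs rests on the modified Beer-Lambert law; a minimal sketch of the underlying algebra solves the two-wavelength system for oxy- and deoxy-hemoglobin concentration changes. The extinction coefficients, pathlength, and DPF values below are illustrative placeholders, not physiological constants or ICNNA's own code.

```python
def mbll_concentrations(d_od, ext, pathlength, dpf):
    """Solve the modified Beer-Lambert law at two wavelengths.

    d_od: (dOD1, dOD2) measured attenuation changes
    ext:  [[eHbO_1, eHbR_1], [eHbO_2, eHbR_2]] extinction coefficients
    pathlength, dpf: source-detector distance and differential pathlength factor
    Returns (dHbO, dHbR) concentration changes.
    """
    (a, b), (c, d) = ext
    L = pathlength * dpf          # effective optical pathlength
    det = (a * d - b * c) * L     # determinant of the 2x2 linear system
    od1, od2 = d_od
    dhbo = (d * od1 - b * od2) / det
    dhbr = (a * od2 - c * od1) / det
    return dhbo, dhbr
```

Round-tripping a forward-computed attenuation change through the solver recovers the assumed concentrations, which is a quick self-check for implementations of this step.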
Yan, W Y; Li, L; Yang, Y G; Lin, X L; Wu, J Z
2016-08-01
We designed a computer-based respiratory sound analysis system to identify normal pediatric lung sounds, and sought to verify the validity of this system. First, we downloaded the standard lung sounds from the network database (website: http://www.easyauscultation.com/lung-sounds-reference-guide) and recorded 3 samples of abnormal lung sounds (rhonchi, wheeze and crackles) from three patients of the Department of Pediatrics, the First Affiliated Hospital of Xiamen University. We regarded these lung sounds as "reference lung sounds". The "test lung sounds" were recorded from 29 children from the Kindergarten of Xiamen University. We recorded lung sounds with a portable electronic stethoscope, and valid lung sounds were selected by manual identification. We introduced Mel-frequency cepstral coefficients (MFCC) to extract lung sound features and dynamic time warping (DTW) for signal classification. We had 39 standard lung sounds and recorded 58 test lung sounds. The system was applied to 58 lung sound recognitions, with 52 correct identifications and 6 errors; accuracy was 89.7%. Based on MFCC and DTW, our computer-based respiratory sound analysis system can effectively identify healthy lung sounds of children (accuracy of 89.7%), demonstrating the reliability of the lung sound analysis system.
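The DTW matching used for classification here can be sketched with the standard dynamic-programming recurrence. This toy version compares 1-D sequences for brevity, whereas the actual system would compare frame-wise MFCC feature vectors; the sequences are made up for illustration.

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two sequences via the
    standard dynamic-programming recurrence (no warping window)."""
    inf = float("inf")
    n, m = len(a), len(b)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])       # frame-wise local distance
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

# a time-stretched copy of the reference matches far better than
# an unrelated flat sequence
ref    = [0.0, 1.0, 2.0, 1.0, 0.0]
warped = [0.0, 0.0, 1.0, 2.0, 2.0, 1.0, 0.0]
other  = [3.0, 3.0, 3.0, 3.0, 3.0]
d_same = dtw_distance(ref, warped)
d_diff = dtw_distance(ref, other)
```

Classification then reduces to assigning each test recording the label of the reference sound with the smallest DTW distance.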
SensA: web-based sensitivity analysis of SBML models.
Floettmann, Max; Uhlendorf, Jannis; Scharp, Till; Klipp, Edda; Spiesser, Thomas W
2014-10-01
SensA is a web-based application for sensitivity analysis of mathematical models. The sensitivity analysis is based on metabolic control analysis, computing the local, global and time-dependent properties of model components. Interactive visualization facilitates interpretation of usually complex results. SensA can contribute to the analysis, adjustment and understanding of mathematical models for dynamic systems. SensA is available at http://gofid.biologie.hu-berlin.de/ and can be used with any modern browser. The source code can be found at https://bitbucket.org/floettma/sensa/ (MIT license). © The Author 2014. Published by Oxford University Press.
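The local coefficients that metabolic control analysis rests on are scaled derivatives of a model output with respect to a parameter. A minimal finite-difference sketch, using a hypothetical Michaelis-Menten rate law rather than SensA's own code:

```python
def scaled_sensitivity(f, p, rel_step=1e-6):
    """Scaled local sensitivity coefficient (p/y)*dy/dp,
    approximated by central finite differences."""
    h = p * rel_step
    y = f(p)
    dydp = (f(p + h) - f(p - h)) / (2.0 * h)
    return (p / y) * dydp

# hypothetical model output: a Michaelis-Menten rate with fixed
# substrate level s and Michaelis constant km
def rate(vmax, s=2.0, km=1.0):
    return vmax * s / (km + s)

# the rate is proportional to vmax, so its scaled sensitivity is 1
c_vmax = scaled_sensitivity(rate, 10.0)
```

Global and time-dependent properties extend this idea by sampling parameter space and by differentiating trajectories of the dynamic model, respectively.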
Preprocessing and Analysis of LC-MS-Based Proteomic Data
Tsai, Tsung-Heng; Wang, Minkun; Ressom, Habtom W.
2016-01-01
Liquid chromatography coupled with mass spectrometry (LC-MS) has been widely used for profiling protein expression levels. This chapter is focused on LC-MS data preprocessing, which is a crucial step in the analysis of LC-MS-based proteomic data. We provide a high-level overview, highlight associated challenges, and present a step-by-step example for analysis of data from an LC-MS-based untargeted proteomic study. Furthermore, key procedures and relevant issues with the subsequent analysis by multiple reaction monitoring (MRM) are discussed. PMID:26519169
Object-Based Image Analysis Beyond Remote Sensing - the Human Perspective
NASA Astrophysics Data System (ADS)
Blaschke, T.; Lang, S.; Tiede, D.; Papadakis, M.; Györi, A.
2016-06-01
We introduce a prototypical methodological framework for a place-based GIS-RS system for the spatial delineation of place while incorporating spatial analysis and mapping techniques using methods from different fields such as environmental psychology, geography, and computer science. The methodological lynchpin for this to happen - when aiming to delineate place in terms of objects - is object-based image analysis (OBIA).
ERIC Educational Resources Information Center
Yasar, M. Diyaddin
2017-01-01
This study aimed at performing content analysis and meta-analysis on dissertations related to brain-based learning in science education to find out the general trend and tendency of brain-based learning in science education and find out the effect of such studies on achievement and attitude of learners with the ultimate aim of raising awareness…
Effects of Interventions Based in Behavior Analysis on Motor Skill Acquisition: A Meta-Analysis
ERIC Educational Resources Information Center
Alstot, Andrew E.; Kang, Minsoo; Alstot, Crystal D.
2013-01-01
Techniques based in applied behavior analysis (ABA) have been shown to be useful across a variety of settings to improve numerous behaviors. Specifically within physical activity settings, several studies have examined the effect of interventions based in ABA on a variety of motor skills, but the overall effects of these interventions are unknown.…
Model Based Analysis and Test Generation for Flight Software
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep
2009-01-01
We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.
Image analysis and modeling in medical image computing. Recent developments and advances.
Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T
2012-01-01
Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice e.g. to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the grade of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements in clinical routine. In the focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models in the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body. 
Hence, model-based image computing methods are important tools to improve medical diagnostics and patient treatment in future.
2012-01-01
Background: This study illustrates an evidence-based method for the segmentation analysis of patients that could greatly improve the approach to population-based medicine, by filling a gap in the empirical analysis of this topic. Segmentation facilitates individual patient care in the context of the culture, health status, and the health needs of the entire population to which that patient belongs. Because many health systems are engaged in developing better chronic care management initiatives, patient profiles are critical to understanding whether some patients can move toward effective self-management and can play a central role in determining their own care, which fosters a sense of responsibility for their own health. A review of the literature on patient segmentation provided the background for this research. Method: First, we conducted a literature review on patient satisfaction and segmentation to build a survey. Then, we performed 3,461 surveys of outpatient services users. The key structures on which the subjects’ perception of outpatient services was based were extrapolated using principal component factor analysis with varimax rotation. After the factor analysis, segmentation was performed through cluster analysis to better analyze the influence of individual attitudes on the results. Results: Four segments were identified through factor and cluster analysis: the “unpretentious,” the “informed and supported,” the “experts” and the “advanced” patients. Their policies and managerial implications are outlined. Conclusions: With this research, we provide the following: – a method for profiling patients based on common patient satisfaction surveys that is easily replicable in all health systems and contexts; – a proposal for segments based on the results of a broad-based analysis conducted in the Italian National Health System (INHS). Segments represent profiles of patients requiring different strategies for delivering health services. 
Their knowledge and analysis might support an effort to build an effective population-based medicine approach. PMID:23256543
Jiang, Wei; Yu, Weichuan
2017-02-15
In genome-wide association studies (GWASs) of common diseases/traits, we often analyze multiple GWASs with the same phenotype together to discover associated genetic variants with higher power. Since it is difficult to access data with detailed individual measurements, summary-statistics-based meta-analysis methods have become popular to jointly analyze datasets from multiple GWASs. In this paper, we propose a novel summary-statistics-based joint analysis method based on controlling the joint local false discovery rate (Jlfdr). We prove that our method is the most powerful summary-statistics-based joint analysis method when controlling the false discovery rate at a certain level. In particular, the Jlfdr-based method achieves higher power than commonly used meta-analysis methods when analyzing heterogeneous datasets from multiple GWASs. Simulation experiments demonstrate the superior power of our method over meta-analysis methods. Also, our method discovers more associations than meta-analysis methods from empirical datasets of four phenotypes. The R-package is available at: http://bioinformatics.ust.hk/Jlfdr.html . eeyu@ust.hk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
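For contrast with the Jlfdr approach, a minimal sketch of one of the conventional summary-statistics baselines such papers compare against, Stouffer's weighted Z method, is shown below; the per-study z-scores and sample sizes are hypothetical:

```python
import math

def norm_sf(z):
    """Upper-tail probability of the standard normal, via the complementary error function."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def stouffer(z_scores, sample_sizes):
    """Stouffer's weighted Z method: combine per-study z-statistics,
    weighting each study by the square root of its sample size."""
    weights = [math.sqrt(n) for n in sample_sizes]
    z = sum(w * zi for w, zi in zip(weights, z_scores)) / math.sqrt(sum(w * w for w in weights))
    p = 2.0 * norm_sf(abs(z))  # two-sided combined p-value
    return z, p

if __name__ == "__main__":
    # Three hypothetical GWASs of the same variant, consistent direction of effect
    z_comb, p_comb = stouffer([1.8, 2.1, 1.5], [1000, 2500, 800])
    print(round(z_comb, 3), round(p_comb, 5))
```

Note how three individually sub-threshold signals combine into one clearly significant statistic, which is the power gain joint analysis is after.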
1992-12-21
in preparation). Foundations of artificial intelligence. Cambridge, MA: MIT Press. O'Reilly, R. C. (1991). X3DNet: An X-Based Neural Network ... 2.2.3 Trace based protocol analysis ... 2.2.4 Summary of important data features ... 2.3 Tools related to process model testing ... 2.3.1 Tools for building ... algorithm ... 3. Requirements for testing process models using trace based protocol analysis ... 3.1 Definition of trace based protocol analysis (TBPA)
He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z
2013-12-04
Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.
[Big data analysis and evidence-based medicine: controversy or cooperation].
Chen, Xinzu; Hu, Jiankun
2016-01-01
The development of evidence-based medicine marks an important milestone in the transition from empirical medicine to evidence-driven modern medicine. With the explosion of biomedical data, the rise of big data analysis can efficiently address exploratory questions and decision-making issues in biomedicine and healthcare. The current problem in China is that big data analysis is still not well conducted or applied to problems such as clinical decision-making and public health policy; the question should not be a debate over whether big data analysis can replace evidence-based medicine. Therefore, we should clearly understand that, for both evidence-based medicine and big data analysis, the most critical infrastructure is the substantial work of designing, constructing and collecting original databases in China.
Li, Zhiming; Yu, Lan; Wang, Xin; Yu, Haiyang; Gao, Yuanxiang; Ren, Yande; Wang, Gang; Zhou, Xiaoming
2017-11-09
The purpose of this study was to investigate the diagnostic performance of mammographic texture analysis in the differential diagnosis of benign and malignant breast tumors. Digital mammography images were obtained from the Picture Archiving and Communication System at our institute. Texture features of mammographic images were calculated. Mann-Whitney U test was used to identify differences between the benign and malignant group. The receiver operating characteristic (ROC) curve analysis was used to assess the diagnostic performance of texture features. Significant differences of texture features of histogram, gray-level co-occurrence matrix (GLCM) and run length matrix (RLM) were found between the benign and malignant breast group (P < .05). The area under the ROC (AUROC) of histogram, GLCM, and RLM were 0.800, 0.787, and 0.761, with no differences between them (P > .05). The AUROCs of imaging-based diagnosis, texture analysis, and imaging-based diagnosis combined with texture analysis were 0.873, 0.863, and 0.961, respectively. When imaging-based diagnosis was combined with texture analysis, the AUROC was higher than that of imaging-based diagnosis or texture analysis (P < .05). Mammographic texture analysis is a reliable technique for differential diagnosis of benign and malignant breast tumors. Furthermore, the combination of imaging-based diagnosis and texture analysis can significantly improve diagnostic performance. Copyright © 2017 Elsevier Inc. All rights reserved.
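A minimal sketch of the GLCM texture features used in studies like this one, assuming the image has already been quantized to a small number of gray levels; the feature definitions follow the classic Haralick formulation, and the demo images are synthetic:

```python
import numpy as np

def glcm(image, levels=8, dx=1, dy=0):
    """Gray-level co-occurrence matrix for one pixel offset (dx, dy),
    normalised to a joint probability table."""
    h, w = image.shape
    g = np.zeros((levels, levels))
    for y in range(h - dy):
        for x in range(w - dx):
            g[image[y, x], image[y + dy, x + dx]] += 1
    return g / g.sum()

def glcm_features(p):
    """Two classic Haralick-style texture features from a GLCM p."""
    i, j = np.indices(p.shape)
    contrast = np.sum(p * (i - j) ** 2)            # penalises unequal neighbours
    homogeneity = np.sum(p / (1.0 + np.abs(i - j)))  # rewards equal neighbours
    return contrast, homogeneity

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    smooth = np.repeat(np.arange(8), 8).reshape(8, 8)  # each row one gray level
    rough = rng.integers(0, 8, (8, 8))                 # random texture
    c_s, _ = glcm_features(glcm(smooth))
    c_r, _ = glcm_features(glcm(rough))
    print(round(c_s, 3), round(c_r, 3))  # smooth texture has lower GLCM contrast
```

In a diagnostic pipeline these scalar features would feed the Mann-Whitney comparison and ROC analysis the abstract describes.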
A Bootstrap Generalization of Modified Parallel Analysis for IRT Dimensionality Assessment
ERIC Educational Resources Information Center
Finch, Holmes; Monahan, Patrick
2008-01-01
This article introduces a bootstrap generalization to the Modified Parallel Analysis (MPA) method of test dimensionality assessment using factor analysis. This methodology, based on the use of Marginal Maximum Likelihood nonlinear factor analysis, provides for the calculation of a test statistic based on a parametric bootstrap using the MPA…
Markov Random Fields, Stochastic Quantization and Image Analysis
1990-01-01
Markov random fields based on the lattice Z2 have been extensively used in image analysis in a Bayesian framework as a-priori models for the ... of Image Analysis can be given some fundamental justification then there is a remarkable connection between Probabilistic Image Analysis, Statistical Mechanics and Lattice-based Euclidean Quantum Field Theory.
Meta-analysis of pathway enrichment: combining independent and dependent omics data sets.
Kaever, Alexander; Landesfeind, Manuel; Feussner, Kirstin; Morgenstern, Burkhard; Feussner, Ivo; Meinicke, Peter
2014-01-01
A major challenge in current systems biology is the combination and integrative analysis of large data sets obtained from different high-throughput omics platforms, such as mass spectrometry based Metabolomics and Proteomics or DNA microarray or RNA-seq-based Transcriptomics. Especially in the case of non-targeted Metabolomics experiments, where it is often impossible to unambiguously map ion features from mass spectrometry analysis to metabolites, the integration of more reliable omics technologies is highly desirable. A popular method for the knowledge-based interpretation of single data sets is the (Gene) Set Enrichment Analysis. In order to combine the results from different analyses, we introduce a methodical framework for the meta-analysis of p-values obtained from Pathway Enrichment Analysis (Set Enrichment Analysis based on pathways) of multiple dependent or independent data sets from different omics platforms. For dependent data sets, e.g. obtained from the same biological samples, the framework utilizes a covariance estimation procedure based on the nonsignificant pathways in single data set enrichment analysis. The framework is evaluated and applied in the joint analysis of Metabolomics mass spectrometry and Transcriptomics DNA microarray data in the context of plant wounding. In extensive studies of simulated data set dependence, the introduced correlation could be fully reconstructed by means of the covariance estimation based on pathway enrichment. By restricting the range of p-values of pathways considered in the estimation, the overestimation of correlation, which is introduced by the significant pathways, could be reduced. When applying the proposed methods to the real data sets, the meta-analysis was shown not only to be a powerful tool to investigate the correlation between different data sets and summarize the results of multiple analyses but also to distinguish experiment-specific key pathways.
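The per-dataset building block of such a framework, set (pathway) enrichment via the one-sided hypergeometric test, can be sketched as follows; the gene counts are hypothetical:

```python
from math import comb

def enrichment_p(n_hits, set_size, n_selected, universe):
    """One-sided hypergeometric p-value for pathway over-representation:
    probability of observing at least n_hits pathway members among
    n_selected significant features drawn from a universe of `universe`
    features, of which `set_size` belong to the pathway."""
    total = comb(universe, n_selected)
    p = 0.0
    for k in range(n_hits, min(set_size, n_selected) + 1):
        p += comb(set_size, k) * comb(universe - set_size, n_selected - k) / total
    return p

if __name__ == "__main__":
    # Hypothetical pathway: 40 of 5000 genes; 12 of our 100 significant genes hit it,
    # against an expectation of 100 * 40 / 5000 = 0.8 hits by chance.
    print(enrichment_p(12, 40, 100, 5000))
```

The meta-analysis layer the authors propose then combines such per-platform pathway p-values, with a covariance correction when the data sets are dependent.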
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herrera, Joshua M.
2015-03-01
This report is an analysis of the means of egress and life safety requirements for the laboratory building. The building is located at Sandia National Laboratories (SNL) in Albuquerque, NM. The report includes a prescriptive-based analysis as well as a performance-based analysis. Following the analysis are appendices which contain maps of the laboratory building used throughout the analysis. The top of all the maps is assumed to be north.
Recent Electrochemical and Optical Sensors in Flow-Based Analysis
Chailapakul, Orawon; Ngamukot, Passapol; Yoosamran, Alongkorn; Siangproh, Weena; Wangfuengkanagul, Nattakarn
2006-01-01
Some recent analytical sensors based on electrochemical and optical detection coupled with different flow techniques have been chosen in this overview. A brief description of fundamental concepts and applications of each flow technique, such as flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA), and multipumped FIA (MPFIA) were reviewed.
An Analysis of the Formal Features of "Reality-Based" Television Programs.
ERIC Educational Resources Information Center
Neapolitan, D. M.
Reality-based television programs showcase actual footage or recreate actual events, and include programs such as "America's Most Wanted" and "Rescue 911." To identify the features that typify reality-based television programs, this study conducted an analysis of formal features used in reality-based programs. Formal features…
Relationship between Preferred and Actual Opinions about Inquiry-Based Instruction Classroom
ERIC Educational Resources Information Center
Nuangchalerm, Prasart
2017-01-01
Based on 10 preservice science teachers in 4 schools, this study presents a detailed analysis of how preservice teacher expectation interacts with school practicum and authentic classroom action of inquiry-based instruction. Classroom observation, lesson plan analysis, and interviews revealed that inquiry-based instruction in the expectation and…
Structural landscape of base pairs containing post-transcriptional modifications in RNA
Seelam, Preethi P.; Sharma, Purshotam
2017-01-01
Base pairs involving post-transcriptionally modified nucleobases are believed to play important roles in a wide variety of functional RNAs. Here we present our attempts toward understanding the structural and functional role of naturally occurring modified base pairs using a combination of X-ray crystal structure database analysis, sequence analysis, and advanced quantum chemical methods. Our bioinformatics analysis reveals that despite their presence in all major secondary structural elements, modified base pairs are most prevalent in tRNA crystal structures and most commonly involve guanine or uridine modifications. Further, analysis of tRNA sequences reveals additional examples of modified base pairs at structurally conserved tRNA regions and highlights the conservation patterns of these base pairs in three domains of life. Comparison of structures and binding energies of modified base pairs with their unmodified counterparts, using quantum chemical methods, allowed us to classify the base modifications in terms of the nature of their electronic structure effects on base-pairing. Analysis of specific structural contexts of modified base pairs in RNA crystal structures revealed several interesting scenarios, including those at the tRNA:rRNA interface, antibiotic-binding sites on the ribosome, and the three-way junctions within tRNA. These scenarios, when analyzed in the context of available experimental data, allowed us to correlate the occurrence and strength of modified base pairs with their specific functional roles. Overall, our study highlights the structural importance of modified base pairs in RNA and points toward the need for greater appreciation of the role of modified bases and their interactions, in the context of many biological processes involving RNA. PMID:28341704
Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty
Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.
2016-09-12
Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as a function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.
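The variance-based global indices this work builds on (not the DSA extension itself) can be sketched with a standard pick-freeze Monte Carlo estimator of first-order Sobol indices; the additive test function is chosen so the true indices are known (0.2, 0.8 and 0):

```python
import numpy as np

def first_order_sobol(f, d, n=20000, seed=0):
    """Monte Carlo pick-freeze estimate of first-order Sobol indices
    for f defined on the unit hypercube [0, 1]^d."""
    rng = np.random.default_rng(seed)
    A = rng.random((n, d))
    B = rng.random((n, d))
    fA = f(A)
    fB = f(B)
    var = fA.var()
    indices = []
    for i in range(d):
        ABi = B.copy()
        ABi[:, i] = A[:, i]  # freeze input i from A, keep the rest from B
        indices.append(np.mean(fA * (f(ABi) - fB)) / var)
    return np.array(indices)

if __name__ == "__main__":
    # Additive test function: variance contributions 1/12 and 4/12, input 2 inert
    f = lambda x: x[:, 0] + 2.0 * x[:, 1]
    print(np.round(first_order_sobol(f, 3), 2))
```

Ranking inputs by these indices is exactly the "contribution to output variance" criterion the abstract describes; DSA generalises the index into a function of the achievable uncertainty reduction.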
Voulgarelis, Dimitrios; Velayudhan, Ajoy; Smith, Frank
2017-01-01
Agent-based models provide a formidable tool for exploring the complex and emergent behaviour of biological systems and yield accurate results, but at the cost of substantial computational power and time for subsequent analysis. Equation-based models, on the other hand, can more easily be used for complex analysis on a much shorter timescale. This paper formulates an ordinary differential equation and stochastic differential equation model to capture the behaviour of an existing agent-based model of tumour cell reprogramming and applies it to the optimization of possible treatments as well as dosage sensitivity analysis. For certain values of the parameter space, a close match between the equation-based and agent-based models is achieved. The need for division of labour between the two approaches is explored. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
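A toy version of such an equation-based surrogate, a logistic tumour-growth ODE with a dose-proportional kill term integrated by forward Euler, illustrates the dosage-sensitivity sweep; all parameter values are invented for illustration:

```python
def simulate(dose, days=100, dt=0.1, r=0.2, K=1e4, kill=0.05, n0=100.0):
    """Forward-Euler integration of a toy logistic tumour ODE with a
    dose-proportional kill term: dn/dt = r*n*(1 - n/K) - kill*dose*n."""
    n = n0
    for _ in range(int(days / dt)):
        n += dt * (r * n * (1.0 - n / K) - kill * dose * n)
        n = max(n, 0.0)  # cell counts cannot go negative
    return n

if __name__ == "__main__":
    # Dosage sensitivity: final tumour burden for a sweep of doses.
    # Net growth rate flips sign near dose = r / kill = 4.
    for dose in (0.0, 2.0, 6.0):
        print(dose, round(simulate(dose), 1))
```

Sweeps like this are cheap on the ODE surrogate, whereas the same exploration on the agent-based model would require a full simulation per dose.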
An Analysis of Base Pressure at Supersonic Velocities and Comparison with Experiment
NASA Technical Reports Server (NTRS)
Chapman, Dean R
1951-01-01
In the first part of the investigation an analysis is made of base pressure in an inviscid fluid, both for two-dimensional and axially symmetric flow. It is shown that for two-dimensional flow, and also for the flow over a body of revolution with a cylindrical sting attached to the base, there are an infinite number of possible solutions satisfying all necessary boundary conditions at any given free-stream Mach number. For the particular case of a body having no sting attached only one solution is possible in an inviscid flow, but it corresponds to zero base drag. Accordingly, it is concluded that a strictly inviscid-fluid theory cannot be satisfactory for practical applications. An approximate semi-empirical analysis for base pressure in a viscous fluid is developed in a second part of the investigation. The semi-empirical analysis is based partly on inviscid-flow calculations.
EPA released the final report, Analysis of Fat Intake Based on USDA’s 1994-1996, 1998 Continuing Survey of Food Intakes by Individuals (CSFII, Final Report). For this report, the EPA conducted an analysis of fat consumption across the U.S. population based on data derived...
ERIC Educational Resources Information Center
Bosuwon, Takwa; Woodrow, Lindy
2009-01-01
This paper reports on a needs analysis underlying a proposed business English reading course using a problem-based learning approach designed to enhance English reading abilities of Thai undergraduate students. As part of a work in progress, the needs analysis survey was done prior to the course design with the major stakeholders in business and…
ERIC Educational Resources Information Center
Nugent, William R.
2017-01-01
Meta-analysis is a significant methodological advance that is increasingly important in research synthesis. Fundamental to meta-analysis is the presumption that effect sizes, such as the standardized mean difference (SMD), based on scores from different measures are comparable. It has been argued that population observed score SMDs based on scores…
Komorowski, Dariusz; Pietraszek, Stanislaw
2016-01-01
This paper presents the analysis of multi-channel electrogastrographic (EGG) signals using the continuous wavelet transform based on the fast Fourier transform (CWTFT). The EGG analysis was based on the determination of several signal parameters such as dominant frequency (DF), dominant power (DP) and index of normogastria (NI). The continuous wavelet transform (CWT) allows better localization of the frequency components in the analyzed signals than the commonly used short-time Fourier transform (STFT). Such an analysis is possible by means of a variable-width window, which corresponds to the time scale of observation (analysis). Wavelet analysis allows long time windows where more precise low-frequency information is needed, and shorter windows where high-frequency information is needed. Since the classic CWT requires considerable computing power and time, especially when applied to the analysis of long signals, the authors used a CWT analysis based on the fast Fourier transform (FFT). The CWT was obtained using properties of the circular convolution to improve the speed of calculation. This method yields results for relatively long EGG records in a fairly short time, much faster than classical methods based on running spectrum analysis (RSA). The authors thus demonstrate the possibility of a parametric analysis of EGG signals using the continuous wavelet transform, which is a completely new solution. The results obtained with the described method are shown in an example analysis of four-channel EGG recordings, performed for a non-caloric meal.
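The FFT-based CWT the authors describe can be sketched for a Morlet wavelet: each scale is a circular convolution, computed as a pointwise product in the frequency domain. The wavelet normalisation below is simplified for illustration, and the demo uses a synthetic test tone rather than real EGG data:

```python
import numpy as np

def cwt_fft(signal, scales, w0=6.0):
    """Continuous wavelet transform with a Morlet wavelet, computed per
    scale as a circular convolution via the FFT (the speed-up over a
    direct time-domain CWT)."""
    n = len(signal)
    sig_hat = np.fft.fft(signal)
    freqs = 2.0 * np.pi * np.fft.fftfreq(n)  # angular frequency per sample
    out = np.empty((len(scales), n), dtype=complex)
    for k, s in enumerate(scales):
        # Frequency response of the (analytic, approximate) Morlet wavelet
        psi_hat = np.pi ** -0.25 * np.exp(-0.5 * (s * freqs - w0) ** 2) * (freqs > 0)
        out[k] = np.fft.ifft(sig_hat * np.conj(psi_hat) * np.sqrt(s))
    return out

if __name__ == "__main__":
    fs = 256.0
    t = np.arange(1024) / fs
    x = np.sin(2 * np.pi * 3.0 * t)          # 3 Hz test tone
    scales = np.array([30.0, 81.5, 250.0])   # middle scale targets ~3 Hz for w0=6
    W = cwt_fft(x, scales)
    power = (np.abs(W) ** 2).mean(axis=1)
    print(power.argmax())  # index of the scale carrying the most energy
```

For EGG the same machinery would be run on much slower signals (normogastria is around 3 cycles per minute), which is exactly where long records make the FFT route pay off.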
Systematic Analysis of Icotinib Treatment for Patients with Non-Small Cell Lung Cancer.
Shi, Bing; Zhang, Xiu-Bing; Xu, Jian; Huang, Xin-En
2015-01-01
This analysis was conducted to evaluate the efficacy and safety of icotinib-based regimens in treating patients with non-small cell lung cancer (NSCLC). Clinical studies evaluating the efficacy and safety of icotinib-based regimens with regard to response and safety for patients with NSCLC were identified using a predefined search strategy, and pooled response rates were calculated. Seven clinical studies, including 5,985 Chinese patients with NSCLC, were considered eligible for inclusion. The pooled analysis suggested that, in all patients, the positive response rate was 30.1% (1,803/5,985) with icotinib-based regimens. Mild skin itching, rashes and diarrhea were the main side effects. No grade III or IV renal or liver toxicity was observed. No treatment-related death occurred in patients treated with icotinib-based regimens. This evidence-based analysis suggests that icotinib-based regimens are associated with a modest response rate and acceptable toxicity for treating Chinese patients with NSCLC.
Evaluating Web-Based Nursing Education's Effects: A Systematic Review and Meta-Analysis.
Kang, Jiwon; Seomun, GyeongAe
2017-09-01
This systematic review and meta-analysis investigated whether using web-based nursing educational programs increases a participant's knowledge and clinical performance. We performed a meta-analysis of studies published between January 2000 and July 2016 and identified through RISS, CINAHL, ProQuest Central, Embase, the Cochrane Library, and PubMed. Eleven studies were eligible for inclusion in this analysis. The results of the meta-analysis demonstrated significant differences not only for the overall effect but also specifically for blended programs and short (2 weeks or 4 weeks) intervention periods. To present more evidence supporting the effectiveness of web-based nursing educational programs, further research is warranted.
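The pooling step underneath such a meta-analysis can be sketched with a fixed-effect (inverse-variance) model; the effect sizes and variances below are hypothetical, not taken from the reviewed studies:

```python
import math

def fixed_effect_pool(effects, variances):
    """Inverse-variance (fixed-effect) pooling of standardized mean
    differences: each study is weighted by 1/variance, and a 95%
    confidence interval is built from the pooled standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

if __name__ == "__main__":
    # Hypothetical SMDs (web-based vs. conventional teaching) and their variances
    d, ci = fixed_effect_pool([0.35, 0.52, 0.20, 0.41], [0.04, 0.02, 0.06, 0.03])
    print(round(d, 3), tuple(round(x, 3) for x in ci))
```

A review like this one would typically also report heterogeneity and may prefer a random-effects model when study populations differ; the fixed-effect version is shown only as the simplest pooling rule.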
A framework for graph-based synthesis, analysis, and visualization of HPC cluster job data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayo, Jackson R.; Kegelmeyer, W. Philip, Jr.; Wong, Matthew H.
The monitoring and system analysis of high performance computing (HPC) clusters is of increasing importance to the HPC community. Analysis of HPC job data can be used to characterize system usage and diagnose and examine failure modes and their effects. This analysis is not straightforward, however, due to the complex relationships that exist between jobs. These relationships are based on a number of factors, including shared compute nodes between jobs, proximity of jobs in time, etc. Graph-based techniques represent an approach that is particularly well suited to this problem, and provide an effective technique for discovering important relationships in job queuing and execution data. The efficacy of these techniques is rooted in the use of a semantic graph as a knowledge representation tool. In a semantic graph job data, represented in a combination of numerical and textual forms, can be flexibly processed into edges, with corresponding weights, expressing relationships between jobs, nodes, users, and other relevant entities. This graph-based representation permits formal manipulation by a number of analysis algorithms. This report presents a methodology and software implementation that leverages semantic graph-based techniques for the system-level monitoring and analysis of HPC clusters based on job queuing and execution data. Ontology development and graph synthesis is discussed with respect to the domain of HPC job data. The framework developed automates the synthesis of graphs from a database of job information. It also provides a front end, enabling visualization of the synthesized graphs. Additionally, an analysis engine is incorporated that provides performance analysis, graph-based clustering, and failure prediction capabilities for HPC systems.
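One edge type of such a semantic graph, jobs linked by shared compute nodes, can be sketched directly; the job-to-node allocations below are invented:

```python
from collections import defaultdict
from itertools import combinations

def shared_node_graph(jobs):
    """Build one weighted slice of a job semantic graph: an edge between
    two jobs whose node allocations overlap, weighted by the overlap size.
    jobs: dict mapping job id -> set of compute node names."""
    edges = defaultdict(int)
    for a, b in combinations(sorted(jobs), 2):
        overlap = len(jobs[a] & jobs[b])
        if overlap:
            edges[(a, b)] = overlap
    return dict(edges)

if __name__ == "__main__":
    jobs = {
        "job1": {"n01", "n02", "n03"},
        "job2": {"n03", "n04"},   # shares node n03 with job1
        "job3": {"n05"},          # isolated: no shared hardware
    }
    print(shared_node_graph(jobs))  # {('job1', 'job2'): 1}
```

A full framework would synthesize many such edge types (temporal proximity, shared users, queue membership) into one graph before running clustering or failure-prediction analyses.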
Van Neste, Christophe; Gansemans, Yannick; De Coninck, Dieter; Van Hoofstat, David; Van Criekinge, Wim; Deforce, Dieter; Van Nieuwerburgh, Filip
2015-03-01
Routine use of massively parallel sequencing (MPS) for forensic genomics is on the horizon. In the last few years, several algorithms and workflows have been developed to analyze forensic MPS data. However, none have yet been tailored to the needs of the forensic analyst who does not possess an extensive bioinformatics background. We developed our previously published forensic MPS data analysis framework MyFLq (My-Forensic-Loci-queries) into an open-source, user-friendly, web-based application. It can be installed as a standalone web application, or run directly from the Illumina BaseSpace environment. In the former, laboratories can keep their data on-site, while in the latter, data from forensic samples that are sequenced on an Illumina sequencer can be uploaded to BaseSpace during acquisition, and can subsequently be analyzed using the published MyFLq BaseSpace application. Additional features were implemented, such as an interactive graphical report of the results, an interactive threshold selection bar, and an allele length-based analysis in addition to the sequence-based analysis. Practical use of the application is demonstrated through the analysis of four 16-plex short tandem repeat (STR) samples, showing the complementarity between the sequence- and length-based analyses of the same MPS data. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
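The complementarity between length-based and sequence-based typing can be shown with a toy example: two isoalleles of identical length that differ by an internal variant collapse into one length-based allele but remain distinct by sequence (the reads below are invented):

```python
from collections import Counter

def length_based(reads):
    """Allele calls by repeat-region length only, as in classic
    capillary-electrophoresis STR typing."""
    return Counter(len(r) for r in reads)

def sequence_based(reads):
    """Allele calls by full sequence: same-length isoalleles that differ
    internally are resolved, which length-based typing cannot do."""
    return Counter(reads)

if __name__ == "__main__":
    # Hypothetical MPS reads: two same-length alleles with an internal SNP
    reads = ["ATCTATCTATCT", "ATCTATCTATCT", "ATCTATGTATCT", "ATCTATGTATCT"]
    print(len(length_based(reads)), "length allele(s);",
          len(sequence_based(reads)), "sequence allele(s)")
```

Reporting both views of the same MPS data, as the application does, keeps results comparable with legacy length-based databases while exposing the extra discrimination of sequence.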
Performance analysis and dynamic modeling of a single-spool turbojet engine
NASA Astrophysics Data System (ADS)
Andrei, Irina-Carmen; Toader, Adrian; Stroe, Gabriela; Frunzulica, Florin
2017-01-01
The purposes of modeling and simulation of a turbojet engine are steady-state analysis and transient analysis. The steady-state analysis investigates the operating (equilibrium) regimes and is based on appropriate modeling of the turbojet engine at design and off-design regimes; it yields the performance analysis, concluded by the engine's operational maps (i.e., the altitude map, velocity map and speed map) and the engine's universal map. The mathematical model that allows the calculation of the design and off-design performances of a single-spool turbojet is detailed. An in-house code was developed, and it was calibrated against the J85 turbojet engine as the test case. The dynamic model of the turbojet engine is obtained from the energy balance equations for the compressor, combustor and turbine, as the engine's main parts. The transient analysis, which is based on appropriate modeling of the engine and its main parts, expresses the dynamic behavior of the turbojet engine and, further, provides details regarding the engine's control. The aim of the dynamic analysis is to determine a control program for the turbojet, based on the results provided by the performance analysis. In the case of a single-spool turbojet engine with fixed nozzle geometry, the thrust is controlled by one parameter, the fuel flow rate. The design and management of the aircraft engine controls are based on the results of the transient analysis. The construction of the design model is complex, since it is based on both steady-state and transient analysis, further allowing flight path cycle analysis and optimizations. This paper presents numerical simulations for a single-spool turbojet engine (J85 as test case), with appropriate modeling for steady-state and dynamic analysis.
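The energy-balance view of spool dynamics can be illustrated with a minimal Euler integration of the rotor equation I·ω·dω/dt = P_turbine − P_compressor. The inertia, power laws, and fuel step below are invented stand-ins for real component maps, not J85 data:

```python
I = 0.05           # spool moment of inertia, kg·m² (invented)
omega = 800.0      # initial spool speed, rad/s
dt = 0.001         # integration step, s

def p_turbine(fuel_flow):
    return 4.0e4 * fuel_flow       # toy map: turbine power tracks fuel flow

def p_compressor(omega):
    return 3.0e-5 * omega ** 3     # toy map: compressor power ~ speed cubed

# Transient response to a step in fuel flow rate, the single control input
# of a fixed-nozzle single-spool turbojet.
for _ in range(5000):              # 5 s of simulated time
    fuel = 1.0                     # step input
    domega = (p_turbine(fuel) - p_compressor(omega)) / (I * omega)
    omega += domega * dt
# omega settles near the equilibrium (4e4 / 3e-5) ** (1/3) ≈ 1100 rad/s
```

A real transient model would replace the two toy power laws with interpolated compressor and turbine maps and add the combustor energy balance.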
ERIC Educational Resources Information Center
Bhathal, Ragbir; Sharma, Manjula D.; Mendez, Alberto
2010-01-01
This paper describes an educational analysis of a first year physics experiment on standing waves for engineering students. The educational analysis is based on the ACELL (Advancing Chemistry by Enhancing Learning in the Laboratory) approach which includes a statement of educational objectives and an analysis of student learning experiences. The…
Understanding Online Teacher Best Practices: A Thematic Analysis to Improve Learning
ERIC Educational Resources Information Center
Corry, Michael; Ianacone, Robert; Stella, Julie
2014-01-01
The purpose of this study was to examine brick-and-mortar and online teacher best practice themes using thematic analysis and a newly developed theory-based analytic process entitled Synthesized Thematic Analysis Criteria (STAC). The STAC was developed to facilitate the meaningful thematic analysis of research based best practices of K-12…
ANALYSIS/PLOT: a graphics package for use with the SORT/ANALYSIS data bases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sady, C.A.
1983-08-01
This report describes a graphics package that is used with the SORT/ANALYSIS data bases. The data listed by the SORT/ANALYSIS program can be presented in pie, bar, line, or Gantt chart form. Instructions for the use of the plotting program and descriptions of the subroutines are given in the report.
Proceedings of the 6th annual Speakeasy conference. [Chicago, August 17-18, 1978
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1978-01-01
This meeting on the Speakeasy programming language and its applications included papers on the following subjects: graphics (graphics under Speakeasy, Speakeasy on a mini, color graphics), time series (OASIS - a user-oriented system at USDA, writing input-burdened linkules), applications (weather and crop yield analysis system, property investment analysis system), data bases under Speakeasy (relational data base, applications of relational data bases), survey analysis (survey analysis package from Liege, sic and its future under Speakeasy), and new features in Speakeasy (partial differential equations, the Speakeasy compiler and optimization). (RWR)
NASA Astrophysics Data System (ADS)
Wang, Hongliang; Liu, Baohua; Ding, Zhongjun; Wang, Xiangxin
2017-02-01
Absorption-based optical sensors have been developed for the determination of water pH. In this paper, based on the preparation of a transparent sol-gel thin film with a phenol red (PR) indicator, several calculation methods, including simple linear regression analysis, quadratic regression analysis and dual-wavelength absorbance ratio analysis, were used to calculate water pH. Results of MSSRR show that dual-wavelength absorbance ratio analysis can improve the calculation accuracy of water pH in long-term measurement.
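A minimal sketch of the dual-wavelength absorbance-ratio idea: calibrate pH against the ratio of absorbances at two wavelengths with an ordinary least-squares line. The calibration triples below are invented and are not the paper's data:

```python
# (pH, absorbance at wavelength 1, absorbance at wavelength 2) -- invented.
cal = [(4.0, 0.20, 0.80), (6.0, 0.45, 0.60), (8.0, 0.80, 0.40)]

xs = [a1 / a2 for _, a1, a2 in cal]          # absorbance ratios
ys = [ph for ph, _, _ in cal]

# Simple linear regression of pH on the ratio.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx

def ph_from_ratio(a1, a2):
    """Predict pH from one pair of absorbance readings."""
    return slope * (a1 / a2) + intercept
```

The ratio cancels common-mode drift (source intensity, film thickness), which is why the ratiometric form tends to hold up better in long-term measurement than a single-wavelength regression.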
Developing robust recurrence plot analysis techniques for investigating infant respiratory patterns.
Terrill, Philip I; Wilson, Stephen; Suresh, Sadasivam; Cooper, David M
2007-01-01
Recurrence plot analysis is a useful non-linear analysis tool. There are still no well formalised procedures for carrying out this analysis on measured physiological data, and systemising the analysis is often difficult. In this paper, recurrence-based embedding is compared to radius-based embedding by studying a logistic attractor and measured breathing data collected from sleeping human infants. Recurrence-based embedding appears to be a more robust way of carrying out a recurrence analysis when attractor size is likely to differ between datasets. In the infant breathing data, the radius measure calculated at a fixed recurrence, scaled by average respiratory period, allows accurate discrimination of active sleep from quiet sleep states (AUC = 0.975, Sn = 0.98, Sp = 0.94).
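Recurrence-based embedding, as opposed to fixing a radius, can be sketched by choosing the radius that yields a target recurrence rate. The logistic-map series and parameter choices below are illustrative only:

```python
def embed(x, dim=2, tau=1):
    """Time-delay embedding of a scalar series into dim-dimensional vectors."""
    n = len(x) - (dim - 1) * tau
    return [tuple(x[i + k * tau] for k in range(dim)) for i in range(n)]

def radius_for_recurrence(points, target=0.1):
    """Radius achieving a fixed recurrence rate: the target quantile of all
    pairwise Chebyshev distances (recurrence-based, not radius-based)."""
    dists = sorted(
        max(abs(a - b) for a, b in zip(p, q))
        for i, p in enumerate(points)
        for q in points[i + 1:]
    )
    return dists[int(target * (len(dists) - 1))]

# Logistic map in its chaotic regime as a toy attractor.
x, series = 0.4, []
for _ in range(200):
    x = 3.9 * x * (1 - x)
    series.append(x)

pts = embed(series)
r = radius_for_recurrence(pts, target=0.1)   # 10% recurrence rate
```

Because the radius is defined by a quantile of the distance distribution, it automatically rescales when attractor size differs between datasets, which is the robustness property the abstract describes.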
Information Leakage Analysis by Abstract Interpretation
NASA Astrophysics Data System (ADS)
Zanioli, Matteo; Cortesi, Agostino
Protecting the confidentiality of information stored in a computer system or transmitted over a public network is a relevant problem in computer security. The approach of information flow analysis involves performing a static analysis of the program with the aim of proving that there will be no leaks of sensitive information. In this paper we propose a new domain that combines variable dependency analysis, based on propositional formulas, and variables' value analysis, based on polyhedra. The resulting analysis is strictly more accurate than state-of-the-art abstract-interpretation-based analyses for information leakage detection. Its modular construction makes it possible to manage the trade-off between efficiency and accuracy by tuning the granularity of the abstraction and the complexity of the abstract operators.
NASA Astrophysics Data System (ADS)
Keyport, Ren N.; Oommen, Thomas; Martha, Tapas R.; Sajinkumar, K. S.; Gierke, John S.
2018-02-01
A comparative analysis of landslides detected by pixel-based and object-oriented analysis (OOA) methods was performed using very high-resolution (VHR) remotely sensed aerial images for San Juan La Laguna, Guatemala, which witnessed widespread devastation during the 2005 Hurricane Stan. A 3-band orthophoto of 0.5 m spatial resolution together with a field-based inventory of 115 landslides were used for the analysis. A binary reference was assigned, with a value of zero for landslide and unity for non-landslide pixels. The pixel-based analysis was performed using unsupervised classification, which resulted in 11 different trial classes. Detection of landslides using OOA includes 2-step K-means clustering to eliminate regions based on brightness, followed by elimination of false positives using object properties such as rectangular fit, compactness, length/width ratio, mean difference of objects, and slope angle. Both overall accuracy and F-score for OOA methods outperformed pixel-based unsupervised classification methods in both landslide and non-landslide classes. The overall accuracy for OOA and pixel-based unsupervised classification was 96.5% and 94.3%, respectively, whereas the best F-scores for landslide identification for OOA and pixel-based unsupervised methods were 84.3% and 77.9%, respectively. Results indicate that OOA is able to identify the majority of landslides with few false positives when compared to pixel-based unsupervised classification.
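The first OOA elimination step, brightness-based K-means, can be sketched with a one-dimensional two-cluster version. The pixel values are synthetic and the loop is a bare-bones stand-in for a real K-means implementation:

```python
def two_means(values, iters=20):
    """1-D K-means with K=2: split values into a dark and a bright cluster."""
    c0, c1 = min(values), max(values)            # initial centroids
    for _ in range(iters):
        g0 = [v for v in values if abs(v - c0) <= abs(v - c1)]
        g1 = [v for v in values if abs(v - c0) > abs(v - c1)]
        c0 = sum(g0) / len(g0)                   # update centroids
        c1 = sum(g1) / len(g1)
    return c0, c1

# Synthetic brightness values: dark vegetation vs. bright landslide scars.
pixels = [22, 25, 30, 28, 200, 210, 190, 205]
dark, bright = two_means(pixels)

# Keep only pixels assigned to the bright cluster as landslide candidates.
candidates = [p for p in pixels if abs(p - bright) < abs(p - dark)]
```

In the published workflow this brightness cut is only the first filter; the candidate objects are then screened on shape properties (rectangular fit, compactness, length/width ratio) and slope angle.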
Segmentation-free image processing and analysis of precipitate shapes in 2D and 3D
NASA Astrophysics Data System (ADS)
Bales, Ben; Pollock, Tresa; Petzold, Linda
2017-06-01
Segmentation-based image analysis techniques are routinely employed for quantitative analysis of complex microstructures containing two or more phases. The primary advantage of these approaches is that spatial information on the distribution of phases is retained, enabling subjective judgements of the quality of the segmentation and the subsequent analysis process. The downside is that computing micrograph segmentations with data from morphologically complex microstructures gathered with error-prone detectors is challenging and, if no special care is taken, the artifacts of the segmentation will make any subsequent analysis and conclusions uncertain. In this paper we demonstrate, using a two-phase nickel-base superalloy microstructure as a model system, a new methodology for analysis of precipitate shapes using a segmentation-free approach based on the histogram of oriented gradients feature descriptor, a classic tool in image analysis. The benefits of this methodology for analysis of microstructure in two and three dimensions are demonstrated.
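The histogram of oriented gradients reduces an image patch to orientation-binned, magnitude-weighted gradient votes, with no segmentation step. A minimal sketch on a synthetic 4×4 patch (omitting the cell/block normalization a full HOG adds):

```python
import math

# Synthetic 4x4 patch with a single vertical edge down the middle.
patch = [
    [0, 0, 10, 10],
    [0, 0, 10, 10],
    [0, 0, 10, 10],
    [0, 0, 10, 10],
]

bins = [0.0] * 9                 # 9 orientation bins covering 0..180 degrees
for y in range(1, 3):            # interior pixels only
    for x in range(1, 3):
        gx = patch[y][x + 1] - patch[y][x - 1]   # central differences
        gy = patch[y + 1][x] - patch[y - 1][x]
        mag = math.hypot(gx, gy)
        ang = math.degrees(math.atan2(gy, gx)) % 180
        bins[int(ang // 20) % 9] += mag          # magnitude-weighted vote
# A vertical edge puts all gradient energy into the 0-degree bin.
```

For precipitate-shape analysis, descriptors of this kind are computed over the micrograph and compared statistically, so detector noise perturbs histogram counts rather than flipping per-pixel phase labels.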
NASA Astrophysics Data System (ADS)
Halim, N. Z. A.; Sulaiman, S. A.; Talib, K.; Yusof, O. M.; Wazir, M. A. M.; Adimin, M. K.
2017-10-01
This paper explains the process carried out in identifying the significant role of the NDCDB in Malaysia, specifically in land-based analysis. The research was initially part of a larger research exercise to identify the significance of the NDCDB from the legal, technical, role and land-based analysis perspectives. The research methodology of applying the Delphi technique is substantially discussed in this paper. A heterogeneous panel of 14 experts was created to determine the importance of the NDCDB from the role standpoint. Seven statements pertaining to the significant role of the NDCDB in Malaysia and land-based analysis were established after three rounds of consensus building. The agreed statements provided a clear definition of the important role of the NDCDB in Malaysia and for land-based analysis, which had previously been studied only to a limited extent, leading to an unclear perception among the general public and even the geospatial community. The connection of the statements with disaster management is discussed concisely at the end of the research.
Finite element analysis of container ship's cargo hold using ANSYS and POSEIDON software
NASA Astrophysics Data System (ADS)
Tanny, Tania Tamiz; Akter, Naznin; Amin, Osman Md.
2017-12-01
Nowadays ship structural analysis has become an integral part of preliminary ship design, providing further support for the development and detail design of ship structures. Structural analyses of container ships' cargo holds are carried out to balance their safety and capacity, as those ships are exposed to a high risk of structural damage during voyage. Two different design methodologies have been considered for the structural analysis of a container ship's cargo hold. One is a rule-based methodology and the other is a more conventional software-based analysis. The rule-based analysis is done with DNV GL's software POSEIDON and the conventional package-based analysis is done with the ANSYS structural module. Both methods have been applied to analyze some of the mechanical properties of the model, such as total deformation, stress-strain distribution, Von Mises stress, fatigue, etc., following different design bases and approaches, to provide guidance for further improvements in ship structural design.
Image Based Biomarker of Breast Cancer Risk: Analysis of Risk Disparity among Minority Populations
2013-03-01
PRINCIPAL INVESTIGATOR: Fengshan Liu. (Report documentation form fields omitted.) The recoverable abstract fragments describe identifying the prevalence of women with incomplete visualization of the breast, and the development of a code to estimate breast cancer risks.
Analysis And Augmentation Of Timing Advance Based Geolocation In Lte Cellular Networks
2016-12-01
Naval Postgraduate School, Monterey, California. Dissertation: Analysis and Augmentation of Timing Advance-Based Geolocation in LTE Cellular Networks, by John D. Roth. (Report documentation form fields omitted.)
2007-06-15
From the base case, a series analysis can be performed by varying the inputs to the network to examine the impact of potential changes to improve... successfully interrogated was the primary MOE. Based solely on the cost-benefit analysis, the RSTG found that the addition of an Unmanned Surface... cargo. The CBP uses a risk-based analysis and intelligence to pre-screen, assess and examine 100% of suspicious containers. The remaining cargo is
N'Diaye, Amidou; Haile, Jemanesh K; Cory, Aron T; Clarke, Fran R; Clarke, John M; Knox, Ron E; Pozniak, Curtis J
2017-01-01
Association mapping is usually performed by testing the correlation between a single marker and phenotypes. However, because patterns of variation within genomes are inherited as blocks, clustering markers into haplotypes for genome-wide scans could be a worthwhile approach to improve statistical power to detect associations. The availability of high-density molecular data allows the possibility to assess the potential of both approaches to identify marker-trait associations in durum wheat. In the present study, we used single marker- and haplotype-based approaches to identify loci associated with semolina and pasta colour in durum wheat, the main objective being to evaluate the potential benefits of haplotype-based analysis for identifying quantitative trait loci. One hundred sixty-nine durum lines were genotyped using the Illumina 90K Infinium iSelect assay, and 12,234 polymorphic single nucleotide polymorphism (SNP) markers were generated and used to assess the population structure and the linkage disequilibrium (LD) patterns. A total of 8,581 SNPs previously localized to a high-density consensus map were clustered into 406 haplotype blocks based on the average LD distance of 5.3 cM. Combining multiple SNPs into haplotype blocks increased the average polymorphism information content (PIC) from 0.27 per SNP to 0.50 per haplotype. The haplotype-based analysis identified 12 loci associated with grain pigment colour traits, including the five loci identified by the single marker-based analysis. Furthermore, the haplotype-based analysis resulted in an increase of the phenotypic variance explained (50.4% on average) and the allelic effect (33.7% on average) when compared to single marker analysis. The presence of multiple allelic combinations within each haplotype locus offers potential for screening the most favorable haplotype series and may facilitate marker-assisted selection of grain pigment colour in durum wheat. 
These results suggest a benefit of haplotype-based analysis over single marker analysis to detect loci associated with colour traits in durum wheat.
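The PIC gain from pooling SNPs into haplotype blocks follows from the standard PIC formula, which rewards multi-allelic loci. The allele frequencies below are invented for illustration:

```python
def pic(freqs):
    """Polymorphism information content for a locus with the given
    allele frequencies (Botstein-style formula):
    PIC = 1 - sum(p_i^2) - sum_{i<j} 2 * p_i^2 * p_j^2."""
    s2 = sum(p * p for p in freqs)
    cross = sum(2 * freqs[i] ** 2 * freqs[j] ** 2
                for i in range(len(freqs))
                for j in range(i + 1, len(freqs)))
    return 1 - s2 - cross

snp = pic([0.5, 0.5])             # biallelic SNP at its best-case frequency
hap = pic([0.4, 0.3, 0.2, 0.1])   # four-allele haplotype block (invented)
# snp is 0.375, the theoretical maximum for a biallelic marker;
# the multi-allelic haplotype locus exceeds it.
```

This is the mechanism behind the reported rise in average PIC from 0.27 per SNP to 0.50 per haplotype: combining SNPs multiplies the number of distinguishable alleles per locus.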
Pieterman, Elise D; Budde, Ricardo P J; Robbers-Visser, Daniëlle; van Domburg, Ron T; Helbing, Willem A
2017-09-01
Follow-up of right ventricular performance is important for patients with congenital heart disease. Cardiac magnetic resonance imaging is optimal for this purpose. However, the observer-dependency of manual analysis of right ventricular volumes limits its use. Knowledge-based reconstruction is a new semiautomatic analysis tool that uses a database including knowledge of right ventricular shape in various congenital heart diseases. We evaluated whether knowledge-based reconstruction is a good alternative to conventional analysis. To assess the inter- and intra-observer variability and agreement of knowledge-based versus conventional analysis of magnetic resonance right ventricular volumes, analysis was done by two observers in a mixed group of 22 patients with congenital heart disease affecting right ventricular loading conditions (dextro-transposition of the great arteries and right ventricle to pulmonary artery conduit) and a group of 17 healthy children. We used Bland-Altman analysis and the coefficient of variation. Comparison between the conventional method and the knowledge-based method showed a systematically higher volume for the latter. We found an overestimation of end-diastolic volume (bias -40 ± 24 mL, r = .956), end-systolic volume (bias -34 ± 24 mL, r = .943), and stroke volume (bias -6 ± 17 mL, r = .735), and an underestimation of ejection fraction (bias 7 ± 7%, r = .671), by knowledge-based reconstruction. The intra-observer variability of knowledge-based reconstruction varied, with a coefficient of variation of 9% for end-diastolic volume and 22% for stroke volume. The same trend was noted for inter-observer variability. A systematic difference (overestimation) was noted for right ventricular size as assessed with knowledge-based reconstruction compared with conventional methods for analysis.
Observer variability for the new method was comparable to what has been reported for the right ventricle in children and congenital heart disease with conventional analysis. © 2017 Wiley Periodicals, Inc.
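The Bland-Altman agreement measure used above is simple to compute: the bias is the mean of the paired differences and the limits of agreement are bias ± 1.96 SD. The paired volumes below are invented, not the study's measurements:

```python
# Paired end-diastolic volumes (mL) from two methods -- invented values.
conventional = [120, 95, 140, 110, 130]
knowledge    = [158, 130, 185, 152, 170]

diffs = [c - k for c, k in zip(conventional, knowledge)]
bias = sum(diffs) / len(diffs)                       # mean paired difference

# Sample standard deviation of the differences (n - 1 denominator).
sd = (sum((d - bias) ** 2 for d in diffs) / (len(diffs) - 1)) ** 0.5
loa = (bias - 1.96 * sd, bias + 1.96 * sd)           # limits of agreement
```

A negative bias here, as in the study, means the knowledge-based method systematically returns larger volumes than the conventional one.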
NASA Astrophysics Data System (ADS)
Klügel, J.
2006-12-01
Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures such as dams, transport infrastructures, chemical plants and nuclear power plants. For many applications beyond the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis to be performed. A new method for a probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods such as the MCE methodology. The input data required for the method are entirely based on the information which is necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantages of the method in comparison with traditional PSHA are (1) its flexibility, allowing different probabilistic models for earthquake occurrence to be used and advanced physical models to be incorporated into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals.
The method was applied for the evaluation of the risk of production interruption losses of a nuclear power plant during its residual lifetime.
Aircraft Conceptual Design and Risk Analysis Using Physics-Based Noise Prediction
NASA Technical Reports Server (NTRS)
Olson, Erik D.; Mavris, Dimitri N.
2006-01-01
An approach was developed which allows for design studies of commercial aircraft using physics-based noise analysis methods while retaining the ability to perform the rapid trade-off and risk analysis studies needed at the conceptual design stage. A prototype integrated analysis process was created for computing the total aircraft EPNL at the Federal Aviation Regulations Part 36 certification measurement locations using physics-based methods for fan rotor-stator interaction tones and jet mixing noise. The methodology was then used in combination with design of experiments to create response surface equations (RSEs) for the engine and aircraft performance metrics, geometric constraints and take-off and landing noise levels. In addition, Monte Carlo analysis was used to assess the expected variability of the metrics under the influence of uncertainty, and to determine how the variability is affected by the choice of engine cycle. Finally, the RSEs were used to conduct a series of proof-of-concept conceptual-level design studies demonstrating the utility of the approach. The study found that a key advantage to using physics-based analysis during conceptual design lies in the ability to assess the benefits of new technologies as a function of the design to which they are applied. The greatest difficulty in implementing physics-based analysis proved to be the generation of design geometry at a sufficient level of detail for high-fidelity analysis.
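The response-surface-plus-Monte-Carlo step can be sketched as follows. The quadratic response surface, the engine-cycle parameters (bypass ratio, fan pressure ratio), and their input distributions are all invented for illustration:

```python
import random

random.seed(1)  # reproducible sampling

def rse(bpr, fpr):
    """Toy response surface equation: a noise-like metric as a quadratic in
    two engine-cycle parameters (coefficients are invented)."""
    return 100 - 3.0 * bpr + 2.0 * fpr + 0.1 * bpr * bpr

# Monte Carlo assessment of metric variability under input uncertainty:
# sample the design variables from assumed normal distributions and push
# each sample through the cheap RSE instead of the physics-based code.
samples = [rse(random.gauss(8.0, 0.5), random.gauss(1.6, 0.05))
           for _ in range(5000)]
mean = sum(samples) / len(samples)
spread = max(samples) - min(samples)
```

This is the point of fitting RSEs at the conceptual stage: once the surrogate is built from a designed set of high-fidelity runs, thousands of Monte Carlo evaluations cost almost nothing.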
NASA Technical Reports Server (NTRS)
Joshi, Anjali; Heimdahl, Mats P. E.; Miller, Steven P.; Whalen, Mike W.
2006-01-01
System safety analysis techniques are well established and are used extensively during the design of safety-critical systems. Despite this, most of the techniques are highly subjective and dependent on the skill of the practitioner. Since these analyses are usually based on an informal system model, it is unlikely that they will be complete, consistent, and error free. In fact, the lack of precise models of the system architecture and its failure modes often forces the safety analysts to devote much of their effort to gathering architectural details about the system behavior from several sources and embedding this information in the safety artifacts such as the fault trees. This report describes Model-Based Safety Analysis, an approach in which the system and safety engineers share a common system model created using a model-based development process. By extending the system model with a fault model as well as relevant portions of the physical system to be controlled, automated support can be provided for much of the safety analysis. We believe that by using a common model for both system and safety engineering and automating parts of the safety analysis, we can both reduce the cost and improve the quality of the safety analysis. Here we present our vision of model-based safety analysis and discuss the advantages and challenges in making this approach practical.
Temporal Noise Analysis of Charge-Domain Sampling Readout Circuits for CMOS Image Sensors.
Ge, Xiaoliang; Theuwissen, Albert J P
2018-02-27
This paper presents a temporal noise analysis of charge-domain sampling readout circuits for Complementary Metal-Oxide Semiconductor (CMOS) image sensors. In order to address the trade-off between low input-referred noise and high dynamic range, a Gm-cell-based pixel together with a charge-domain correlated double sampling (CDS) technique has been proposed to provide a way to efficiently embed a tunable conversion gain along the read-out path. Such a readout topology, however, operates in a non-stationary large-signal regime, and the statistical properties of its temporal noise are a function of time. Conventional noise analysis methods for CMOS image sensors are based on steady-state signal models, and therefore cannot be readily applied to Gm-cell-based pixels. In this paper, we develop analysis models for both thermal noise and flicker noise in Gm-cell-based pixels by employing the time-domain linear analysis approach and non-stationary noise analysis theory, which help to quantitatively evaluate the temporal noise characteristics of Gm-cell-based pixels. Both models were numerically computed in MATLAB using design parameters of a prototype chip, and compared with both simulation and experimental results. The good agreement between the theoretical and measurement results verifies the effectiveness of the proposed noise analysis models.
[Development of laboratory sequence analysis software based on WWW and UNIX].
Huang, Y; Gu, J R
2001-01-01
Sequence analysis tools based on WWW and UNIX were developed to meet the needs of molecular genetics research in our laboratory. General principles of computer analysis of DNA and protein sequences are also briefly discussed in this paper.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Huang, Zhenyu; Chavarría-Miranda, Daniel
Contingency analysis is a key function in the Energy Management System (EMS) to assess the impact of various combinations of power system component failures based on state estimation. Contingency analysis is also extensively used in power market operation for feasibility tests of market solutions. High performance computing holds the promise of faster analysis of more contingency cases for the purpose of safe and reliable operation of today's power grids with less operating margin and more intermittent renewable energy sources. This paper evaluates the performance of counter-based dynamic load balancing schemes for massive contingency analysis under different computing environments. Insights from the performance evaluation can be used as guidance for users to select suitable schemes in the application of massive contingency analysis. Case studies, as well as MATLAB simulations, of massive contingency cases using the Western Electricity Coordinating Council power grid model are presented to illustrate the application of high performance computing with counter-based dynamic load balancing schemes.
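A counter-based dynamic load balancing scheme of the kind evaluated here can be sketched with threads sharing a lock-protected work counter; each worker grabs the next case on demand instead of receiving a fixed chunk up front. The squared-number "cases" are stand-ins for contingency solves:

```python
import threading

cases = list(range(100))   # stand-ins for 100 contingency cases
counter = 0                # shared work counter
lock = threading.Lock()
results = {}

def worker():
    global counter
    while True:
        with lock:                     # atomic fetch-and-increment
            i = counter
            counter += 1
        if i >= len(cases):
            return                     # no work left
        results[i] = cases[i] ** 2     # stand-in for one contingency solve

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Because slow cases do not stall a pre-assigned block, this scheme keeps all workers busy when contingency solve times vary widely, which is the motivation for counter-based balancing over static partitioning.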
Design of an image encryption scheme based on a multiple chaotic map
NASA Astrophysics Data System (ADS)
Tong, Xiao-Jun
2013-07-01
In order to solve the problems that chaos degenerates under limited computer precision and that the Cat map has a small key space, this paper presents a chaotic map based on topological conjugacy, whose chaotic characteristics are proved by the Devaney definition. In order to produce a large key space, a Cat map named the block Cat map is also designed for the permutation process, based on multiple-dimensional chaotic maps. The image encryption algorithm is based on permutation-substitution, and each key is controlled by different chaotic maps. Entropy analysis, differential analysis, weak-key analysis, statistical analysis, cipher randomness analysis, and cipher sensitivity analysis depending on key and plaintext are introduced to test the security of the new image encryption scheme. Through comparison of the proposed scheme with AES, DES and Logistic encryption methods, we come to the conclusion that the image encryption method solves the problem of the low precision of one-dimensional chaotic functions and has higher speed and higher security.
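The permutation-substitution structure can be illustrated with a toy cipher driven by a logistic-map keystream. This is not the paper's block Cat map and is not cryptographically secure; it only shows the two-stage structure:

```python
def logistic_stream(x0, n, r=3.99):
    """Keystream bytes from iterating the logistic map (toy key schedule)."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return out

def encrypt(data, key=0.3456):
    ks = logistic_stream(key, len(data))
    # Permutation stage: reorder pixel positions by keystream values.
    order = sorted(range(len(data)), key=lambda i: ks[i])
    permuted = [data[i] for i in order]
    # Substitution stage: XOR each byte with the keystream.
    return [b ^ k for b, k in zip(permuted, ks)]

def decrypt(cipher, key=0.3456):
    ks = logistic_stream(key, len(cipher))
    permuted = [b ^ k for b, k in zip(cipher, ks)]   # undo substitution
    order = sorted(range(len(cipher)), key=lambda i: ks[i])
    plain = [0] * len(cipher)
    for dst, src in enumerate(order):                # undo permutation
        plain[src] = permuted[dst]
    return plain

pixels = [10, 200, 33, 47, 90, 128]                  # toy "image" bytes
roundtrip = decrypt(encrypt(pixels))
```

The key is the real-valued seed of the chaotic map, which is why key space and floating-point precision, the two problems the paper targets, matter for this class of ciphers.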
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mok, G.C.; Thomas, G.R.; Gerhard, M.A.
SCANS (Shipping Cask ANalysis System) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for evaluating safety analysis reports on spent fuel shipping casks. SCANS is an easy-to-use system that calculates the global response to impact loads, pressure loads and thermal conditions, providing reviewers with an independent check on analyses submitted by licensees. SCANS is based on microcomputers compatible with the IBM-PC family of computers. The system is composed of a series of menus, input programs, cask analysis programs, and output display programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests. Analysis options are based on regulatory cases described in the Code of Federal Regulations 10 CFR 71 and Regulatory Guides published by the US Nuclear Regulatory Commission in 1977 and 1978.
A program to form a multidisciplinary data base and analysis for dynamic systems
NASA Technical Reports Server (NTRS)
Taylor, L. W.; Suit, W. T.; Mayo, M. H.
1984-01-01
Diverse sets of experimental data and analysis programs have been assembled for the purpose of facilitating research in systems identification, parameter estimation and state estimation techniques. The data base analysis programs are organized to make it easy to compare alternative approaches. Additional data and alternative forms of analysis will be included as they become available.
BiNChE: a web tool and library for chemical enrichment analysis based on the ChEBI ontology.
Moreno, Pablo; Beisken, Stephan; Harsha, Bhavana; Muthukrishnan, Venkatesh; Tudose, Ilinca; Dekker, Adriano; Dornfeldt, Stefanie; Taruttis, Franziska; Grosse, Ivo; Hastings, Janna; Neumann, Steffen; Steinbeck, Christoph
2015-02-21
Ontology-based enrichment analysis aids in the interpretation and understanding of large-scale biological data. Ontologies are hierarchies of biologically relevant groupings. Using ontology annotations, which link ontology classes to biological entities, enrichment analysis methods assess whether there is a significant over- or under-representation of entities for ontology classes. While many tools exist that run enrichment analysis for protein sets annotated with the Gene Ontology, only a few can be used for small-molecule enrichment analysis. We describe BiNChE, an enrichment analysis tool for small molecules based on the ChEBI Ontology. BiNChE displays an interactive graph that can be exported as a high-resolution image or in network formats. The tool provides plain, weighted and fragment analysis based on either the ChEBI Role Ontology or the ChEBI Structural Ontology. BiNChE aids in the exploration of large sets of small molecules produced within metabolomics or other systems biology research contexts. The open-source tool provides easy and highly interactive web access to enrichment analysis with the ChEBI ontology and is additionally available as a standalone library.
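The core statistic behind this kind of ontology enrichment analysis is the hypergeometric tail probability; a minimal version with invented counts follows (BiNChE's exact statistics, including its weighted and fragment variants, may differ):

```python
from math import comb

def enrichment_p(N, K, n, k):
    """Over-representation p-value P(X >= k): the chance of seeing at least
    k annotated entities when drawing n of N, where K carry the annotation."""
    return sum(
        comb(K, i) * comb(N - K, n - i) for i in range(k, min(K, n) + 1)
    ) / comb(N, n)

# Invented counts: 1000 known small molecules, 50 annotated with some ChEBI
# role, and an input set of 20 molecules of which 5 carry the annotation.
p = enrichment_p(N=1000, K=50, n=20, k=5)
```

Under these counts the expected overlap is only one molecule, so observing five yields a small p-value and the role would be flagged as enriched (before any multiple-testing correction across ontology classes).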
NASA Technical Reports Server (NTRS)
Rasmussen, Robert; Bennett, Matthew
2006-01-01
The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are kept consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.
Effects of computer-based training on procedural modifications to standard functional analyses.
Schnell, Lauren K; Sidener, Tina M; DeBar, Ruth M; Vladescu, Jason C; Kahng, SungWoo
2018-01-01
Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to training materials using interactive software during a 1-day session. Following the training, mean scores on the posttest, novel cases probe, and maintenance probe increased for all participants. These results replicate previous findings during a 1-day session and include a measure of participant acceptability of the training. Recommendations for future research on computer-based training and functional analysis are discussed. © 2017 Society for the Experimental Analysis of Behavior.
EXPLORING FUNCTIONAL CONNECTIVITY IN FMRI VIA CLUSTERING.
Venkataraman, Archana; Van Dijk, Koene R A; Buckner, Randy L; Golland, Polina
2009-04-01
In this paper we investigate the use of data-driven clustering methods for functional connectivity analysis in fMRI. In particular, we consider the K-Means and Spectral Clustering algorithms as alternatives to the commonly used Seed-Based Analysis. To enable clustering of the entire brain volume, we use the Nyström Method to approximate the necessary spectral decompositions. We apply K-Means, Spectral Clustering and Seed-Based Analysis to resting-state fMRI data collected from 45 healthy young adults. Without placing any a priori constraints, both clustering methods yield partitions that are associated with brain systems previously identified via Seed-Based Analysis. Our empirical results suggest that clustering provides a valuable tool for functional connectivity analysis.
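The Nyström trick mentioned in the abstract can be sketched compactly: approximate the top eigenvectors of the affinity matrix from a random subset of landmark columns, then cluster the resulting spectral embedding. A minimal illustration on synthetic two-system data (sizes, seed, and parameters are invented for the sketch, not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for voxel time series: two "systems" with
# distinct temporal signatures (sizes are illustrative)
n, t = 200, 50
sig1, sig2 = rng.standard_normal(t), rng.standard_normal(t)
X = np.vstack([sig1 + 0.3 * rng.standard_normal((n // 2, t)),
               sig2 + 0.3 * rng.standard_normal((n // 2, t))])

# Affinity matrix from pairwise correlations
W = np.exp(np.corrcoef(X))

# Nystrom method: approximate the top eigenvectors of W from a
# random subset of m "landmark" columns instead of the full matrix
m = 40
idx = rng.choice(n, m, replace=False)
A = W[np.ix_(idx, idx)]                  # landmark-landmark block
B = W[:, idx]                            # all-landmark block
evals, evecs = np.linalg.eigh(A)         # eigh returns ascending order
U2 = B @ evecs[:, -2:] / evals[-2:]      # extend the top-2 eigenvectors
U2 /= np.linalg.norm(U2, axis=1, keepdims=True)

# 2-means (Lloyd's algorithm) on the spectral embedding
centers = U2[[0, n - 1]]                 # one seed point from each half
for _ in range(20):
    labels = ((U2[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
    centers = np.array([U2[labels == k].mean(0) for k in range(2)])
```

The point of the approximation is that only an n x m slice of the affinity matrix is ever decomposed, which is what makes whole-brain clustering tractable.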
ERIC Educational Resources Information Center
Lonchamp, Jacques
2010-01-01
Computer-based interaction analysis (IA) is an automatic process that aims at understanding a computer-mediated activity. In a CSCL system, computer-based IA can provide information directly to learners for self-assessment and regulation and to tutors for coaching support. This article proposes a customizable computer-based IA approach for a…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-08
... explanation of the bases for the contention and a concise statement of the alleged facts or expert opinion... design bases. Revised analysis may either result in continued conformance with design bases or may change the design bases. If design basis changes result from a revised analysis, the specific design changes...
NASA Technical Reports Server (NTRS)
Bender, Robert L.; Reardon, John E.; Prendergast, Maurice J.; Schmitz, Craig P.; Brown, John R.
1992-01-01
A preliminary analysis of National Launch System ascent plume-induced base heating environments has been completed to support the Induced Environments Panel's objective to assist in maturing the NLS vehicle (1.5-stage and heavy-lift launch vehicle) design. Environments during ascent have been determined from this analysis for a few selected locations on the engine nozzles and base heat shield for both vehicles. The environments reflect early summer 1991 configurations and performance data and conservative methodology. A more complete and thorough analysis is under way to update these environments for the cycle 1 review in January 1992.
Chandra, K; Blackhouse, G; McCurdy, BR; Bornstein, M; Campbell, K; Costa, V; Franek, J; Kaulback, K; Levin, L; Sehatzadeh, S; Sikich, N; Thabane, M; Goeree, R
2012-01-01
Executive Summary In July 2010, the Medical Advisory Secretariat (MAS) began work on a Chronic Obstructive Pulmonary Disease (COPD) evidentiary framework, an evidence-based review of the literature surrounding treatment strategies for patients with COPD. This project emerged from a request by the Health System Strategy Division of the Ministry of Health and Long-Term Care that MAS provide them with an evidentiary platform on the effectiveness and cost-effectiveness of COPD interventions. After an initial review of health technology assessments and systematic reviews of COPD literature, and consultation with experts, MAS identified the following topics for analysis: vaccinations (influenza and pneumococcal), smoking cessation, multidisciplinary care, pulmonary rehabilitation, long-term oxygen therapy, noninvasive positive pressure ventilation for acute and chronic respiratory failure, hospital-at-home for acute exacerbations of COPD, and telehealth (including telemonitoring and telephone support). Evidence-based analyses were prepared for each of these topics. For each technology, an economic analysis was also completed where appropriate. In addition, a review of the qualitative literature on patient, caregiver, and provider perspectives on living and dying with COPD was conducted, as were reviews of the qualitative literature on each of the technologies included in these analyses. The Chronic Obstructive Pulmonary Disease Mega-Analysis series is made up of the following reports, which can be publicly accessed at the MAS website at: http://www.hqontario.ca/en/mas/mas_ohtas_mn.html. 
- Chronic Obstructive Pulmonary Disease (COPD) Evidentiary Framework
- Influenza and Pneumococcal Vaccinations for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
- Smoking Cessation for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
- Community-Based Multidisciplinary Care for Patients With Stable Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
- Pulmonary Rehabilitation for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
- Long-Term Oxygen Therapy for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
- Noninvasive Positive Pressure Ventilation for Acute Respiratory Failure Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
- Noninvasive Positive Pressure Ventilation for Chronic Respiratory Failure Patients With Stable Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
- Hospital-at-Home Programs for Patients With Acute Exacerbations of Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
- Home Telehealth for Patients With Chronic Obstructive Pulmonary Disease (COPD): An Evidence-Based Analysis
- Cost-Effectiveness of Interventions for Chronic Obstructive Pulmonary Disease Using an Ontario Policy Model
- Experiences of Living and Dying With COPD: A Systematic Review and Synthesis of the Qualitative Empirical Literature
For more information on the qualitative review, please contact Mita Giacomini at: http://fhs.mcmaster.ca/ceb/faculty_member_giacomini.htm. For more information on the economic analysis, please visit the PATH website: http://www.path-hta.ca/About-Us/Contact-Us.aspx. The Toronto Health Economics and Technology Assessment (THETA) collaborative has produced an associated report on patient preference for mechanical ventilation. For more information, please visit the THETA website: http://theta.utoronto.ca/static/contact.
Background Chronic obstructive pulmonary disease (COPD) is characterized by chronic inflammation throughout the airways, parenchyma, and pulmonary vasculature. The inflammation causes repeated cycles of injury and repair in the airway wall: inflammatory cells release a variety of chemicals and lead to cellular damage. The inflammation process also contributes to the loss of elastic recoil pressure in the lung, thereby reducing the driving pressure for expiratory flow through narr
Commerce Lab: Mission analysis. Payload integration study
NASA Technical Reports Server (NTRS)
Marvin, G. D.
1984-01-01
The objectives of the commerce lab mission analysis and payload integration study are discussed. A mission model which accommodates commercial users and provides a basic data base for future mission planning is described. The data bases developed under this study include: (1) user requirements; (2) apparatus capabilities and availabilities; and (3) carrier capabilities. These data bases are synthesized in a trades and analysis phase along with the STS flight opportunities. Optimum missions are identified.
An introduction to kernel-based learning algorithms.
Müller, K R; Mika, S; Rätsch, G; Tsuda, K; Schölkopf, B
2001-01-01
This paper provides an introduction to support vector machines, kernel Fisher discriminant analysis, and kernel principal component analysis, as examples of successful kernel-based learning methods. We first give a short background on Vapnik-Chervonenkis theory and kernel feature spaces and then proceed to kernel-based learning in supervised and unsupervised scenarios, including practical and algorithmic considerations. We illustrate the usefulness of kernel algorithms by discussing applications such as optical character recognition and DNA analysis.
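As a minimal illustration of one of the methods named above, kernel PCA reduces to an eigendecomposition of the centred kernel matrix. A sketch on synthetic two-cluster data (the RBF kernel choice, data, and parameters are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two well-separated synthetic clusters (illustrative data)
n = 100
X = np.vstack([rng.normal(0.0, 0.3, (n // 2, 2)),
               rng.normal(5.0, 0.3, (n // 2, 2))])

# RBF (Gaussian) kernel matrix
gamma = 0.5
sq = ((X[:, None] - X[None]) ** 2).sum(-1)
K = np.exp(-gamma * sq)

# Centre the kernel in feature space: Kc = K - 1K - K1 + 1K1
one = np.full((n, n), 1.0 / n)
Kc = K - one @ K - K @ one + one @ K @ one

# First kernel principal component: projection of each point onto
# the leading eigenvector of the centred kernel matrix
evals, evecs = np.linalg.eigh(Kc)
pc1 = evecs[:, -1] * np.sqrt(evals[-1])

a, b = pc1[:n // 2], pc1[n // 2:]   # projections of the two clusters
```

For this block-structured kernel the leading component acts as a cluster indicator, so the two groups end up on opposite sides of zero along the first kernel PC.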
Basic gait analysis based on continuous wave radar.
Zhang, Jun
2012-09-01
A gait analysis method based on continuous wave (CW) radar is proposed in this paper. Time-frequency analysis is used to analyze the radar micro-Doppler echo from walking humans, and the relationships between the time-frequency spectrogram and human biological gait are discussed. Methods for extracting gait parameters from the spectrogram are studied in depth, and experiments on more than twenty subjects were performed to acquire radar gait data. The gait parameters are calculated and compared. Differences in gait between men and women are presented based on the experimental data and extracted features. Gait analysis based on CW radar will provide a new method for clinical diagnosis and therapy. Copyright © 2012 Elsevier B.V. All rights reserved.
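The time-frequency analysis described can be sketched as a short-time Fourier transform of a simulated micro-Doppler return: the dominant Doppler track then oscillates at the gait cadence, which is one of the parameters such a method extracts. All signal parameters below are hypothetical, chosen only to make the sketch self-contained:

```python
import numpy as np

fs = 1000.0                              # sample rate, Hz
t = np.arange(0, 4.0, 1 / fs)

# Simulated CW-radar return: a 60 Hz torso Doppler line with a
# sinusoidal micro-Doppler modulation at the gait cadence
f_torso, f_gait, f_dev = 60.0, 1.0, 15.0
phase = 2 * np.pi * (f_torso * t
                     - f_dev / (2 * np.pi * f_gait) * np.cos(2 * np.pi * f_gait * t))
sig = np.exp(1j * phase)

# Short-time Fourier transform -> time-frequency spectrogram
win, hop = 128, 32
window = np.hanning(win)
S = np.array([np.abs(np.fft.fft(sig[s:s + win] * window))
              for s in range(0, len(sig) - win, hop)])
freqs = np.fft.fftfreq(win, 1 / fs)

# Dominant Doppler frequency per frame, and its oscillation rate
track = freqs[S.argmax(axis=1)]
spec = np.abs(np.fft.rfft(track - track.mean()))
rate = np.fft.rfftfreq(len(track), hop / fs)
cadence = rate[spec[1:].argmax() + 1]    # skip the DC bin
```

The recovered cadence lands near the simulated 1.0 Hz gait rate, up to the spectral resolution of the frame sequence.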
How to interpret a healthcare economic analysis.
Brown, Melissa M; Brown, Gary C
2005-06-01
The purpose of this review is to present guidelines that help the clinician interpret healthcare economic analyses and to review pertinent recent analyses in the ophthalmic literature. There are four variants of healthcare economic analysis: (1) cost-minimization analysis; (2) cost-benefit analysis; (3) cost-effectiveness analysis and (4) cost-utility analysis. Cost-utility analysis has assumed an increasingly important role in healthcare, with an increasing number of analyses occurring in the peer-reviewed ophthalmic literature. These include cost-utility analyses of cataract surgery in the first and second eyes, amblyopia treatment, and cost-utility analyses encompassing the following vitreoretinal interventions: (1) laser photocoagulation for exudative macular degeneration; (2) laser treatment for diabetic retinopathy; (3) laser photocoagulation for branch retinal vein obstruction; (4) diabetic vitrectomy; (5) treatment of proliferative retinopathy of prematurity and (6) treatment of retinal detachment associated with proliferative vitreoretinopathy. As an increasing number of cost-utility analyses become available, they will provide the information system for the practice of value-based medicine, or medicine based upon the patient-perceived value conferred by interventions. The growing number of cost-utility analyses in the ophthalmic literature suggests that ophthalmic interventions, including vitreoretinal interventions, are cost effective. Cost-utility analysis is a major tool in value-based medicine, the practice of medicine based upon the patient-perceived value conferred by healthcare interventions.
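The arithmetic behind a cost-utility analysis is a ratio of incremental cost to discounted quality-adjusted life-years (QALYs) gained. A toy calculation with invented figures (not drawn from the review):

```python
# All figures are hypothetical, for illustration only
cost_new, cost_standard = 3500.0, 500.0      # per-patient costs
utility_new, utility_standard = 0.80, 0.72   # health-state utilities (0-1)
years, discount = 10, 0.03                   # time horizon and discount rate

# Discounted QALYs gained over the time horizon
qaly_gain = sum((utility_new - utility_standard) / (1 + discount) ** y
                for y in range(years))

# Incremental cost-effectiveness ratio: extra dollars per QALY gained
icer = (cost_new - cost_standard) / qaly_gain
```

Here the 0.08 annual utility gain accumulates to about 0.70 discounted QALYs, giving an ICER of roughly $4,268 per QALY; interventions are then compared against a willingness-to-pay threshold.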
Reachability analysis of real-time systems using time Petri nets.
Wang, J; Deng, Y; Xu, G
2000-01-01
Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability-based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
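The reachability tree the abstract builds on can be illustrated, for the plain untimed case, by breadth-first enumeration of markings; the paper's CS-class technique additionally stamps each state class with clock information, which this sketch omits. The net below is a hypothetical three-place cycle, not the paper's C2 model:

```python
from collections import deque

# Petri net: each transition consumes and produces tokens at places
transitions = {
    "produce": ({"p0": 1}, {"p1": 1}),   # (consumes, produces)
    "deliver": ({"p1": 1}, {"p2": 1}),
    "reset":   ({"p2": 1}, {"p0": 1}),
}

def enabled(marking, consumes):
    return all(marking.get(p, 0) >= n for p, n in consumes.items())

def fire(marking, consumes, produces):
    m = dict(marking)
    for p, n in consumes.items():
        m[p] -= n
    for p, n in produces.items():
        m[p] = m.get(p, 0) + n
    return m

def reachable(initial):
    """Breadth-first enumeration of the reachable marking set."""
    seen = {tuple(sorted(initial.items()))}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for name, (c, p) in transitions.items():
            if enabled(m, c):
                nxt = fire(m, c, p)
                key = tuple(sorted(nxt.items()))
                if key not in seen:
                    seen.add(key)
                    queue.append(nxt)
    return seen

marks = reachable({"p0": 1, "p1": 0, "p2": 0})
```

For this single-token cycle the token can only sit in one of the three places, so exactly three markings are reachable; a TPN analysis would additionally attach firing-interval (clock) constraints to each node of this tree.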
NASA Astrophysics Data System (ADS)
Kiyono, Ken; Tsujimoto, Yutaka
2016-07-01
We develop a general framework to study the time and frequency domain characteristics of detrending-operation-based scaling analysis methods, such as detrended fluctuation analysis (DFA) and detrending moving average (DMA) analysis. In this framework, using either the time or frequency domain approach, the frequency responses of detrending operations are calculated analytically. Although the frequency domain approach based on conventional linear analysis techniques is only applicable to linear detrending operations, the time domain approach presented here is applicable to both linear and nonlinear detrending operations. Furthermore, using the relationship between the time and frequency domain representations of the frequency responses, the frequency domain characteristics of nonlinear detrending operations can be obtained. Based on the calculated frequency responses, it is possible to establish a direct connection between the root-mean-square deviation of the detrending-operation-based scaling analysis and the power spectrum for linear stochastic processes. Here, by applying our methods to DFA and DMA, including higher-order cases, exact frequency responses are calculated. In addition, we analytically investigate the cutoff frequencies of DFA and DMA detrending operations and show that these frequencies are not optimally adjusted to coincide with the corresponding time scale.
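The detrending operation at the heart of DFA, one of the methods analyzed above, can be sketched directly: integrate the series, fit a polynomial trend in non-overlapping windows at each scale, and measure the RMS residual. For white noise the scaling exponent should come out near 0.5; that is a standard sanity check, not a result from the paper:

```python
import numpy as np

def dfa(x, scales, order=1):
    """Detrended fluctuation analysis: RMS deviation around a
    polynomial trend fitted in windows of each scale."""
    y = np.cumsum(x - x.mean())          # integrated profile
    F = []
    for s in scales:
        n = len(y) // s
        segs = y[:n * s].reshape(n, s)
        t = np.arange(s)
        res = []
        for seg in segs:
            coef = np.polyfit(t, seg, order)
            res.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(res)))
    return np.array(F)

rng = np.random.default_rng(2)
x = rng.standard_normal(20000)           # white noise: expect alpha ~ 0.5
scales = np.array([16, 32, 64, 128, 256])
F = dfa(x, scales)

# scaling exponent from the log-log slope of F(s)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
```

The paper's frequency-response analysis characterizes exactly this windowed polynomial fit (and its moving-average counterpart in DMA) as a filter acting on the integrated signal.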
Event-based analysis of free-living behaviour.
Granat, Malcolm H
2012-11-01
The quantification of free-living physical activities is important in understanding how physical activity and sedentary behaviour impact on health and also how interventions might modify free-living behaviour to enhance health. Quantification, and the terminology used, has in many ways been determined by the choice of measurement technique. The inter-related issues around measurement devices and terminology are explored. This paper proposes a terminology and a systematic approach for the analysis of free-living activity information using event-based activity data. The event-based approach uses a flexible hierarchical classification of events and, depending on the research question, analysis can then be undertaken on a selection of these events. The quantification of free-living behaviour is therefore the result of the analysis of the patterns of these chosen events. The application of this approach is illustrated with results from a range of published studies by our group, showing how event-based analysis provides a flexible yet robust method of addressing the research question(s) and provides a deeper insight into free-living behaviour. It is proposed that through event-based analysis we can more clearly understand how behaviour is related to health and how we can produce more relevant outcome measures.
RDNAnalyzer: A tool for DNA secondary structure prediction and sequence analysis.
Afzal, Muhammad; Shahid, Ahmad Ali; Shehzadi, Abida; Nadeem, Shahid; Husnain, Tayyab
2012-01-01
RDNAnalyzer is an innovative computer-based tool designed for DNA secondary structure prediction and sequence analysis. It can randomly generate a DNA sequence, or users can upload sequences of their own interest in RAW format. It uses and extends the Nussinov dynamic programming algorithm and has various applications for sequence analysis. It predicts DNA secondary structure and base pairings. It also provides tools for sequence analyses routinely performed by biological scientists, such as DNA replication, reverse complement generation, transcription, translation, sequence-specific information (total number of nucleotide bases and ATGC base contents along with their respective percentages), and a sequence cleaner. RDNAnalyzer is a unique tool developed in Microsoft Visual Studio 2008 using Microsoft Visual C# and Windows Presentation Foundation and provides a user-friendly environment for sequence analysis. It is freely available at http://www.cemb.edu.pk/sw.html. Abbreviations: RDNAnalyzer - Random DNA Analyser; GUI - Graphical user interface; XAML - Extensible Application Markup Language.
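The Nussinov dynamic programme the tool extends maximizes the number of non-crossing base pairs. A minimal version for DNA pairing (A-T, G-C); the minimum-loop length of 3 is a common convention assumed here, not taken from the tool's documentation:

```python
def nussinov(seq, min_loop=3):
    """Maximum number of non-crossing base pairs (Nussinov DP).

    dp[i][j] holds the best pair count for the subsequence seq[i..j];
    a pair (i, j) is only allowed if j - i > min_loop.
    """
    pairs = {("A", "T"), ("T", "A"), ("G", "C"), ("C", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):
        for i in range(n - span):
            j = i + span
            best = max(dp[i + 1][j], dp[i][j - 1])      # i or j unpaired
            if (seq[i], seq[j]) in pairs:
                best = max(best, dp[i + 1][j - 1] + 1)  # i pairs with j
            for k in range(i + 1, j):                   # bifurcation
                best = max(best, dp[i][k] + dp[k + 1][j])
            dp[i][j] = best
    return dp[0][n - 1]
```

For example, "GGGAAATCCC" admits a stem of three G-C pairs around an AAAT loop, while in "ATAT" every candidate pair would close a loop shorter than the minimum and none is allowed.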
Markov Chain Monte Carlo Methods for Bayesian Data Analysis in Astronomy
NASA Astrophysics Data System (ADS)
Sharma, Sanjib
2017-08-01
Markov Chain Monte Carlo based Bayesian data analysis has now become the method of choice for analyzing and interpreting data in almost all disciplines of science. In astronomy, over the last decade, we have also seen a steady increase in the number of papers that employ Monte Carlo based Bayesian analysis. New, efficient Monte Carlo based methods are continuously being developed and explored. In this review, we first explain the basics of Bayesian theory and discuss how to set up data analysis problems within this framework. Next, we provide an overview of various Monte Carlo based methods for performing Bayesian data analysis. Finally, we discuss advanced ideas that enable us to tackle complex problems and thus hold great promise for the future. We also distribute downloadable computer software (available at https://github.com/sanjibs/bmcmc/ ) that implements some of the algorithms and examples discussed here.
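The simplest of the Monte Carlo methods surveyed, random-walk Metropolis, fits in a few lines. A sketch inferring a Gaussian mean (the data, step size, and burn-in are illustrative choices, unrelated to the bmcmc package distributed with the review):

```python
import numpy as np

rng = np.random.default_rng(3)

# Data: draws from N(2, 1); with a flat prior and known unit variance
# the posterior for the mean is N(mean(data), 1/len(data))
data = rng.normal(2.0, 1.0, 500)

def log_post(mu):
    return -0.5 * np.sum((data - mu) ** 2)   # up to an additive constant

# Random-walk Metropolis: propose a jitter, accept with prob.
# min(1, posterior ratio), otherwise stay put
n_steps, step = 20000, 0.1
chain = np.empty(n_steps)
mu, lp = 0.0, log_post(0.0)
accepted = 0
for i in range(n_steps):
    prop = mu + step * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        mu, lp, accepted = prop, lp_prop, accepted + 1
    chain[i] = mu

posterior = chain[5000:]                     # discard burn-in
```

After burn-in the chain's mean matches the sample mean and its spread matches the analytic posterior standard deviation of 1/sqrt(500); the methods covered in the review (Hamiltonian Monte Carlo, ensemble samplers, etc.) improve on this scheme's random-walk exploration.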
NASA Astrophysics Data System (ADS)
García-Moreno, Angel-Iván; González-Barbosa, José-Joel; Ramírez-Pedraza, Alfonso; Hurtado-Ramos, Juan B.; Ornelas-Rodriguez, Francisco-Javier
2016-04-01
Computer-based reconstruction models can be used to approximate urban environments. These models are usually based on several mathematical approximations and the use of different sensors, which implies dependency on many variables. The sensitivity analysis presented in this paper is used to weigh the relative importance of each uncertainty contributor to the calibration of a panoramic camera-LiDAR system. Both sensors are used for three-dimensional urban reconstruction. Simulated and experimental tests were conducted. For the simulated tests we analyze and compare the calibration parameters using the Monte Carlo and Latin hypercube sampling techniques. Sensitivity analysis for each variable involved in the calibration was computed by the Sobol method, which is based on analysis of the variance breakdown, and by the Fourier amplitude sensitivity test method, which is based on Fourier analysis. Sensitivity analysis is an essential tool in simulation modeling and for performing error propagation assessments.
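The two sampling schemes compared in the simulated tests differ in how they cover the input space: Latin hypercube sampling places exactly one point in each of n equal-width strata per dimension, while plain Monte Carlo sampling does not enforce any stratification. A minimal sketch (dimensions and counts are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)

def latin_hypercube(n, d):
    """n samples in [0,1)^d with exactly one point in each of the
    n equal-width strata along every dimension."""
    strata = np.column_stack([rng.permutation(n) for _ in range(d)])
    return (strata + rng.uniform(size=(n, d))) / n

n, d = 100, 2
lhs = latin_hypercube(n, d)
mc = rng.uniform(size=(n, d))        # plain Monte Carlo, for contrast

# Each 1/n-wide stratum holds exactly one LHS point per dimension;
# Monte Carlo strata may hold zero or several points
counts = np.stack([np.bincount((lhs[:, j] * n).astype(int), minlength=n)
                   for j in range(d)])
```

This stratification is why LHS typically needs fewer model evaluations than plain Monte Carlo to estimate calibration-parameter statistics to a given accuracy.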
Individualized Positive Behavior Support in School Settings: A Meta-Analysis
ERIC Educational Resources Information Center
Goh, Ailsa E.; Bambara, Linda M.
2012-01-01
This meta-analysis examined school-based intervention research based on functional behavioral assessment (FBA) to determine the effectiveness of key individualized positive behavior support (IPBS) practices in school settings. In all, 83 studies representing 145 participants were included in the meta-analysis. Intervention, maintenance, and…
Language-Based Curriculum Analysis: A Collaborative Assessment and Intervention Process.
ERIC Educational Resources Information Center
Prelock, Patricia A.
1997-01-01
Presents a systematic process for completing a language-based curriculum analysis to address curriculum expectations that may challenge students with communication impairments. Analysis of vocabulary and the demands for comprehension, oral, and written expression within specific content areas provides a framework for collaboration between teachers…
Verloo, Henk; Desmedt, Mario; Morin, Diane
2017-09-01
To evaluate two psychometric properties of the French versions of the Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales, namely their internal consistency and construct validity. The Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales developed by Melnyk et al. are recognised as valid, reliable instruments in English. However, no psychometric validation for their French versions existed. Secondary analysis of a cross-sectional survey. Source data came from a cross-sectional descriptive study sample of 382 nurses and other allied healthcare providers. Cronbach's alpha was used to evaluate internal consistency, and principal axis factor analysis and varimax rotation were computed to determine construct validity. The French Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales showed excellent reliability, with Cronbach's alphas close to the scores established by Melnyk et al.'s original versions. Principal axis factor analysis showed medium-to-high factor loading scores without obtaining collinearity. Principal axis factor analysis with varimax rotation of the 16-item Evidence-Based Practice Beliefs scale resulted in a four-factor loading structure. Principal axis factor analysis with varimax rotation of the 17-item Evidence-Based Practice Implementation scale revealed a two-factor loading structure. Further research should attempt to understand why the French Evidence-Based Practice Implementation scale showed a two-factor loading structure but Melnyk et al.'s original has only one. The French versions of the Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales can both be considered valid and reliable instruments for measuring Evidence-Based Practice beliefs and implementation. 
The results suggest that the French Evidence-Based Practice Beliefs and Evidence-Based Practice Implementation scales are valid and reliable and can therefore be used to evaluate the effectiveness of organisational strategies aimed at increasing professionals' confidence in Evidence-Based Practice, supporting its use and implementation. © 2017 John Wiley & Sons Ltd.
Analysis of Load Stress for Asphalt Pavement of Lean Concrete Base
NASA Astrophysics Data System (ADS)
Lijun, Suo; Xinwu, Wang
Studies have shown that early distress in asphalt pavement depends largely on the working performance of the base. In the field of asphalt pavement, it is widely accepted that a lean concrete base has better working performance than a general semi-rigid base, such as higher strength and better erosion resistance. Early distress in asphalt pavement caused by heavy traffic loading can therefore be mitigated effectively when lean concrete is used as the base. Traffic loading is an important parameter in new pavement design, yet few studies have investigated the load stress of asphalt pavement with a lean concrete base extensively and intensively, which makes such a study necessary. In this paper, a three-dimensional finite element model of the asphalt pavement is first created for mechanical analysis. Two objectives are then investigated: the load stress in the lean concrete base and the load stress in the asphalt surface. The results show that the load stress of the lean concrete base decreases with increasing base thickness and surface thickness, and increases with the ratio of base modulus to foundation modulus. For the asphalt surface, the maximum load-induced shear stress occurs in the surface above the transverse contraction joints of the lean concrete base. This maximum shear stress decreases with increasing surface modulus, surface thickness, and base thickness, and increases with the ratio of base modulus to foundation modulus.
Centre of pressure patterns in the golf swing: individual-based analysis.
Ball, Kevin; Best, Russell
2012-06-01
Weight transfer has been identified as important in group-based analyses. The aim of this study was to extend this work by examining the importance of weight transfer in the golf swing on an individual basis. Five professional and amateur golfers performed 50 swings with the driver, hitting a ball into a net. The golfer's centre of pressure position and velocity, parallel with the line of shot, were measured by two force plates at eight swing events that were identified from high-speed video. The relationships between these parameters and club head velocity at ball contact were examined using regression statistics. The results did support the use of group-based analysis, with all golfers returning significant relationships. However, results were also individual-specific, with golfers returning different combinations of significant factors. Furthermore, factors not identified in group-based analysis were significant on an individual basis. The most consistent relationship was a larger weight transfer range associated with a larger club head velocity (p < 0.05). All golfers also returned at least one significant relationship with rate of weight transfer at swing events (p < 0.01). Individual-based analysis should form part of performance-based biomechanical analysis of sporting skills.
Grosse Frie, Kirstin; Janssen, Christian
2009-01-01
Based on the theoretical and empirical approach of Pierre Bourdieu, a multivariate non-linear method is introduced as an alternative way to analyse the complex relationships between social determinants and health. The analysis is based on face-to-face interviews with 695 randomly selected respondents aged 30 to 59. Variables regarding socio-economic status, life circumstances, lifestyles, health-related behaviour and health were chosen for the analysis. In order to determine whether the respondents can be differentiated and described based on these variables, a non-linear canonical correlation analysis (OVERALS) was performed. The results can be described on three dimensions; Eigenvalues add up to the fit of 1.444, which can be interpreted as approximately 50 % of explained variance. The three-dimensional space illustrates correspondences between variables and provides a framework for interpretation based on latent dimensions, which can be described by age, education, income and gender. Using non-linear canonical correlation analysis, health characteristics can be analysed in conjunction with socio-economic conditions and lifestyles. Based on Bourdieu's theoretical approach, the complex correlations between these variables can be more substantially interpreted and presented.
A generalized least-squares framework for rare-variant analysis in family data.
Li, Dalin; Rotter, Jerome I; Guo, Xiuqing
2014-01-01
Rare variants may, in part, explain some of the heritability missing in current genome-wide association studies. Many gene-based rare-variant analysis approaches proposed in recent years are aimed at population-based samples, although analysis strategies for family-based samples are clearly warranted, since the family-based design has the potential to enhance our ability to enrich for rare causal variants. We have recently developed the generalized least-squares sequence kernel association test, or GLS-SKAT, approach for rare-variant analyses in family samples, in which the kinship matrix computed from high-dimension genetic data is used to decorrelate the family structure. We then apply the SKAT-O approach for gene-/region-based inference in the decorrelated data. In this study, we applied this GLS-SKAT method to the systolic blood pressure data in the simulated family sample distributed by the Genetic Analysis Workshop 18. We compared the GLS-SKAT approach to the rare-variant analysis approach implemented in family-based association test-v1 and demonstrated that the GLS-SKAT approach provides superior power and good control of the type I error rate.
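The decorrelation step described, whitening the family structure before applying a population-based test, amounts to premultiplying by the inverse Cholesky factor of the kinship-derived phenotype covariance. A toy sketch with an invented block-diagonal family structure (not the GAW18 data or the authors' exact model):

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy "kinship": 30 families of 3, correlation 0.5 among relatives
n_fam, fam_size = 30, 3
n = n_fam * fam_size
block = np.full((fam_size, fam_size), 0.5) + 0.5 * np.eye(fam_size)
V = np.kron(np.eye(n_fam), block)    # phenotype covariance (hypothetical)

# Simulate a phenotype with that family correlation
L = np.linalg.cholesky(V)
y = L @ rng.standard_normal(n)

# GLS decorrelation: premultiplying by L^{-1} whitens the structure,
# so y* behaves like an unrelated (population-based) sample and a
# standard kernel association test can be applied to it
Linv = np.linalg.inv(L)
y_star = Linv @ y

# Implied covariance of y*: L^{-1} V L^{-T} = identity
V_star = Linv @ V @ Linv.T
```

The check at the end is the whole point of the transformation: after whitening, the residual covariance is the identity, which is the working assumption of the population-based SKAT-O machinery applied downstream.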
Modified multidimensional scaling approach to analyze financial markets.
Yin, Yi; Shang, Pengjian
2014-06-01
Detrended cross-correlation coefficient (σDCCA) and dynamic time warping (DTW) are introduced as dissimilarity measures, while multidimensional scaling (MDS) is employed to map the dissimilarities between daily price returns of 24 stock markets. We first propose MDS based on σDCCA dissimilarity and MDS based on DTW dissimilarity; MDS based on Euclidean dissimilarity is also employed as a reference for comparison. We apply these methods in order to further visualize the clustering between stock markets. Moreover, we compare MDS against an alternative visualization method, the "Unweighed Average" clustering method, with both applied to the same dissimilarity. The results show that MDS gives a more intuitive mapping for observing stable or emerging clusters of stock markets with similar behavior, and that MDS based on σDCCA dissimilarity provides clearer, more detailed, and more accurate information on the classification of the stock markets than MDS based on Euclidean dissimilarity. MDS based on DTW dissimilarity reveals particularly interesting knowledge about the correlations between stock markets; it reflects richer results on the clustering of stock markets and is much more informative than MDS based on Euclidean dissimilarity. In addition, the graphs obtained by applying MDS based on σDCCA and DTW dissimilarities may also guide the construction of multivariate econometric models.
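Although the paper's dissimilarities are σDCCA and DTW, the MDS step itself is generic: double-centre the squared dissimilarities and take the top eigenvectors of the resulting Gram matrix. A sketch of classical MDS, verified on points whose pairwise distances are known exactly:

```python
import numpy as np

def classical_mds(D, k=2):
    """Embed points in R^k so Euclidean distances approximate D."""
    n = D.shape[0]
    J = np.eye(n) - np.full((n, n), 1.0 / n)   # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centred Gram matrix
    evals, evecs = np.linalg.eigh(B)
    idx = np.argsort(evals)[::-1][:k]          # top-k eigenpairs
    return evecs[:, idx] * np.sqrt(np.maximum(evals[idx], 0.0))

# Corners of a unit square: their distance matrix is exactly Euclidean,
# so classical MDS must reproduce it (up to rotation/reflection)
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
D = np.linalg.norm(pts[:, None] - pts[None], axis=-1)
emb = classical_mds(D, 2)
D_emb = np.linalg.norm(emb[:, None] - emb[None], axis=-1)
```

With a non-Euclidean input such as a DTW dissimilarity matrix, some eigenvalues of B can go negative; clipping them at zero, as above, is one common convention, and the 2-D map is then only an approximation of the dissimilarities.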
Methodology for cost analysis of film-based and filmless portable chest systems
NASA Astrophysics Data System (ADS)
Melson, David L.; Gauvain, Karen M.; Beardslee, Brian M.; Kraitsik, Michael J.; Burton, Larry; Blaine, G. James; Brink, Gary S.
1996-05-01
Many studies analyzing the costs of film-based and filmless radiology have focused on multi-modality, hospital-wide solutions. Yet due to the enormous cost of converting an entire large radiology department or hospital to a filmless environment all at once, institutions often choose to eliminate film one area at a time. Narrowing the focus of cost analysis may be useful in making such decisions. This presentation will outline a methodology for analyzing the cost per exam of film-based and filmless solutions for providing portable chest exams to Intensive Care Units (ICUs). The methodology, unlike most in the literature, is based on parallel data collection from existing filmless and film-based ICUs, and is currently being utilized at our institution. Direct costs, taken from the perspective of the hospital, for portable computed radiography chest exams in one filmless and two film-based ICUs are identified. The major cost components are labor, equipment, materials, and storage. Methods for gathering and analyzing each of the cost components are discussed, including FTE-based and time-based labor analysis, incorporation of equipment depreciation, lease, and maintenance costs, and estimation of materials costs. Extrapolation of data from three ICUs to model hypothetical, hospital-wide film-based and filmless ICU imaging systems is described. Performance of sensitivity analysis on the filmless model to assess the impact of anticipated reductions in specific labor, equipment, and archiving costs is detailed. A number of indirect costs, which are not explicitly included in the analysis, are identified and discussed.
Dunn, Heather; Quinn, Laurie; Corbridge, Susan J; Eldeirawi, Kamal; Kapella, Mary; Collins, Eileen G
2017-05-01
The use of cluster analysis in the nursing literature is limited to the creation of classifications of homogeneous groups and the discovery of new relationships. As such, it is important to provide clarity regarding its use and potential. The purpose of this article is to provide an introduction to distance-based, partitioning-based, and model-based cluster analysis methods commonly utilized in the nursing literature, provide a brief historical overview on the use of cluster analysis in nursing literature, and provide suggestions for future research. An electronic search included three bibliographic databases, PubMed, CINAHL and Web of Science. Key terms were cluster analysis and nursing. The use of cluster analysis in the nursing literature is increasing and expanding. The increased use of cluster analysis in the nursing literature is positioning this statistical method to result in insights that have the potential to change clinical practice.
Li, Yang; Huang, Xin-En
2015-01-01
This analysis was conducted to evaluate the efficacy and safety of crizotinib-based regimens in treating Chinese patients with EML4-ALK-positive non-small-cell lung cancer. Clinical studies evaluating the response and safety of crizotinib-based regimens in Chinese patients with EML4-ALK-positive non-small-cell lung cancer were identified using a predefined search strategy, and the pooled response rate (RR) of treatment was calculated. Three clinical studies, including 128 Chinese patients with EML4-ALK-positive non-small-cell lung cancer treated with a crizotinib-based regimen, were considered eligible for inclusion. Pooled analysis suggested that, across all patients, the pooled RR was 59.3% (76/128) for crizotinib-based regimens. ALT/AST elevation, mild visual disturbances, nausea, and vomiting were the main side effects. No treatment-related death occurred with these crizotinib-based treatments. This pooled analysis suggests that crizotinib-based regimens are associated with a good response rate and acceptable toxicity in treating Chinese patients with EML4-ALK-positive non-small-cell lung cancer.
You, Siming; Wang, Wei; Dai, Yanjun; Tong, Yen Wah; Wang, Chi-Hwa
2016-10-01
The compositions of food wastes and their co-gasification producer gas were compared with existing data for sewage sludge. Results showed that food wastes are more favorable than sewage sludge for co-gasification in terms of residue generation and energy output. Two decentralized gasification-based schemes were proposed to dispose of the sewage sludge and food wastes in Singapore. A Monte Carlo simulation-based cost-benefit analysis was conducted to compare the proposed schemes with the existing incineration-based scheme. The gasification-based schemes were found to be financially superior to the incineration-based scheme in terms of net present value (NPV), benefit-cost ratio (BCR), and internal rate of return (IRR). Sensitivity analysis was conducted to suggest effective measures for improving the economics of the schemes. Copyright © 2016 Elsevier Ltd. All rights reserved.
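The Monte Carlo cost-benefit logic can be sketched in a few lines: sample an uncertain annual net benefit, discount it, and summarize NPV across trials. The capital cost, benefit distribution, horizon, and discount rate below are invented placeholders, not figures from the study.

```python
import random

random.seed(42)

def npv(rate, cashflows):
    """Net present value, where cashflows[t] occurs in year t."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def simulate(n_trials=10000, capex=-5e6, years=15, rate=0.05):
    """Monte Carlo NPV for a hypothetical decentralized gasification scheme."""
    npvs = []
    for _ in range(n_trials):
        annual = random.gauss(6e5, 1.5e5)  # assumed uncertain net annual benefit
        npvs.append(npv(rate, [capex] + [annual] * years))
    return npvs

npvs = simulate()
mean_npv = sum(npvs) / len(npvs)
p_positive = sum(v > 0 for v in npvs) / len(npvs)  # P(scheme is profitable)
```

BCR and IRR summaries can be accumulated per trial in the same loop; the probability of a positive NPV is the kind of output a sensitivity analysis would then probe by perturbing the assumed inputs.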
Applied Cognitive Task Analysis (ACTA) Methodology
1997-11-01
experience-based cognitive skills. The primary goal of this project was to develop streamlined methods of Cognitive Task Analysis that would fill this need... We have made important progress in this direction. We have developed streamlined methods of Cognitive Task Analysis. Our evaluation study indicates... developed a CD-based stand-alone instructional package, which will make the Applied Cognitive Task Analysis (ACTA) tools widely accessible. A survey of the
NASA Technical Reports Server (NTRS)
1972-01-01
An economic analysis of space tug operations is presented. The subjects discussed are: (1) data base for orbit injection stages, (2) data base for reusable space tug, (3) performance equations, (4) data integration and interpretation, (5) tug performance and mission model accommodation, (6) total program cost, (7) payload analysis, (8) computer software, and (9) comparison of tug concepts.
ERIC Educational Resources Information Center
Fleming, Courtney V.
2011-01-01
Minimal research has investigated training packages used to teach professional staff how to implement functional analysis procedures and to interpret data gathered during functional analysis. The current investigation used video-based training with role-play and feedback to teach six professionals in a clinical setting to implement procedures of a…
Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses
ERIC Educational Resources Information Center
Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo
2018-01-01
Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…
10 CFR 503.34 - Inability to comply with applicable environmental requirements.
Code of Federal Regulations, 2014 CFR
2014-01-01
... environmental compliance of the facility, including an analysis of its ability to meet applicable standards and... will be based solely on an analysis of the petitioner's capacity to physically achieve applicable... exemption. All such analysis must be based on accepted analytical techniques, such as air quality modeling...
10 CFR 503.34 - Inability to comply with applicable environmental requirements.
Code of Federal Regulations, 2013 CFR
2013-01-01
... environmental compliance of the facility, including an analysis of its ability to meet applicable standards and... will be based solely on an analysis of the petitioner's capacity to physically achieve applicable... exemption. All such analysis must be based on accepted analytical techniques, such as air quality modeling...
USDA-ARS?s Scientific Manuscript database
This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...
Roland, Michelle; Hull, M L; Howell, S M
2011-05-01
In a previous paper, we reported the virtual axis finder, a new method for finding the rotational axes of the knee. The virtual axis finder was validated through simulations that were subject to limitations. Hence, the objective of the present study was to perform a mechanical validation with two measurement modalities: 3D video-based motion analysis and marker-based roentgen stereophotogrammetric analysis (RSA). A two-rotational-axis mechanism was developed, which simulated internal-external (or longitudinal) and flexion-extension (FE) rotations. The actual axes of rotation were known with respect to the motion analysis and RSA markers to within ±0.0006 deg and ±0.036 mm, and ±0.0001 deg and ±0.016 mm, respectively. The orientation and position root mean squared errors for identifying the longitudinal rotation (LR) and FE axes with video-based motion analysis (0.26 deg, 0.28 mm, 0.36 deg, and 0.25 mm, respectively) were smaller than with RSA (1.04 deg, 0.84 mm, 0.82 deg, and 0.32 mm, respectively). The random error, or precision, in orientation and position was significantly better (p=0.01 and p=0.02, respectively) in identifying the LR axis with video-based motion analysis (0.23 deg and 0.24 mm) than with RSA (0.95 deg and 0.76 mm). There was no significant difference in bias errors between measurement modalities. In comparing the mechanical validations to the virtual validations, the virtual validations produced errors comparable to those of the mechanical validation. The only significant difference between the errors of the mechanical and virtual validations was the precision in the position of the LR axis while simulating video-based motion analysis (0.24 mm and 0.78 mm, p=0.019). These results indicate that, with the equipment used in this study, video-based motion analysis is the superior measurement modality for use with the virtual axis finder, although both measurement modalities produce satisfactory results.
The lack of significant differences between validation techniques suggests that the virtual sensitivity analysis previously performed was appropriately modeled. Thus, the virtual axis finder can be applied with a thorough understanding of its errors in a variety of test conditions.
Ren, Jingzheng; Manzardo, Alessandro; Mazzi, Anna; Fedele, Andrea; Scipioni, Antonio
2013-01-01
Biodiesel, as a promising alternative energy resource, has become a hot spot in chemical engineering, but there is ongoing debate about the sustainability of biodiesel. In order to analyze the sustainability of biodiesel production systems and select the most sustainable scenario, various kinds of crop-based biodiesel, including soybean-, rapeseed-, sunflower-, jatropha-, and palm-based biodiesel production options, are studied by emergy analysis; the soybean-based scenario is identified as the most sustainable scenario for further study in China. The DEA method is used to evaluate the sustainability efficiencies of these options; the biodiesel production systems based on soybean, sunflower, and palm are found to be DEA-efficient, whereas the rapeseed-based and jatropha-based scenarios need to be improved, and specific improvements have been identified. PMID:23766723
NASA Technical Reports Server (NTRS)
LaValley, Brian W.; Little, Phillip D.; Walter, Chris J.
2011-01-01
This report documents the capabilities of the EDICT tools for error modeling and error propagation analysis when operating with models defined in the Architecture Analysis & Design Language (AADL). We discuss our experience using the EDICT error analysis capabilities on a model of the Scalable Processor-Independent Design for Enhanced Reliability (SPIDER) architecture that uses the Reliable Optical Bus (ROBUS). Based on these experiences we draw some initial conclusions about model based design techniques for error modeling and analysis of highly reliable computing architectures.
Exploratory Model Analysis of the Space Based Infrared System (SBIRS) Low Global Scheduler Problem
1999-12-01
solution. The non-linear least squares model is defined as Y = f(θ, t), where θ is an M-element parameter vector and Y is an N-element vector of all data t... (Master's thesis, Naval Postgraduate School, Monterey, California, December 1999.)
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling for identifying influential parameters. Among various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers the uncertainty contributions of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework that allows flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model, and parametric. Furthermore, each layer of uncertainty can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network, in which different uncertainty components are represented as uncertain nodes. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility in how uncertainty components are grouped. The variance-based sensitivity analysis is thus extended to investigate the importance of a wider range of uncertainty sources: scenario, model, and other combinations of uncertainty components that can represent key model system processes (e.g., the groundwater recharge process or the flow and reactive transport process). For test and demonstration purposes, the developed methodology was applied to a real-world groundwater reactive transport model with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measures for uncertainty sources formed by different combinations of uncertainty components.
The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.
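For readers unfamiliar with variance-based sensitivity analysis, the first-order Sobol' index of each parameter can be estimated with the classic pick-freeze scheme, sketched below on a toy two-parameter model. The paper's hierarchical Bayesian-network extension is far more general; this only illustrates the base method, and the model and sample size are invented.

```python
import random

random.seed(1)

def model(x1, x2):
    # Toy response in which parameter x1 dominates (think recharge vs. conductivity)
    return 4.0 * x1 + 1.0 * x2

N = 20000
A = [(random.random(), random.random()) for _ in range(N)]
B = [(random.random(), random.random()) for _ in range(N)]

yA = [model(*a) for a in A]
mean = sum(yA) / N
var = sum((y - mean) ** 2 for y in yA) / N

def first_order(i):
    """Pick-freeze estimate of S_i: keep column i from A, take the rest from B."""
    yAB = [model(A[k][0] if i == 0 else B[k][0],
                 A[k][1] if i == 1 else B[k][1]) for k in range(N)]
    cov = sum((yA[k] - mean) * (yAB[k] - mean) for k in range(N)) / N
    return cov / var

S1, S2 = first_order(0), first_order(1)  # analytic values: 16/17 and 1/17
```

For this linear model the analytic first-order indices are 16/17 ≈ 0.94 and 1/17 ≈ 0.06, so the estimator correctly flags x1 as the influential input; grouping inputs (as the paper does for whole uncertainty components) amounts to freezing a block of columns instead of one.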
Mogol, Burçe Ataç; Gökmen, Vural
2014-05-01
Computer vision-based image analysis has been widely used in the food industry to monitor food quality. It allows low-cost and non-contact measurements of colour to be performed. In this paper, two computer vision-based image analysis approaches are discussed to extract mean colour or featured colour information from the digital images of foods. These types of information may be of particular importance, as colour indicates certain chemical changes or physical properties in foods. As exemplified here, the mean CIE a* value or browning ratio determined by means of computer vision-based image analysis algorithms can be correlated with the acrylamide content of potato chips or cookies. Likewise, the porosity index, an important physical property of breadcrumb, can be calculated easily. In this respect, computer vision-based image analysis provides a useful tool for automatic inspection of food products in a manufacturing line, and it can be actively involved in the decision-making process where rapid quality/safety evaluation is needed. © 2013 Society of Chemical Industry.
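A bare-bones version of such an image-analysis step is easy to sketch: compute the mean colour of an image and the fraction of "brown" pixels. The 4x4 image and the intensity threshold are arbitrary stand-ins; a real system would work in a calibrated colour space such as CIE L*a*b* and tune the threshold against chemical measurements.

```python
# Synthetic 4x4 RGB image (0-255): top half pale dough, bottom half browned
image = ([[(200, 170, 120)] * 4] * 2) + ([[(120, 70, 30)] * 4] * 2)

def mean_rgb(img):
    """Mean colour over all pixels, per channel."""
    px = [p for row in img for p in row]
    return tuple(sum(c[i] for c in px) / len(px) for i in range(3))

def browning_ratio(img, thresh=100):
    """Fraction of pixels whose mean intensity falls below a brown-ness threshold."""
    px = [p for row in img for p in row]
    return sum(1 for p in px if sum(p) / 3.0 < thresh) / len(px)
```

On this toy image half the pixels fall below the threshold, so the browning ratio is 0.5; in the application described above such a ratio would be regressed against acrylamide content.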
An interactive web-based application for Comprehensive Analysis of RNAi-screen Data.
Dutta, Bhaskar; Azhir, Alaleh; Merino, Louis-Henri; Guo, Yongjian; Revanur, Swetha; Madhamshettiwar, Piyush B; Germain, Ronald N; Smith, Jennifer A; Simpson, Kaylene J; Martin, Scott E; Buehler, Eugen; Fraser, Iain D C
2016-02-23
RNAi screens are widely used in functional genomics. Although the screen data can be susceptible to a number of experimental biases, many of these can be corrected by computational analysis. For this purpose, here we have developed a web-based platform for integrated analysis and visualization of RNAi screen data named CARD (for Comprehensive Analysis of RNAi Data; available at https://card.niaid.nih.gov). CARD allows the user to seamlessly carry out sequential steps in a rigorous data analysis workflow, including normalization, off-target analysis, integration of gene expression data, optimal thresholds for hit selection and network/pathway analysis. To evaluate the utility of CARD, we describe analysis of three genome-scale siRNA screens and demonstrate: (i) a significant increase both in selection of subsequently validated hits and in rejection of false positives, (ii) an increased overlap of hits from independent screens of the same biology and (iii) insight into microRNA (miRNA) activity based on siRNA seed enrichment.
Health Seeking in Men: A Concept Analysis.
Hooper, Gwendolyn L; Quallich, Susanne A
2016-01-01
This article describes an analysis of the concept of health seeking in men. Men have shorter life expectancies and utilize health services less often than women, leading to poor health outcomes, but a gendered basis for health seeking remains poorly defined. Walker and Avant's framework was used to guide this concept analysis. Literature published in English from 1990-2015 was reviewed. Thematic analysis identified attributes, antecedents, and consequences of the concept. Based on the analysis and the defining attributes identified, a contemporary definition for health seeking in men was constructed, rooted in the concept of health. This analysis provides a definition specifically for health seeking in American men, making it more specific and gender-based than the parent concept of "health." It provides conceptual clarity that can guide development of a conceptual framework uniquely relevant to providers in urology. Further exploration will uncover specific cultural, social, sexual, and geographic perspectives.
High resolution melting (HRM) analysis of DNA--its role and potential in food analysis.
Druml, Barbara; Cichna-Markl, Margit
2014-09-01
DNA-based methods play an increasing role in food safety control and food adulteration detection. Recent papers show that high resolution melting (HRM) analysis is an interesting approach. It involves amplification of the target of interest in the presence of a saturation dye by the polymerase chain reaction (PCR) and subsequent melting of the amplicons by gradually increasing the temperature. Since the melting profile depends on the GC content, length, sequence, and strand complementarity of the product, HRM analysis is highly suitable for the detection of single-base variants and small insertions or deletions. The review gives an introduction to HRM analysis, covers important aspects in the development of an HRM analysis method, and describes how HRM data are analysed and interpreted. We then discuss the potential of HRM analysis-based methods in food analysis, i.e. for the identification of closely related species and cultivars and the identification of pathogenic microorganisms. Copyright © 2014 Elsevier Ltd. All rights reserved.
Conjoint analysis: using a market-based research model for healthcare decision making.
Mele, Nancy L
2008-01-01
Conjoint analysis is a market-based research model that has been used by businesses for more than 35 years to predict consumer preferences in product design and purchasing. Researchers in medicine, healthcare economics, and health policy have discovered the value of this methodology in determining treatment preferences, resource allocation, and willingness to pay. The aim of this article is to describe the conjoint analysis methodology and explore value-added applications in nursing research. The methodology is described using examples from the healthcare and business literature and from personal experience with the method. Nurses are called upon to increase interdisciplinary research, provide an evidence base for nursing practice, create patient-centered treatments, and revise nursing education. Other disciplines have met challenges like these using conjoint analysis and discrete choice modeling.
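At its core, rating-based conjoint analysis recovers part-worth utilities from ratings of attribute profiles, which reduces to a regression on dummy-coded attributes. The sketch below uses two hypothetical attributes of a care option; the profiles and ratings are invented for illustration.

```python
import numpy as np

# Dummy-coded profiles: columns = intercept, high cost, telehealth visit
X = np.array([[1, 0, 0],
              [1, 0, 1],
              [1, 1, 0],
              [1, 1, 1],
              [1, 0, 0],
              [1, 1, 1]], dtype=float)
ratings = np.array([8.0, 9.0, 4.0, 5.0, 8.0, 5.0])  # hypothetical preference ratings

# Part-worth utilities via ordinary least squares
beta, *_ = np.linalg.lstsq(X, ratings, rcond=None)
intercept, pw_high_cost, pw_telehealth = beta
```

Here the fitted part-worths say high cost lowers utility by 4 points while a telehealth visit raises it by 1, so the relative importance of the attributes can be read directly off the coefficients; discrete choice variants replace the OLS step with a logit model.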
A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network
NASA Astrophysics Data System (ADS)
Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.
2018-02-01
An oil pipeline network is one of the most important facilities for energy transportation, but an accident in such a network may result in serious disasters. Analysis models for these accidents have been established mainly based on three methods: event trees, accident simulation, and Bayesian networks. Among these methods, the Bayesian network is suitable for probabilistic analysis, but not all the important influencing factors have been considered, and a deployment rule for the factors has not been established. This paper proposes a comprehensive probabilistic analysis model of oil pipeline networks based on a Bayesian network. Most of the important influencing factors, including key environmental conditions and emergency response, are considered in this model. Moreover, the paper introduces a deployment rule for these factors. The model can be used in probabilistic analysis and sensitivity analysis of oil pipeline network accidents.
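The basic Bayesian-network computation behind such a model is marginalization over parent nodes. The toy network below (corrosion influencing leak, with emergency response moderating whether a leak becomes an accident) uses purely illustrative probabilities, not values or structure from the paper.

```python
# P(corrosion), P(leak | corrosion), P(response effective), P(accident | leak, response)
p_corrosion = 0.1
p_leak_given = {True: 0.3, False: 0.01}
p_resp_ok = 0.8
p_acc_given = {(True, True): 0.05, (True, False): 0.5,
               (False, True): 0.0, (False, False): 0.0}

# Marginalize out corrosion to get P(leak)
p_leak = sum((p_corrosion if c else 1.0 - p_corrosion) * p_leak_given[c]
             for c in (True, False))

# Marginalize out leak and response to get P(accident)
p_accident = sum((p_leak if l else 1.0 - p_leak)
                 * (p_resp_ok if r else 1.0 - p_resp_ok)
                 * p_acc_given[(l, r)]
                 for l in (True, False) for r in (True, False))
```

Sensitivity analysis in this setting amounts to re-running the marginalization with perturbed conditional probabilities, e.g. a weaker emergency response, and observing the change in the accident probability.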
Image encryption based on a delayed fractional-order chaotic logistic system
NASA Astrophysics Data System (ADS)
Wang, Zhen; Huang, Xia; Li, Ning; Song, Xiao-Na
2012-05-01
A new image encryption scheme is proposed based on a delayed fractional-order chaotic logistic system. In the process of generating a key stream, the time-varying delay and fractional derivative are embedded in the proposed scheme to improve the security. Such a scheme is described in detail with security analyses including correlation analysis, information entropy analysis, run statistic analysis, mean-variance gray value analysis, and key sensitivity analysis. Experimental results show that the newly proposed image encryption scheme possesses high security.
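To illustrate the keystream idea in its simplest form, the sketch below uses the ordinary (integer-order, undelayed) logistic map to generate bytes and XORs them with the data; the delayed fractional-order system in the paper plays the same role with richer dynamics and a larger key space. The seed, parameter, and plaintext here are arbitrary.

```python
def keystream(x0, r, n, burn=100):
    """Byte keystream from the logistic map x_{k+1} = r*x_k*(1-x_k)."""
    x = x0
    for _ in range(burn):              # discard the transient
        x = r * x * (1.0 - x)
    out = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)  # quantize each state to one byte
    return out

def xor_bytes(data, ks):
    return bytes(d ^ k for d, k in zip(data, ks))

plain = bytes(range(16))  # stand-in for a row of image pixels
ks = keystream(0.3141592, 3.9999, len(plain))
cipher = xor_bytes(plain, ks)
recovered = xor_bytes(cipher, ks)  # decryption is the same XOR
```

The key-sensitivity analysis mentioned in the abstract corresponds to the observation that a tiny change in x0 or r yields a completely different keystream after the transient.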
Analysis and design of algorithm-based fault-tolerant systems
NASA Technical Reports Server (NTRS)
Nair, V. S. Sukumaran
1990-01-01
An important consideration in the design of high performance multiprocessor systems is to ensure the correctness of the results computed in the presence of transient and intermittent failures. Concurrent error detection and correction have been applied to such systems in order to achieve reliability. Algorithm Based Fault Tolerance (ABFT) was suggested as a cost-effective concurrent error detection scheme. The research was motivated by the complexity involved in the analysis and design of ABFT systems. To that end, a matrix-based model was developed and, based on that, algorithms for both the design and analysis of ABFT systems are formulated. These algorithms are less complex than the existing ones. In order to reduce the complexity further, a hierarchical approach is developed for the analysis of large systems.
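The classic instance of ABFT is the Huang-Abraham checksum scheme for matrix multiplication: augment A with a column-checksum row and B with a row-checksum column, multiply, and verify that the checksums still hold in the product. The sketch below, with deliberately tiny matrices, shows detection of an injected transient fault; it illustrates the general idea rather than the specific matrix-based model developed in this work.

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def column_checksum(A):
    """Append a row holding each column's sum (Huang-Abraham form)."""
    return A + [[sum(col) for col in zip(*A)]]

def row_checksum(B):
    """Append to each row its sum."""
    return [row + [sum(row)] for row in B]

def checksums_hold(C):
    """Each data row must sum to its checksum entry; likewise each column."""
    n, p = len(C) - 1, len(C[0]) - 1
    rows_ok = all(abs(sum(C[i][:p]) - C[i][p]) < 1e-9 for i in range(n))
    cols_ok = all(abs(sum(C[i][j] for i in range(n)) - C[n][j]) < 1e-9
                  for j in range(p))
    return rows_ok and cols_ok

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = matmul(column_checksum(A), row_checksum(B))  # full checksum product

ok_before = checksums_hold(C)
C[0][0] += 1                                    # inject a transient fault
ok_after = checksums_hold(C)                    # checksum test now fails
```

Because a single corrupted element violates exactly one row checksum and one column checksum, the scheme locates the error as well as detecting it, which is what makes concurrent correction possible.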
Game-based digital interventions for depression therapy: a systematic review and meta-analysis.
Li, Jinhui; Theng, Yin-Leng; Foo, Schubert
2014-08-01
The aim of this study was to systematically review the existing literature on game-based digital interventions for depression and to examine their effectiveness through a meta-analysis of randomized controlled trials (RCTs). Database searching was conducted using specific search terms and inclusion criteria. A standard meta-analysis was also conducted of the available RCT studies with a random effects model. The standardized mean difference (Cohen's d) was used to calculate the effect size of each study. Nineteen studies were included in the review, and 10 RCTs (eight studies) were included in the meta-analysis. Four types of game interventions (psycho-education and training, virtual reality exposure therapy, exercising, and entertainment) were identified, with various types of support delivered and populations targeted. The meta-analysis revealed a moderate effect size of the game interventions for depression therapy at posttreatment (d=-0.47 [95% CI -0.69 to -0.24]). A subgroup analysis showed that interventions based on psycho-education and training had a smaller effect than those based on the other forms, and that self-help interventions yielded better outcomes than supported interventions. A higher effect was achieved when a waiting list was used as the control. The review and meta-analysis support the effectiveness of game-based digital interventions for depression. More large-scale, high-quality RCT studies with sufficient long-term data for treatment evaluation are needed.
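The random-effects pooling behind such a meta-analysis can be sketched with the DerSimonian-Laird estimator. The four per-study effect sizes and variances below are invented for illustration and are not the trials reviewed above.

```python
import math

# Hypothetical per-study standardized mean differences (Cohen's d) and variances
effects = [-0.6, -0.4, -0.3, -0.55]
variances = [0.04, 0.05, 0.06, 0.03]
k = len(effects)

# Fixed-effect weights, pooled estimate, and heterogeneity statistic Q
w = [1.0 / v for v in variances]
fixed = sum(wi * di for wi, di in zip(w, effects)) / sum(w)
Q = sum(wi * (di - fixed) ** 2 for wi, di in zip(w, effects))

# DerSimonian-Laird between-study variance (truncated at zero)
C = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - (k - 1)) / C)

# Random-effects pooled effect and 95% confidence interval
w_re = [1.0 / (v + tau2) for v in variances]
pooled = sum(wi * di for wi, di in zip(w_re, effects)) / sum(w_re)
se = math.sqrt(1.0 / sum(w_re))
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
```

A subgroup analysis like the one reported above simply repeats this pooling within each subset of studies (e.g. psycho-education versus other intervention types) and compares the pooled estimates.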
Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Knox, Lenora A.
The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how best to integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture in which the functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case study demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
Kim, In-Ah; den-Hollander, Elyn; Lee, Hye-Seong
2018-03-01
Descriptive analysis with a trained sensory panel has thus far been the most well-defined methodology for characterizing various products. In practical terms, however, the intensive training that descriptive analysis requires has been recognized as a serious drawback. To overcome this limitation, various novel rapid sensory profiling methodologies have been suggested in the literature. Among these, attribute-based methodologies such as check-all-that-apply (CATA) questions have shown results comparable to those of conventional sensory descriptive analysis. Kim, Hopkinson, van Hout, and Lee (2017a, 2017b) proposed a novel attribute-based methodology, the two-step rating-based 'double-faced applicability' test, with a novel output measure of applicability magnitude (d'A) for measuring consumers' product usage experience throughout various product usage stages. In this paper, the potential of the two-step rating-based 'double-faced applicability' test with d'A was investigated as an alternative to conventional sensory descriptive analysis in terms of sensory characterization and product discrimination. Twelve commercial spread products were evaluated using both conventional sensory descriptive analysis with a trained sensory panel and the two-step rating-based 'double-faced applicability' test with an untrained sensory panel. The results demonstrated that the 'double-faced applicability' test provides a direct measure of the applicability magnitude of the sensory attributes of the samples tested, in terms of d'A, for sensory characterization of individual samples and multiple-sample comparisons. This suggests that, when an appropriate list of attributes for the questionnaire is already available, the two-step rating-based 'double-faced applicability' test with d'A can be used as a more efficient alternative to conventional descriptive analysis, without requiring any intensive training process. Copyright © 2017 Elsevier Ltd. All rights reserved.
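The output measure d'A is a Thurstonian sensitivity index. In the simplest two-condition case such an index is computed from two endorsement proportions via the inverse normal CDF, as sketched below with made-up proportions; the published d'A computation for the double-faced applicability test may differ in detail.

```python
from statistics import NormalDist

def d_prime(p_signal, p_noise):
    """Thurstonian sensitivity: z(p_signal) - z(p_noise)."""
    z = NormalDist().inv_cdf
    return z(p_signal) - z(p_noise)

# Hypothetical proportions of consumers marking an attribute "applicable"
# for two samples; the gap on the z-scale is the applicability magnitude
d_A = d_prime(0.80, 0.30)
```

Working on the z-scale makes magnitudes from different attributes and panels comparable, which is what lets an untrained panel's applicability data stand in for trained-panel intensity ratings.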
Li, Dongmei; Le Pape, Marc A; Parikh, Nisha I; Chen, Will X; Dye, Timothy D
2013-01-01
Microarrays are widely used for examining differential gene expression, identifying single nucleotide polymorphisms, and detecting methylation loci. Multiple testing methods in microarray data analysis aim at controlling both Type I and Type II error rates; however, real microarray data do not always fit their distribution assumptions. Smyth's ubiquitous parametric method, for example, inadequately accommodates violations of normality assumptions, resulting in inflated Type I error rates. The Significance Analysis of Microarrays, another widely used microarray data analysis method, is based on a permutation test and is robust to non-normally distributed data; however, its fold-change criteria are problematic and can critically alter the conclusion of a study as a result of compositional changes of the control data set in the analysis. We propose a novel approach combining resampling with empirical Bayes methods: the Resampling-based empirical Bayes Methods. This approach not only reduces false discovery rates for non-normally distributed microarray data, but is also impervious to the fold-change threshold, since no control data set selection is needed. Through simulation studies, sensitivities, specificities, total rejections, and false discovery rates are compared across Smyth's parametric method, the Significance Analysis of Microarrays, and the Resampling-based empirical Bayes Methods. Differences in false discovery rate control between the approaches are illustrated through a preterm delivery methylation study. The results show that the Resampling-based empirical Bayes Methods offer significantly higher specificity and lower false discovery rates compared to Smyth's parametric method when data are not normally distributed.
The Resampling-based empirical Bayes Methods also offer higher statistical power than the Significance Analysis of Microarrays method when the proportion of significantly differentially expressed genes is large, for both normally and non-normally distributed data. Finally, the Resampling-based empirical Bayes Methods are generalizable to next-generation sequencing RNA-seq data analysis.
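The false-discovery-rate comparisons above presuppose an FDR-controlling step. As a reference point only (this is the standard Benjamini-Hochberg step-up rule, not the authors' resampling-based empirical Bayes procedure), the control step can be sketched as:

```python
def benjamini_hochberg(pvals, q=0.05):
    """Indices of hypotheses rejected at FDR level q by the
    Benjamini-Hochberg step-up procedure: reject the k smallest
    p-values, where k is the largest rank i with p_(i) <= (i/m) * q."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, idx in enumerate(order, start=1):
        if pvals[idx] <= rank / m * q:
            k = rank
    return sorted(order[:k])

rejected = benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.20, 0.74])
```

A resampling-based variant would replace the input p-values with ones derived from a permutation null rather than a parametric distribution.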
Prison-Based Educational Programs: A Content Analysis of Government Documents
ERIC Educational Resources Information Center
Piotrowski, Chris; Lathrop, Peter J.
2012-01-01
The literature provides limited, constructive, consensus-based information to correctional officials and administrators on the efficacy of prison-based programs. This study reports an analysis of 8 government review documents that surveyed the research literature from 1980-2008 on the topic of educational rehabilitation programs available to…
A Pilot Meta-Analysis of Computer-Based Scaffolding in STEM Education
ERIC Educational Resources Information Center
Belland, Brian R.; Walker, Andrew E.; Olsen, Megan Whitney; Leary, Heather
2015-01-01
This paper employs meta-analysis to determine the influence of computer-based scaffolding characteristics and study and test score quality on cognitive outcomes in science, technology, engineering, and mathematics education at the secondary, college, graduate, and adult levels. Results indicate that (a) computer-based scaffolding positively…
A Review of Research Ethics in Internet-Based Research
ERIC Educational Resources Information Center
Convery, Ian; Cox, Diane
2012-01-01
Internet-based research methods can include: online surveys, web page content analysis, videoconferencing for online focus groups and/or interviews, analysis of "e-conversations" through social networking sites, email, chat rooms, discussion boards and/or blogs. Over the last ten years, an upsurge in internet-based research (IBR) has led…
On Bi-Grid Local Mode Analysis of Solution Techniques for 3-D Euler and Navier-Stokes Equations
NASA Technical Reports Server (NTRS)
Ibraheem, S. O.; Demuren, A. O.
1994-01-01
A procedure is presented for utilizing a bi-grid stability analysis as a practical tool for predicting multigrid performance in a range of numerical methods for solving Euler and Navier-Stokes equations. Model problems based on the convection, diffusion, and Burgers' equations are used to illustrate the superiority of the bi-grid analysis as a predictive tool for multigrid performance in comparison to the smoothing factor derived from conventional von Neumann analysis. For the Euler equations, bi-grid analysis is presented for three upwind-difference-based factorizations, namely Spatial, Eigenvalue and Combination splits, and two central-difference-based factorizations, namely LU and ADI methods. In the former, both the Steger-Warming and van Leer flux-vector splitting methods are considered. For the Navier-Stokes equations, only the Beam-Warming (ADI) central difference scheme is considered. In each case, estimates of multigrid convergence rates from the bi-grid analysis are compared to smoothing factors obtained from single-grid stability analysis. Effects of grid aspect ratio and flow skewness are examined. Both predictions are compared with practical multigrid convergence rates for 2-D Euler and Navier-Stokes solutions based on the Beam-Warming central scheme.
Impact of model-based risk analysis for liver surgery planning.
Hansen, C; Zidowitz, S; Preim, B; Stavrou, G; Oldhafer, K J; Hahn, H K
2014-05-01
A model-based risk analysis for oncologic liver surgery was described in previous work (Preim et al. in Proceedings of international symposium on computer assisted radiology and surgery (CARS), Elsevier, Amsterdam, pp. 353–358, 2002; Hansen et al. Int J Comput Assist Radiol Surg 4(5):469–474, 2009). In this paper, we present an evaluation of this method. To assess whether and how the risk analysis facilitates the process of liver surgery planning, an explorative user study with 10 liver experts was conducted. The purpose was to compare and analyze their decision-making. The results of the study show that model-based risk analysis enhances awareness of surgical risk in the planning stage. Participants preferred smaller resection volumes and agreed more on the safety margins' width when the risk analysis was available. In addition, time to complete the planning task and participants' confidence did not increase when the risk analysis was used. This work shows that the applied model-based risk analysis may influence important planning decisions in liver surgery. It lays a basis for further clinical evaluations and points out important fields for future research.
Nanochannel Electroporation as a Platform for Living Cell Interrogation in Acute Myeloid Leukemia.
Zhao, Xi; Huang, Xiaomeng; Wang, Xinmei; Wu, Yun; Eisfeld, Ann-Kathrin; Schwind, Sebastian; Gallego-Perez, Daniel; Boukany, Pouyan E; Marcucci, Guido I; Lee, Ly James
2015-12-01
A living cell interrogation platform based on nanochannel electroporation is demonstrated through analysis of RNAs in single cells. This minimally invasive process operates on individual cells and allows both multi-target analysis and stimulus-response analysis via sequential deliveries. The platform holds great potential for comprehensive, lysis-free nucleic acid analysis of rare or hard-to-transfect cells.
ERIC Educational Resources Information Center
Santagata, Rossella; Zannoni, Claudia; Stigler, James W.
2007-01-01
A video-based program on lesson analysis for pre-service mathematics teachers was implemented for two consecutive years as part of a teacher education program at the University of Lazio, Italy. Two questions were addressed: What can preservice teachers learn from the analysis of videotaped lessons? How can preservice teachers' analysis ability,…
Integrated Formulation of Beacon-Based Exception Analysis for Multimissions
NASA Technical Reports Server (NTRS)
Mackey, Ryan; James, Mark; Park, Han; Zak, Mickail
2003-01-01
Further work on beacon-based exception analysis for multimissions (BEAM), a method of real-time, automated diagnosis of complex electromechanical systems, has greatly expanded its capability and suitability for application. This expanded formulation, which fully integrates physical models and symbolic analysis, is described. The new formulation of BEAM expands upon previously developed techniques for the analysis of signal data, mathematical modeling of the system physics, and expert-system reasoning.
Model-Based Linkage Analysis of a Quantitative Trait.
Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H
2017-01-01
Linkage analysis is a family-based method for examining whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but it has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint methods (using multiple markers simultaneously). In model-based linkage analysis, the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because, in addition to performing linkage analysis, it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best-fitting statistical model for the trait.
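The single-marker computation that programs like LODLINK perform rests on a likelihood ratio of recombination fractions. A minimal two-point sketch for phase-known data (the counts below are illustrative, and real software handles untyped individuals, phase uncertainty, and the full genetic model):

```python
from math import log10

def lod_score(recombinants: int, total: int, theta: float) -> float:
    """Two-point LOD: log10 of the likelihood of recombination
    fraction theta versus free recombination (theta = 0.5)."""
    r, n = recombinants, total
    like = lambda t: (t ** r) * ((1 - t) ** (n - r))
    return log10(like(theta) / like(0.5))

# 2 recombinants observed in 20 informative meioses, tested at theta = 0.1;
# a LOD above 3 is the traditional threshold for declaring linkage:
score = lod_score(2, 20, 0.1)
```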
Zhang, Zhongheng; Ni, Hongying; Xu, Xiao
2014-08-01
Propensity score (PS) analysis has been increasingly used in critical care medicine; however, its validation has not been systematically investigated. The present study aimed to compare effect sizes in PS-based observational studies vs. randomized controlled trials (RCTs) (or meta-analyses of RCTs). Critical care observational studies using PS were systematically searched in PubMed from inception to April 2013. Identified PS-based studies were matched to one or more RCTs in terms of population, intervention, comparison, and outcome. The effect sizes of experimental treatments were compared for PS-based studies vs. RCTs (or meta-analyses of RCTs) with the sign test. Furthermore, the ratio of odds ratios (ROR) was calculated from the interaction term of treatment × study type in a logistic regression model. A ROR < 1 indicates greater benefit for the experimental treatment in RCTs compared with PS-based studies. RORs of each comparison were pooled using a meta-analytic approach with a random-effects model. A total of 20 PS-based studies were identified and matched to RCTs. Twelve of the 20 comparisons showed a greater beneficial effect for the experimental treatment in RCTs than in PS-based studies (sign test P = 0.503). The difference was statistically significant in four comparisons. The ROR could be calculated for 13 comparisons, of which four showed a significantly greater beneficial effect for the experimental treatment in RCTs. The pooled ROR was 0.71 (95% CI: 0.63, 0.79; P = 0.002), suggesting that RCTs (or meta-analyses of RCTs) were more likely to report a beneficial effect for the experimental treatment than PS-based studies. The result remained unchanged in sensitivity analysis and meta-regression. In the critical care literature, PS-based observational studies are likely to report a smaller beneficial effect of experimental treatment compared with RCTs (or meta-analyses of RCTs). Copyright © 2014 Elsevier Inc. All rights reserved.
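The ratio-of-odds-ratios metric above can be sketched on the log scale, where variances of independent log-odds-ratios add. This is a simplified fixed-effect pooling sketch; the study itself derived RORs from a regression interaction term and pooled them with a random-effects model, which additionally estimates a between-study variance:

```python
from math import exp, log

def log_ror(or_rct: float, or_ps: float) -> float:
    """Log ratio of odds ratios, ROR = OR_RCT / OR_PS.
    ROR < 1 means the RCT reported greater benefit for the
    experimental treatment than the propensity-score study."""
    return log(or_rct) - log(or_ps)

def pooled_ror(comparisons):
    """Inverse-variance (fixed-effect) pooling of log-RORs.
    Each comparison is (or_rct, or_ps, var_log_or_rct, var_log_or_ps);
    the two variances add because the estimates are independent."""
    terms = [(log_ror(a, b), va + vb) for a, b, va, vb in comparisons]
    w_sum = sum(1 / v for _, v in terms)
    return exp(sum(lr / v for lr, v in terms) / w_sum)
```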
A method for data base management and analysis for wind tunnel data
NASA Technical Reports Server (NTRS)
Biser, Aileen O.
1987-01-01
To respond to the need for improved data base management and analysis capabilities for wind-tunnel data at the Langley 16-Foot Transonic Tunnel, research was conducted into current methods of managing wind-tunnel data and a method was developed as a solution to this need. This paper describes the development of the data base management and analysis method for wind-tunnel data. The design and implementation of the software system are discussed and examples of its use are shown.
NASA Technical Reports Server (NTRS)
1972-01-01
The tug design and performance data base for the economic analysis of space tug operation are presented. A compendium of the detailed design and performance information from the data base is developed. The design data are parametric across a range of reusable space tug sizes. The performance curves are generated for selected point designs of expendable orbit injection stages and reusable tugs. Data are presented in the form of graphs for various modes of operation.
Innovating Method of Existing Mechanical Product Based on TRIZ Theory
NASA Astrophysics Data System (ADS)
Zhao, Cunyou; Shi, Dongyan; Wu, Han
The main ways of developing products are adaptive design and variant design based on existing products. In this paper, a conceptual design framework and its flow model for product innovation are put forward by combining conceptual design methods with TRIZ theory. A process system model of innovative design is constructed that includes requirement analysis, total function analysis and decomposition, engineering problem analysis, solution finding for engineering problems, and preliminary design; this lays the basis for the innovative design of existing products.
Exploratory Analysis of Supply Chains in the Defense Industrial Base
2012-04-01
Institute for Defense Analyses report by James R. Dominy, prepared under contract DASW01-04-C-0003, task AH-7-3315, "Exploratory Analysis of Supply Chains in the Defense Industrial Base," for the Director, Industrial Policy. A surviving index fragment references Instruments Industry Group 382 (Laboratory Apparatus and Analytical, Optical, Measuring, and Controlling Instruments), including 3821 (Laboratory Apparatus and Furniture).
Content-based analysis of news video
NASA Astrophysics Data System (ADS)
Yu, Junqing; Zhou, Dongru; Liu, Huayong; Cai, Bo
2001-09-01
In this paper, we present a schema for content-based analysis of broadcast news video. First, we separate commercials from news using audiovisual features. Then, we automatically organize news programs into a content hierarchy at various levels of abstraction via effective integration of video, audio, and text data available from the news programs. Based on these news video structure and content analysis technologies, a TV news video library is built, from which users can retrieve specific news stories according to their demands.
Use of Language Sample Analysis by School-Based SLPs: Results of a Nationwide Survey
ERIC Educational Resources Information Center
Pavelko, Stacey L.; Owens, Robert E., Jr.; Ireland, Marie; Hahs-Vaughn, Debbie L.
2016-01-01
Purpose: This article examines use of language sample analysis (LSA) by school-based speech-language pathologists (SLPs), including characteristics of language samples, methods of transcription and analysis, barriers to LSA use, and factors affecting LSA use, such as American Speech-Language-Hearing Association certification, number of years'…
Web-Based Trainer for Electrical Circuit Analysis
ERIC Educational Resources Information Center
Weyten, L.; Rombouts, P.; De Maeyer, J.
2009-01-01
A Web-based system for training electric circuit analysis is presented in this paper. It is centered on symbolic analysis techniques and it not only verifies the student's final answer, but it also tracks and coaches him/her through all steps of his/her reasoning path. The system mimics homework assignments, enhanced by immediate personalized…
The Development of a Web-Based Virtual Environment for Teaching Qualitative Analysis of Structures
ERIC Educational Resources Information Center
O'Dwyer, D. W.; Logan-Phelan, T. M.; O'Neill, E. A.
2007-01-01
The current paper describes the design and development of a qualitative analysis course and an interactive web-based teaching and assessment tool called VSE (virtual structural environment). The widespread reliance on structural analysis programs requires engineers to be able to verify computer output by carrying out qualitative analyses.…
Data Base Reexamination as Part of IDS Secondary Analysis.
ERIC Educational Resources Information Center
Curry, Blair H.; And Others
Data reexamination is a critical component for any study. The complexity of the study, the time available for data base development and analysis, and the relationship of the study to educational policy-making can all increase the criticality of such reexamination. Analysis of the error levels in the National Institute of Education's Instructional…
Clusters of Occupations Based on Systematically Derived Work Dimensions: An Exploratory Study.
ERIC Educational Resources Information Center
Cunningham, J. W.; And Others
The study explored the feasibility of deriving an educationally relevant occupational cluster structure based on Occupational Analysis Inventory (OAI) work dimensions. A hierarchical cluster analysis was applied to the factor score profiles of 814 occupations on 22 higher-order OAI work dimensions. From that analysis, 73 occupational clusters were…
Content Analysis of a Computer-Based Faculty Activity Repository
ERIC Educational Resources Information Center
Baker-Eveleth, Lori; Stone, Robert W.
2013-01-01
The research presents an analysis of faculty opinions regarding the introduction of a new computer-based faculty activity repository (FAR) in a university setting. The qualitative study employs content analysis to better understand the phenomenon underlying these faculty opinions and to augment the findings from a quantitative study. A web-based…
Hansson, Sven Ove; Aven, Terje
2014-07-01
This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers the five elements evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and that relates these elements to the domains experts and decisionmakers, and to the domains fact-based or value-based. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.
Hopkins, F B; Gravett, M R; Self, A J; Wang, M; Chua, Hoe-Chee; Hoe-Chee, C; Lee, H S Nancy; Sim, N Lee Hoi; Jones, J T A; Timperley, C M; Riches, J R
2014-08-01
Detailed chemical analysis of solutions used to decontaminate chemical warfare agents can be used to support verification and forensic attribution. Decontamination solutions are amongst the most difficult matrices for chemical analysis because of their corrosive and potentially emulsion-based nature. Consequently, there are relatively few publications that report their detailed chemical analysis. This paper describes the application of modern analytical techniques to the analysis of decontamination solutions following decontamination of the chemical warfare agent O-ethyl S-2-diisopropylaminoethyl methylphosphonothiolate (VX). We confirm the formation of N,N-diisopropylformamide and N,N-diisopropylamine following decontamination of VX with hypochlorite-based solution, whereas they were not detected in extracts of hydroxide-based decontamination solutions by nuclear magnetic resonance (NMR) spectroscopy or gas chromatography-mass spectrometry. We report the electron ionisation and chemical ionisation mass spectroscopic details, retention indices, and NMR spectra of N,N-diisopropylformamide and N,N-diisopropylamine, as well as analytical methods suitable for their analysis and identification in solvent extracts and decontamination residues.
Rapid quantification and sex determination of forensic evidence materials.
Andréasson, Hanna; Allen, Marie
2003-11-01
DNA quantification of forensic evidence is very valuable for an optimal use of the available biological material. Moreover, sex determination is of great importance as additional information in criminal investigations as well as in identification of missing persons, no suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for analysis of forensic casework samples.
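The externally standardized kinetic quantification described above amounts to reading copy number off a standard curve relating threshold cycle (Ct) to the log of input copies. A minimal sketch (the slope and intercept values are illustrative, not from the assay; real analysis also involves melting-curve interpretation for the sex call):

```python
from math import log10

def copies_from_ct(ct: float, slope: float, intercept: float) -> float:
    """Interpolate copy number from a Ct value using an external
    standard curve of the form  Ct = slope * log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

def amplification_efficiency(slope: float) -> float:
    """PCR efficiency implied by the standard-curve slope
    (1.0 = perfect doubling per cycle, slope about -3.32)."""
    return 10 ** (-1 / slope) - 1
```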
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, T.F.; Gerhard, M.A.; Trummer, D.J.
CASKS (Computer Analysis of Storage casKS) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for evaluating safety analysis reports on spent-fuel storage casks. The bulk of the complete program and this user's manual are based upon the SCANS (Shipping Cask ANalysis System) program previously developed at LLNL. A number of enhancements and improvements were added to the original SCANS program to meet requirements unique to storage casks. CASKS is an easy-to-use system that calculates global response of storage casks to impact loads, pressure loads and thermal conditions. This provides reviewers with a tool for an independent check on analyses submitted by licensees. CASKS is based on microcomputers compatible with the IBM-PC family of computers. The system is composed of a series of menus, input programs, cask analysis programs, and output display programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests.
An Evidence-Based Videotaped Running Biomechanics Analysis.
Souza, Richard B
2016-02-01
Running biomechanics play an important role in the development of injuries. Performing a running biomechanics analysis on injured runners can help to develop treatment strategies. This article provides a framework for a systematic video-based running biomechanics analysis plan based on the current evidence on running injuries, using 2-dimensional (2D) video and readily available tools. Fourteen measurements are proposed in this analysis plan from lateral and posterior video. Identifying simple 2D surrogates for 3D biomechanical variables of interest allows for widespread translation of best practices and offers the best opportunity to address the highly prevalent problem of the injured runner. Copyright © 2016 Elsevier Inc. All rights reserved.
Ho, Derek; Drake, Tyler K.; Bentley, Rex C.; Valea, Fidel A.; Wax, Adam
2015-01-01
We evaluate a new hybrid algorithm for determining nuclear morphology using angle-resolved low coherence interferometry (a/LCI) measurements in ex vivo cervical tissue. The algorithm combines Mie theory based and continuous wavelet transform inverse light scattering analysis. The hybrid algorithm was validated and compared to traditional Mie theory based analysis using an ex vivo tissue data set. The hybrid algorithm achieved 100% agreement with pathology in distinguishing dysplastic and non-dysplastic biopsy sites in the pilot study. Significantly, the new algorithm performed over four times faster than traditional Mie theory based analysis. PMID:26309741
Valle, Susanne Collier; Støen, Ragnhild; Sæther, Rannei; Jensenius, Alexander Refsum; Adde, Lars
2015-10-01
A computer-based video analysis has recently been presented for quantitative assessment of general movements (GMs). This method's test-retest reliability, however, has not yet been evaluated. The aim of the current study was to evaluate the test-retest reliability of computer-based video analysis of GMs, and to explore the association between computer-based video analysis and the temporal organization of fidgety movements (FMs). Test-retest reliability study. 75 healthy, term-born infants were recorded twice the same day during the FMs period using a standardized video set-up. The computer-based movement variables "quantity of motion mean" (Qmean), "quantity of motion standard deviation" (QSD) and "centroid of motion standard deviation" (CSD) were analyzed, reflecting the amount of motion (Qmean, QSD) and the variability of the spatial center of motion of the infant (CSD). In addition, the association between the variable CSD and the temporal organization of FMs was explored. Intraclass correlation coefficients (ICC 1.1 and ICC 3.1) were calculated to assess test-retest reliability. The ICC values for the variables CSD, Qmean and QSD were 0.80, 0.80 and 0.86 for ICC (1.1), respectively; and 0.80, 0.86 and 0.90 for ICC (3.1), respectively. There were significantly lower CSD values in the recordings with continual FMs compared to the recordings with intermittent FMs (p<0.05). This study showed high test-retest reliability of computer-based video analysis of GMs, and a significant association between our computer-based video analysis and the temporal organization of FMs. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
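The ICC(1,1) values reported above come from a one-way random-effects ANOVA decomposition of the repeated measurements. A minimal sketch of that computation (the toy values below are illustrative, not the study's data):

```python
def icc_1_1(ratings):
    """One-way random-effects ICC(1,1) for n subjects with k repeated
    measurements each, computed from the one-way ANOVA mean squares:
    (MSB - MSW) / (MSB + (k - 1) * MSW)."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(x for row in ratings for x in row) / (n * k)
    means = [sum(row) / k for row in ratings]
    ms_between = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    ms_within = sum((x - m) ** 2
                    for row, m in zip(ratings, means) for x in row) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Two test-retest recordings (k = 2) for three infants (toy values):
reliability = icc_1_1([[10.1, 10.3], [12.0, 11.8], [9.5, 9.6]])
```

ICC(3,1) would instead use a two-way mixed-effects decomposition that removes a systematic session effect.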
de Jong, Jan A Stavenga; Wierstra, Ronny F A; Hermanussen, José
2006-03-01
Research on individual learning approaches (or learning styles) is split in two traditions, one of which is biased towards academic learning, and the other towards learning from direct experience. In the reported study, the two traditions are linked by investigating the relationships between school-based (academic) and work-based (experiential) learning approaches of students in vocational education programs. Participants were 899 students of a Dutch school for secondary vocational education; 758 provided data on school-based learning, and 407 provided data on work-based learning, resulting in an overlap of 266 students from whom data were obtained on learning in both settings. Learning approaches in school and work settings were measured with questionnaires. Using factor analysis and cluster analysis, items and students were grouped, both with respect to school- and work-based learning. The study identified two academic learning dimensions (constructive learning and reproductive learning), and three experiential learning dimensions (analysis, initiative, and immersion). Construction and analysis were correlated positively, and reproduction and initiative negatively. Cluster analysis resulted in the identification of three school-based learning orientations and three work-based learning orientations. The relation between the two types of learning orientations, expressed in Cramér's V, appeared to be weak. It is concluded that learning approaches are relatively context specific, which implies that neither theoretical tradition can claim general applicability.
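Cramér's V, used above to express the strength of association between school-based and work-based learning orientations, follows directly from the chi-square statistic of the cross-tabulation of the two cluster memberships. A minimal sketch (the table below is an arbitrary example, not the study's data):

```python
from math import sqrt

def cramers_v(table):
    """Cramér's V for an r x c contingency table of counts:
    sqrt(chi2 / (n * (min(r, c) - 1))); 0 = independence, 1 = perfect
    association."""
    n = sum(sum(row) for row in table)
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    chi2 = sum((obs - row_tot[i] * col_tot[j] / n) ** 2
               / (row_tot[i] * col_tot[j] / n)
               for i, row in enumerate(table) for j, obs in enumerate(row))
    return sqrt(chi2 / (n * (min(len(row_tot), len(col_tot)) - 1)))
```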
A robust approach for ECG-based analysis of cardiopulmonary coupling.
Zheng, Jiewen; Wang, Weidong; Zhang, Zhengbo; Wu, Dalei; Wu, Hao; Peng, Chung-Kang
2016-07-01
Deriving a respiratory signal from a surface electrocardiogram (ECG) measurement has the advantage of simultaneously monitoring cardiac and respiratory activities. ECG-based cardiopulmonary coupling (CPC) analysis estimated by heart period variability and ECG-derived respiration (EDR) shows promising applications in the medical field. The aim of this paper is to provide a quantitative analysis of ECG-based CPC, and further improve its performance. Two conventional strategies were tested to obtain the EDR signal: R-S wave amplitude and area of the QRS complex. An adaptive filter was utilized to extract the common component of the inter-beat interval (RRI) and EDR, generating enhanced versions of the EDR signal. CPC is assessed by probing the nonlinear phase interactions between the RRI series and the respiratory signal. Respiratory oscillations present in both RRI series and respiratory signals were extracted by ensemble empirical mode decomposition for coupling analysis via a phase synchronization index. The results demonstrated that CPC estimated from conventional EDR series exhibits constant and proportional biases, while that estimated from enhanced EDR series is more reliable. Adaptive filtering can improve the accuracy of the ECG-based CPC estimation significantly and achieve robust CPC analysis. The improved ECG-based CPC estimation may provide additional prognostic information for both sleep medicine and autonomic function analysis. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
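The phase synchronization index used above reduces to the modulus of the mean unit phasor of the instantaneous phase difference between the two oscillations. In the sketch below the phases are supplied directly as arrays; in the paper they would come from the EEMD-extracted respiratory oscillations (e.g. via an analytic-signal transform), which is omitted here:

```python
from cmath import exp as cexp
from math import pi

def phase_sync_index(phases_a, phases_b) -> float:
    """Phase synchronization index: |mean of exp(i * (phi_a - phi_b))|.
    1 means perfectly phase-locked, 0 means no phase relationship."""
    n = len(phases_a)
    return abs(sum(cexp(1j * (a - b)) for a, b in zip(phases_a, phases_b)) / n)

# Two oscillations with a constant phase lag are perfectly synchronized:
t = [2 * pi * 0.25 * k / 100 for k in range(1000)]
locked = phase_sync_index(t, [p + 0.8 for p in t])
```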
Haile, Jemanesh K.; Cory, Aron T.; Clarke, Fran R.; Clarke, John M.; Knox, Ron E.; Pozniak, Curtis J.
2017-01-01
Association mapping is usually performed by testing the correlation between a single marker and phenotypes. However, because patterns of variation within genomes are inherited as blocks, clustering markers into haplotypes for genome-wide scans could be a worthwhile approach to improve statistical power to detect associations. The availability of high-density molecular data allows the possibility to assess the potential of both approaches to identify marker-trait associations in durum wheat. In the present study, we used single marker- and haplotype-based approaches to identify loci associated with semolina and pasta colour in durum wheat, the main objective being to evaluate the potential benefits of haplotype-based analysis for identifying quantitative trait loci. One hundred sixty-nine durum lines were genotyped using the Illumina 90K Infinium iSelect assay, and 12,234 polymorphic single nucleotide polymorphism (SNP) markers were generated and used to assess the population structure and the linkage disequilibrium (LD) patterns. A total of 8,581 SNPs previously localized to a high-density consensus map were clustered into 406 haplotype blocks based on the average LD distance of 5.3 cM. Combining multiple SNPs into haplotype blocks increased the average polymorphism information content (PIC) from 0.27 per SNP to 0.50 per haplotype. The haplotype-based analysis identified 12 loci associated with grain pigment colour traits, including the five loci identified by the single marker-based analysis. Furthermore, the haplotype-based analysis resulted in an increase of the phenotypic variance explained (50.4% on average) and the allelic effect (33.7% on average) when compared to single marker analysis. The presence of multiple allelic combinations within each haplotype locus offers potential for screening the most favorable haplotype series and may facilitate marker-assisted selection of grain pigment colour in durum wheat. 
These results suggest a benefit of haplotype-based analysis over single marker analysis to detect loci associated with colour traits in durum wheat. PMID:28135299
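The jump in average PIC from 0.27 per SNP to 0.50 per haplotype block reported above is expected: a biallelic marker caps PIC at 0.375, whereas multi-allelic haplotype loci can exceed that. The standard PIC formula can be sketched as follows (the frequencies below are illustrative):

```python
def pic(freqs) -> float:
    """Polymorphism information content for allele frequencies p_i:
    PIC = 1 - sum(p_i^2) - sum_{i<j} 2 * p_i^2 * p_j^2."""
    s2 = sum(p * p for p in freqs)
    cross = sum(2 * freqs[i] ** 2 * freqs[j] ** 2
                for i in range(len(freqs)) for j in range(i + 1, len(freqs)))
    return 1 - s2 - cross

# A biallelic SNP maxes out at p = q = 0.5:
snp_max = pic([0.5, 0.5])            # 0.375
# A four-allele haplotype locus with equal frequencies scores higher:
hap = pic([0.25, 0.25, 0.25, 0.25])
```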
Ishikawa, Sohta A; Inagaki, Yuji; Hashimoto, Tetsuo
2012-01-01
In phylogenetic analyses of nucleotide sequences, 'homogeneous' substitution models, which assume stationary base composition across a tree, are widely used, even though individual sequences may bear distinctive base frequencies. In the worst-case scenario, a homogeneous model-based analysis can yield an artifactual union of two distantly related sequences that achieved similar base frequencies in parallel. This potential difficulty can be countered by two approaches, 'RY-coding' and 'non-homogeneous' models. The former converts the four bases into purines and pyrimidines to normalize base frequencies across a tree, while the latter explicitly incorporates the heterogeneity in base frequency. Both approaches have been applied to real-world sequence data; however, their basic properties have not been fully examined in simulation studies. Here, we assessed the performances of maximum-likelihood analyses incorporating RY-coding and a non-homogeneous model (RY-coding and non-homogeneous analyses) on data simulated with parallel convergence to similar base composition. Both RY-coding and non-homogeneous analyses showed superior performance compared with homogeneous model-based analyses. Curiously, the performance of the RY-coding analysis appeared to be significantly more sensitive to the substitution process assumed in the sequence simulation than that of the non-homogeneous analysis. The performance of the non-homogeneous analysis was also validated by analyzing a real-world sequence data set with significant base heterogeneity.
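The RY-coding step described above is a simple recoding of the alphabet; a minimal sketch (the `?` fallback for ambiguity codes is an assumption of this sketch, not part of the paper):

```python
# Collapse purines (A, G) to R and pyrimidines (C, T/U) to Y,
# normalizing base composition across taxa before tree inference.
RY_MAP = {"A": "R", "G": "R", "C": "Y", "T": "Y", "U": "Y"}

def ry_encode(seq):
    """RY-code a nucleotide sequence; unknown symbols become '?'."""
    return "".join(RY_MAP.get(b.upper(), "?") for b in seq)

ry_encode("ACGT")  # 'RYRY'
```

After recoding, two sequences that converged on similar A+G content become indistinguishable in composition, which is exactly the normalization the approach relies on.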
Adopting exergy analysis for use in aerospace
NASA Astrophysics Data System (ADS)
Hayes, David; Lone, Mudassir; Whidborne, James F.; Camberos, José; Coetzee, Etienne
2017-08-01
Thermodynamic analysis methods, based on an exergy metric, have been developed to improve system efficiency of traditional heat driven systems such as ground based power plants and aircraft propulsion systems. However, in more recent years interest in the topic has broadened to include applying these second law methods to the field of aerodynamics and complete aerospace vehicles. Work to date is based on highly simplified structures, but such a method could be shown to have benefit to the highly conservative and risk averse commercial aerospace sector. This review justifies how thermodynamic exergy analysis has the potential to facilitate a breakthrough in the optimization of aerospace vehicles based on a system of energy systems, through studying the exergy-based multidisciplinary design of future flight vehicles.
Base compaction specification feasibility analysis.
DOT National Transportation Integrated Search
2012-12-01
The objective of this research is to establish the technical engineering and cost analysis concepts that will enable WisDOT management to objectively evaluate the feasibility of switching construction specification philosophies for aggregate base...
Minimum number of measurements for evaluating soursop (Annona muricata L.) yield.
Sánchez, C F B; Teodoro, P E; Londoño, S; Silva, L A; Peixoto, L A; Bhering, L L
2017-05-31
Repeatability studies on fruit species are of great importance to identify the minimum number of measurements necessary to accurately select superior genotypes. This study aimed to identify the most efficient method to estimate the repeatability coefficient (r) and predict the minimum number of measurements needed for a more accurate evaluation of soursop (Annona muricata L.) genotypes based on fruit yield. Sixteen measurements of fruit yield from 71 soursop genotypes were carried out between 2000 and 2016. In order to estimate r with the best accuracy, four procedures were used: analysis of variance, principal component analysis based on the correlation matrix, principal component analysis based on the phenotypic variance and covariance matrix, and structural analysis based on the correlation matrix. The minimum number of measurements needed to predict the actual value of individuals was estimated. Principal component analysis using the phenotypic variance and covariance matrix provided the most accurate estimates of both r and the number of measurements required for accurate evaluation of fruit yield in soursop. Our results indicate that selection of soursop genotypes with high fruit yield can be performed based on the third and fourth measurements in the early years and/or based on the eighth and ninth measurements at more advanced stages.
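Of the four procedures compared, the analysis-of-variance estimator is the simplest; a sketch for a balanced genotype-by-measurement table (illustrative only, the study's preferred estimator used principal components of the covariance matrix):

```python
def repeatability(data):
    """One-way ANOVA estimate of the repeatability coefficient r for
    data[genotype][measurement] (balanced design): r = var_g / (var_g + var_e)."""
    g, m = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (g * m)
    means = [sum(row) / m for row in data]
    # mean squares among genotypes and within genotypes (residual)
    msg = m * sum((mu - grand) ** 2 for mu in means) / (g - 1)
    mse = sum((y - mu) ** 2
              for row, mu in zip(data, means) for y in row) / (g * (m - 1))
    var_g = (msg - mse) / m  # genotypic variance component
    return var_g / (var_g + mse)
```

A genotype whose yield is identical in every measurement gives r = 1; noisier repeated measurements pull r toward 0, increasing the number of measurements needed for reliable selection.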
Structural reliability analysis under evidence theory using the active learning kriging model
NASA Astrophysics Data System (ADS)
Yang, Xufeng; Liu, Yongshou; Ma, Panke
2017-11-01
Structural reliability analysis under evidence theory is investigated. It is rigorously proved that a surrogate model providing only correct sign prediction of the performance function can meet the accuracy requirement of evidence-theory-based reliability analysis. Accordingly, a method based on the active learning kriging model which only correctly predicts the sign of the performance function is proposed. Interval Monte Carlo simulation and a modified optimization method based on Karush-Kuhn-Tucker conditions are introduced to make the method more efficient in estimating the bounds of failure probability based on the kriging model. Four examples are investigated to demonstrate the efficiency and accuracy of the proposed method.
Feasibility and demonstration of a cloud-based RIID analysis system
NASA Astrophysics Data System (ADS)
Wright, Michael C.; Hertz, Kristin L.; Johnson, William C.; Sword, Eric D.; Younkin, James R.; Sadler, Lorraine E.
2015-06-01
A significant limitation in the operational utility of handheld and backpack radioisotope identifiers (RIIDs) is the inability of their onboard algorithms to accurately and reliably identify the isotopic sources of the measured gamma-ray energy spectrum. A possible solution is to move the spectral analysis computations to an external device, the cloud, where significantly greater capabilities are available. The implementation and demonstration of a prototype cloud-based RIID analysis system have shown this type of system to be feasible with currently available communication and computational technology. A system study has shown that the potential user community could derive significant benefits from an appropriately implemented cloud-based analysis system and has identified the design and operational characteristics required by the users and stakeholders for such a system. A general description of the hardware and software necessary to implement reliable cloud-based analysis, the value of the cloud expressed by the user community, and the aspects of the cloud implemented in the demonstrations are discussed.
RDNAnalyzer: A tool for DNA secondary structure prediction and sequence analysis
Afzal, Muhammad; Shahid, Ahmad Ali; Shehzadi, Abida; Nadeem, Shahid; Husnain, Tayyab
2012-01-01
RDNAnalyzer is an innovative computer-based tool designed for DNA secondary structure prediction and sequence analysis. It can randomly generate a DNA sequence, or the user can upload sequences of interest in RAW format. It uses and extends the Nussinov dynamic programming algorithm and has various applications for sequence analysis. It predicts the DNA secondary structure and base pairings. It also provides tools for sequence analyses routinely performed by biological scientists, such as DNA replication, reverse complement generation, transcription, translation, sequence-specific information (total number of nucleotide bases and ATGC base contents along with their respective percentages), and a sequence cleaner. RDNAnalyzer is a unique tool developed in Microsoft Visual Studio 2008 using Microsoft Visual C# and Windows Presentation Foundation and provides a user-friendly environment for sequence analysis. It is freely available. Availability: http://www.cemb.edu.pk/sw.html Abbreviations: RDNAnalyzer - Random DNA Analyser, GUI - Graphical user interface, XAML - Extensible Application Markup Language. PMID:23055611
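Two of the routine analyses the tool offers, reverse complement generation and ATGC base content, are standard computations; a Python sketch of the equivalents (the tool itself is a C#/WPF application, so these function names are illustrative):

```python
COMPLEMENT = str.maketrans("ATGCatgc", "TACGtacg")

def reverse_complement(seq):
    """Complement each base, then reverse the strand."""
    return seq.translate(COMPLEMENT)[::-1]

def base_content(seq):
    """Count and percentage of each of A, T, G, C in the sequence."""
    s = seq.upper()
    total = len(s)
    return {b: (s.count(b), 100.0 * s.count(b) / total) for b in "ATGC"}

reverse_complement("ATGC")  # 'GCAT'
```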
On developing the local research environment of the 1990s - The Space Station era
NASA Technical Reports Server (NTRS)
Chase, Robert; Ziel, Fred
1989-01-01
A requirements analysis for the Space Station's polar platform data system has been performed. Based upon this analysis, cluster, layered-cluster, and layered-modular implementations of one specific module within the Eos Data and Information System (EosDIS), an active data base for satellite remote sensing research, have been developed. It is found that a distributed system based on a layered-modular architecture and employing current-generation workstation technologies has the requisite attributes ascribed by the remote sensing research community. However, benchmark testing, probabilistic analysis, failure analysis, and user-survey analysis show that this architecture presents some operational shortcomings that will not be alleviated by new hardware or software developments. Consequently, the potential of a fully modular layered architectural design for meeting the needs of Eos researchers has also been evaluated, concluding that it would be well suited to the evolving requirements of this multidisciplinary research community.
A comparison of analysis methods to estimate contingency strength.
Lloyd, Blair P; Staubitz, Johanna L; Tapp, Jon T
2018-05-09
To date, several data analysis methods have been used to estimate contingency strength, yet few studies have compared these methods directly. To compare the relative precision and sensitivity of four analysis methods (i.e., exhaustive event-based, nonexhaustive event-based, concurrent interval, concurrent+lag interval), we applied all methods to a simulated data set in which several response-dependent and response-independent schedules of reinforcement were programmed. We evaluated the degree to which contingency strength estimates produced from each method (a) corresponded with expected values for response-dependent schedules and (b) showed sensitivity to parametric manipulations of response-independent reinforcement. Results indicated both event-based methods produced contingency strength estimates that aligned with expected values for response-dependent schedules, but differed in sensitivity to response-independent reinforcement. The precision of interval-based methods varied by analysis method (concurrent vs. concurrent+lag) and schedule type (continuous vs. partial), and showed similar sensitivities to response-independent reinforcement. Recommendations and considerations for measuring contingencies are identified. © 2018 Society for the Experimental Analysis of Behavior.
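An event-based contingency strength estimate can be illustrated with a delta-p statistic over discretized observation bins, i.e. P(outcome | response) minus P(outcome | no response). This is a sketch in the spirit of the compared methods, not the paper's exact formulation:

```python
def delta_p(responses, outcomes):
    """Contingency strength as P(O|R) - P(O|~R) over paired binary bins.
    1.0 = perfectly response-dependent; 0.0 = response-independent."""
    with_r = [o for r, o in zip(responses, outcomes) if r]
    without_r = [o for r, o in zip(responses, outcomes) if not r]
    p_given_r = sum(with_r) / len(with_r) if with_r else 0.0
    p_given_not_r = sum(without_r) / len(without_r) if without_r else 0.0
    return p_given_r - p_given_not_r
```

A response-independent schedule, where the outcome occurs regardless of responding, drives the estimate toward zero, which is the sensitivity the study probed parametrically.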
NASA Technical Reports Server (NTRS)
Hill, Geoffrey A.; Olson, Erik D.
2004-01-01
Due to the growing problem of noise in today's air transportation system, there have arisen needs to incorporate noise considerations in the conceptual design of revolutionary aircraft. Through the use of response surfaces, complex noise models may be converted into polynomial equations for rapid and simplified evaluation. This conversion allows many of the commonly used response surface-based trade space exploration methods to be applied to noise analysis. This methodology is demonstrated using a noise model of a notional 300 passenger Blended-Wing-Body (BWB) transport. Response surfaces are created relating source noise levels of the BWB vehicle to its corresponding FAR-36 certification noise levels and the resulting trade space is explored. Methods demonstrated include: single point analysis, parametric study, an optimization technique for inverse analysis, sensitivity studies, and probabilistic analysis. Extended applications of response surface-based methods in noise analysis are also discussed.
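The core idea, replacing an expensive noise model with a fitted polynomial for rapid trade-space queries, can be sketched in one dimension (the variable and the quadratic stand-in below are hypothetical, not the BWB model's actual response):

```python
import numpy as np

# Sample the expensive model over the design range once...
x = np.linspace(0.0, 1.0, 20)        # e.g. a normalized design variable
y = 80.0 + 5.0 * x + 3.0 * x**2      # stand-in for certification noise (EPNdB)

# ...then fit a quadratic response surface by least squares.
coeffs = np.polyfit(x, y, deg=2)
surface = np.poly1d(coeffs)

surface(0.5)  # fast single-point evaluation replaces a full model run
```

Once fitted, the polynomial supports the listed uses directly: parametric sweeps are cheap evaluations, sensitivities are its derivatives, and probabilistic analysis samples it thousands of times at negligible cost.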
NASA Astrophysics Data System (ADS)
Yuhendar, A. H.; Wusqa, U.; Kartiko, R. D.; Raya, N. R.; Misbahudin
2016-05-01
A large-scale landslide occurred in Margamukti village, Pangalengan, Bandung Regency, West Java Province, Indonesia. The landslide damaged 300 m of geothermal gas pipeline in the Wayang Windu Geothermal Field. Based on field observation, the landslide occurred as a rotational sliding movement. Laboratory analyses were conducted to obtain the characteristics of the soil. Given the condition of the landslide in this area, the Factor of Safety can be simulated with a soil mechanics approach. Factor of safety analysis based on soil cohesion and internal friction angle was conducted using manual sensitivity analysis for back analysis. The analysis indicated that the soil cohesion at the critical condition (FS < 1) is 6.01 kPa, which is smaller than the cohesion of the undisturbed slope soil sample. Rainfall water is the most important instability factor in the research area, because it decreases cohesion in the soil and increases weight and pore water pressure in the granular media.
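The back-analysis logic, varying cohesion until FS drops to 1, can be sketched with the simpler infinite-slope formula (the study analyzed a rotational slide, and all parameter values below are hypothetical; this only illustrates the sensitivity procedure):

```python
import math

def factor_of_safety(c, phi_deg, gamma=18.0, z=5.0, beta_deg=30.0):
    """Infinite-slope FS: (c + gamma*z*cos^2(beta)*tan(phi)) /
    (gamma*z*sin(beta)*cos(beta)); c in kPa, gamma in kN/m^3, z in m."""
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    resisting = c + gamma * z * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

def critical_cohesion(phi_deg, lo=0.0, hi=100.0, **kw):
    """Back analysis by bisection: the cohesion at which FS = 1."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if factor_of_safety(mid, phi_deg, **kw) < 1.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Any cohesion below the critical value leaves FS < 1, which mirrors the study's finding that rainfall-driven loss of cohesion destabilizes the slope.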
Double symbolic joint entropy in nonlinear dynamic complexity analysis
NASA Astrophysics Data System (ADS)
Yao, Wenpo; Wang, Jun
2017-07-01
Symbolization, the basis of symbolic dynamic analysis, can be classified into global static and local dynamic approaches, which we combine via joint entropy for nonlinear dynamic complexity analysis. Two global static methods (the symbolic transformations of Wessel N. symbolic entropy and base-scale entropy) and two local dynamic methods (the symbolizations of permutation and differential entropy) constitute four double symbolic joint entropies that accurately detect complexity in chaotic models, the logistic and Henon map series. In nonlinear dynamical analysis of different kinds of heart rate variability, the heartbeats of the healthy young have higher complexity than those of the healthy elderly, and congestive heart failure (CHF) patients have the lowest joint entropy values. Each individual symbolic entropy is improved by double symbolic joint entropy, among which the combination of base-scale and differential symbolizations yields the best complexity analysis. Test results show that double symbolic joint entropy is feasible for nonlinear dynamic complexity analysis.
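The joint entropy used to combine two symbolizations is plain Shannon joint entropy over the paired symbol streams; a minimal sketch (the specific symbolization schemes themselves are not reproduced here):

```python
import math
from collections import Counter

def joint_entropy(xs, ys):
    """Shannon joint entropy in bits of two equal-length symbol sequences,
    e.g. one global static and one local dynamic symbolization of a series."""
    pairs = Counter(zip(xs, ys))
    n = len(xs)
    return -sum((c / n) * math.log2(c / n) for c in pairs.values())
```

A constant pairing gives 0 bits, while richer joint symbol distributions, as in healthy young heartbeat series, yield higher values, matching the ordering reported in the abstract.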
A Cognitive Approach to Teaching a Graduate-Level GEOBIA Course
NASA Astrophysics Data System (ADS)
Bianchetti, Raechel A.
2016-06-01
Remote sensing image analysis training occurs both in the classroom and the research lab. Classroom education for traditional pixel-based image analysis has been standardized across college curriculums. However, with the increasing interest in Geographic Object-Based Image Analysis (GEOBIA), there is a need to develop classroom instruction for this method of image analysis. While traditional remote sensing courses emphasize the expansion of skills and knowledge related to the use of computer-based analysis, GEOBIA courses should examine the cognitive factors underlying visual interpretation. This paper provides an initial analysis of the development, implementation, and outcomes of a GEOBIA course that considers not only the computational methods of GEOBIA, but also the cognitive factors of expertise that such software attempts to replicate. Finally, a reflection on the first instantiation of this course is presented, in addition to plans for development of an open-source repository for course materials.
Retinal status analysis method based on feature extraction and quantitative grading in OCT images.
Fu, Dongmei; Tong, Hejun; Zheng, Shuang; Luo, Ling; Gao, Fulin; Minar, Jiri
2016-07-22
Optical coherence tomography (OCT) is widely used in ophthalmology for viewing the morphology of the retina, which is important for disease detection and assessing therapeutic effect. The diagnosis of retinal diseases is based primarily on the subjective analysis of OCT images by trained ophthalmologists. This paper describes an automatic analysis method for OCT images intended for computer-aided disease diagnosis, a critical part of eye fundus diagnosis. The study analyzed 300 OCT images acquired by an Optovue Avanti RTVue XR (Optovue Corp., Fremont, CA). First, a normal retinal reference model based on retinal boundaries was presented. Subsequently, two kinds of quantitative methods, based on geometric features and morphological features respectively, were proposed, together with a retinal abnormality grading decision-making method used in the analysis and evaluation of multiple OCT images. The analysis process is illustrated in detail with four retinal OCT images of differing degrees of abnormality, and the grading results verify that the method can distinguish abnormal severity and lesion regions. In a simulation on 150 test images, the analysis of retinal status achieved a sensitivity of 0.94 and a specificity of 0.92. By obtaining parameters and features associated with retinal morphology, and combining their quantitative analysis and evaluation with the reference model, the proposed method can judge abnormality in a target image, speed up the diagnostic process, objectively evaluate retinal status, and provide a reference for disease diagnosis.
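The reported sensitivity and specificity follow from the standard confusion-matrix definitions; a small sketch of how such figures are computed from binary grading decisions (the example vectors are illustrative, not the study's data):

```python
def sensitivity_specificity(predicted, actual):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP),
    with 1 = abnormal retina and 0 = normal."""
    tp = sum(p and a for p, a in zip(predicted, actual))
    tn = sum((not p) and (not a) for p, a in zip(predicted, actual))
    fp = sum(p and (not a) for p, a in zip(predicted, actual))
    fn = sum((not p) and a for p, a in zip(predicted, actual))
    return tp / (tp + fn), tn / (tn + fp)
```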
ERIC Educational Resources Information Center
Mun, Eun Young; von Eye, Alexander; Bates, Marsha E.; Vaschillo, Evgeny G.
2008-01-01
Model-based cluster analysis is a new clustering procedure to investigate population heterogeneity utilizing finite mixture multivariate normal densities. It is an inferentially based, statistically principled procedure that allows comparison of nonnested models using the Bayesian information criterion to compare multiple models and identify the…
This project will develop a model for place-based green building guidelines based on an analysis of local environmental, social, and land use conditions. The ultimate goal of this project is to develop a methodology and model for placing green buildings within their local cont...
Students' Understanding of Acid, Base and Salt Reactions in Qualitative Analysis.
ERIC Educational Resources Information Center
Tan, Kim-Chwee Daniel; Goh, Ngoh-Khang; Chia, Lian-Sai; Treagust, David F.
2003-01-01
Uses a two-tier, multiple-choice diagnostic instrument to determine (n=915) grade 10 students' understanding of the acid, base, and salt reactions involved in basic qualitative analysis. Reports that many students did not understand the formation of precipitates and the complex salts, acid/salt-base reactions, and thermal decomposition involved in…
ERIC Educational Resources Information Center
Zhang, Zhidong
2016-01-01
This study explored an alternative assessment procedure to examine learning trajectories of matrix multiplication. It took rule-based analytical and cognitive task analysis methods specifically to break down operation rules for a given matrix multiplication. Based on the analysis results, a hierarchical Bayesian network, an assessment model,…
Case Problems for Problem-Based Pedagogical Approaches: A Comparative Analysis
ERIC Educational Resources Information Center
Dabbagh, Nada; Dass, Susan
2013-01-01
A comparative analysis of 51 case problems used in five problem-based pedagogical models was conducted to examine whether there are differences in their characteristics and the implications of such differences on the selection and generation of ill-structured case problems. The five pedagogical models were: situated learning, goal-based scenario,…
Analysis of the Accuracy of a Proposed Target Motion Analysis Procedure
1989-09-01
ERIC Educational Resources Information Center
Ridley, D. Scott; Hurwitz, Sally; Hackett, Mary Ruth Davis; Miller, Kari Knutson
2005-01-01
To date, the professional development school (PDS) preservice teacher preparation literature base is long on attitudinal analysis and short on comparative analysis of outcome variables. This article reports on a 2-year study comparing the lesson planning, teaching effectiveness, postlesson reflectivity, and content retention of professional…
Network-based machine learning and graph theory algorithms for precision oncology.
Zhang, Wei; Chien, Jeremy; Yong, Jeongsik; Kuang, Rui
2017-01-01
Network-based analytics plays an increasingly important role in precision oncology. Growing evidence in recent studies suggests that cancer can be better understood through mutated or dysregulated pathways or networks rather than individual mutations and that the efficacy of repositioned drugs can be inferred from disease modules in molecular networks. This article reviews network-based machine learning and graph theory algorithms for integrative analysis of personal genomic data and biomedical knowledge bases to identify tumor-specific molecular mechanisms, candidate targets and repositioned drugs for personalized treatment. The review focuses on the algorithmic design and mathematical formulation of these methods to facilitate applications and implementations of network-based analysis in the practice of precision oncology. We review the methods applied in three scenarios to integrate genomic data and network models in different analysis pipelines, and we examine three categories of network-based approaches for repositioning drugs in drug-disease-gene networks. In addition, we perform a comprehensive subnetwork/pathway analysis of mutations in 31 cancer genome projects in the Cancer Genome Atlas and present a detailed case study on ovarian cancer. Finally, we discuss interesting observations, potential pitfalls and future directions in network-based precision oncology.
pyAudioAnalysis: An Open-Source Python Library for Audio Signal Analysis.
Giannakopoulos, Theodoros
2015-01-01
Audio information plays a rather important role in the increasing digital content that is available today, resulting in a need for methodologies that automatically analyze such content: audio event recognition for home automations and surveillance systems, speech recognition, music information retrieval, multimodal analysis (e.g. audio-visual analysis of online videos for content-based recommendation), etc. This paper presents pyAudioAnalysis, an open-source Python library that provides a wide range of audio analysis procedures including: feature extraction, classification of audio signals, supervised and unsupervised segmentation and content visualization. pyAudioAnalysis is licensed under the Apache License and is available at GitHub (https://github.com/tyiannak/pyAudioAnalysis/). Here we present the theoretical background behind the wide range of the implemented methodologies, along with evaluation metrics for some of the methods. pyAudioAnalysis has been already used in several audio analysis research applications: smart-home functionalities through audio event detection, speech emotion recognition, depression classification based on audio-visual features, music segmentation, multimodal content-based movie recommendation and health applications (e.g. monitoring eating habits). The feedback provided from all these particular audio applications has led to practical enhancement of the library.
PMID:26656189
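The feature-extraction stage such a library implements can be illustrated with two classic short-term features, frame energy and zero-crossing rate. This is an illustrative numpy sketch, not pyAudioAnalysis's actual API:

```python
import numpy as np

def short_term_features(signal, fs, win=0.050, step=0.025):
    """Slide a window over the signal and compute per-frame energy and
    zero-crossing rate, two basic short-term audio features."""
    w, s = int(win * fs), int(step * fs)
    feats = []
    for start in range(0, len(signal) - w + 1, s):
        frame = signal[start:start + w]
        energy = float(np.mean(frame ** 2))
        # fraction of adjacent samples whose sign changes
        zcr = float(np.mean(np.abs(np.diff(np.sign(frame))) > 0))
        feats.append((energy, zcr))
    return feats
```

Sequences of such frame-level vectors are what the classification and segmentation stages mentioned above consume; real libraries add many more features (MFCCs, spectral centroid, chroma, etc.).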
iSeq: Web-Based RNA-seq Data Analysis and Visualization.
Zhang, Chao; Fan, Caoqi; Gan, Jingbo; Zhu, Ping; Kong, Lei; Li, Cheng
2018-01-01
Transcriptome sequencing (RNA-seq) is becoming a standard experimental methodology for genome-wide characterization and quantification of transcripts at single base-pair resolution. However, downstream analysis of massive amounts of sequencing data can be prohibitively technical for wet-lab researchers. A functionally integrated and user-friendly platform is required to meet this demand. Here, we present iSeq, an R-based Web server for RNA-seq data analysis and visualization. iSeq is a streamlined Web-based R application under the Shiny framework, featuring a simple user interface and multiple data analysis modules. Users without programming and statistical skills can analyze their RNA-seq data and construct publication-level graphs through a standardized yet customizable analytical pipeline. iSeq is accessible via Web browsers on any operating system at http://iseq.cbi.pku.edu.cn .
Shinde, V; Burke, K E; Chakravarty, A; Fleming, M; McDonald, A A; Berger, A; Ecsedy, J; Blakemore, S J; Tirrell, S M; Bowman, D
2014-01-01
Immunohistochemistry-based biomarkers are commonly used to understand target inhibition in key cancer pathways in preclinical models and clinical studies. Automated slide-scanning and advanced high-throughput image analysis software technologies have evolved into a routine methodology for quantitative analysis of immunohistochemistry-based biomarkers. Alongside the traditional pathology H-score based on physical slides, the pathology world is welcoming digital pathology and advanced quantitative image analysis, which have enabled tissue- and cellular-level analysis. An automated workflow was implemented that includes automated staining, slide-scanning, and image analysis methodologies to explore biomarkers involved in 2 cancer targets: Aurora A and NEDD8-activating enzyme (NAE). The 2 workflows highlight the evolution of our immunohistochemistry laboratory and the different needs and requirements of each biological assay. Skin biopsies obtained from MLN8237 (Aurora A inhibitor) phase 1 clinical trials were evaluated for mitotic and apoptotic index, while mitotic index and defects in chromosome alignment and spindles were assessed in tumor biopsies to demonstrate Aurora A inhibition. Additionally, in both preclinical xenograft models and an acute myeloid leukemia phase 1 trial of the NAE inhibitor MLN4924, development of a novel image algorithm enabled measurement of downstream pathway modulation upon NAE inhibition. In the highlighted studies, developing a biomarker strategy based on automated image analysis solutions enabled project teams to confirm target and pathway inhibition and understand downstream outcomes of target inhibition with increased throughput and quantitative accuracy. These case studies demonstrate a strategy that combines a pathologist's expertise with automated image analysis to support oncology drug discovery and development programs.
Fusing Symbolic and Numerical Diagnostic Computations
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAM), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference-engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.
Sparse dictionary learning for resting-state fMRI analysis
NASA Astrophysics Data System (ADS)
Lee, Kangjoo; Han, Paul Kyu; Ye, Jong Chul
2011-09-01
Recently, there has been increased interest in the use of neuroimaging techniques to investigate what happens in the brain at rest. Functional imaging studies have revealed that default-mode network activity is disrupted in Alzheimer's disease (AD). However, there is no consensus, as yet, on the choice of analysis method for the application of resting-state analysis to disease classification. This paper proposes a novel compressed sensing based resting-state fMRI analysis tool called Sparse-SPM. Because the brain's functional systems have been shown to exhibit features of complex networks according to graph theoretical analysis, we apply a graph model to represent a sparse combination of information flows from a complex-network perspective. In particular, a new concept of a spatially adaptive design matrix is proposed, implemented through sparse dictionary learning. The proposed approach shows better performance in classifying AD patients from normal subjects using resting-state analysis compared with conventional methods such as independent component analysis (ICA) and the seed-based approach.
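The sparse-representation step at the heart of such a design matrix can be sketched with a toy matching-pursuit coder: given a dictionary of atoms, greedily pick the atom most correlated with the residual. This is a stand-in for the paper's sparse dictionary learning, not its actual algorithm:

```python
import numpy as np

def matching_pursuit(D, x, k=3):
    """Greedy k-sparse coding of signal x over dictionary D
    (columns are atoms, assumed unit-norm)."""
    residual = x.astype(float).copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(k):
        scores = D.T @ residual                 # correlation with each atom
        j = int(np.argmax(np.abs(scores)))      # best-matching atom
        coeffs[j] += scores[j]
        residual -= scores[j] * D[:, j]         # explain away its contribution
    return coeffs
```

In the fMRI setting, each voxel time course is explained by a sparse combination of learned temporal atoms, rather than by the fixed regressors of a conventional SPM design matrix.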
NASA Astrophysics Data System (ADS)
Huang, Shaohua; Wang, Lan; Chen, Weisheng; Feng, Shangyuan; Lin, Juqiang; Huang, Zufang; Chen, Guannan; Li, Buhong; Chen, Rong
2014-11-01
Non-invasive esophagus cancer detection based on urine surface-enhanced Raman spectroscopy (SERS) analysis is presented. Urine SERS spectra were measured from esophagus cancer patients (n = 56) and healthy volunteers (n = 36) for control analysis. Tentative assignments of the urine SERS spectra indicated some interesting esophagus cancer-specific biomolecular changes, including a decrease in the relative content of urea and an increase in the percentage of uric acid in the urine of esophagus cancer patients compared to that of healthy subjects. Principal component analysis (PCA) combined with linear discriminant analysis (LDA) was employed to analyze and differentiate the SERS spectra of normal and esophagus cancer urine. The diagnostic algorithms utilizing this multivariate analysis method achieved a diagnostic sensitivity of 89.3% and specificity of 83.3% for separating esophagus cancer samples from normal urine samples. These results from this exploratory work suggest that silver nanoparticle-based urine SERS analysis coupled with PCA-LDA multivariate analysis has potential for non-invasive detection of esophagus cancer.
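The PCA-LDA scheme reduces each spectrum to a few principal-component scores and then finds the direction that best separates the two classes. A compact numpy sketch of the two stages (a two-class Fisher discriminant stands in for the paper's LDA; the synthetic "spectra" below are hypothetical):

```python
import numpy as np

def pca_scores(X, n_components):
    """Project mean-centered spectra (rows = samples) onto the top
    principal components via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def fisher_lda_direction(scores, labels):
    """Two-class Fisher discriminant on the PCA scores:
    w = Sw^-1 (mean_1 - mean_0)."""
    a, b = scores[labels == 0], scores[labels == 1]
    Sw = np.cov(a, rowvar=False) + np.cov(b, rowvar=False)
    return np.linalg.solve(np.atleast_2d(Sw), b.mean(0) - a.mean(0))
```

Projecting each sample's scores onto `w` gives a single discriminant value, and a threshold on that value yields the sensitivity/specificity figures quoted above.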
NASA Astrophysics Data System (ADS)
Norinder, Ulf
1990-12-01
An experimental-design-based 3-D QSAR analysis, using a combination of principal component and PLS analysis, is presented and applied to human corticosteroid-binding globulin complexes. The predictive capability of the resulting model is good. The technique can also be used to guide the selection of new compounds for investigation.
ERIC Educational Resources Information Center
Flowers, Claudia P.; Raju, Nambury S.; Oshima, T. C.
Current interest in the assessment of measurement equivalence emphasizes two methods of analysis, linear, and nonlinear procedures. This study simulated data using the graded response model to examine the performance of linear (confirmatory factor analysis or CFA) and nonlinear (item-response-theory-based differential item function or IRT-Based…
Reduction method with system analysis for multiobjective optimization-based design
NASA Technical Reports Server (NTRS)
Azarm, S.; Sobieszczanski-Sobieski, J.
1993-01-01
An approach for reducing the number of variables and constraints, combined with System Analysis Equations (SAE), for multiobjective optimization-based design is presented. In order to develop a simplified analysis model, the SAE are computed outside the optimization loop and then approximated for use by the optimizer. Two examples are presented to demonstrate the approach.
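The core idea, computing the expensive system analysis once outside the optimization loop and optimizing a cheap approximation of it, can be sketched as follows (the one-variable response function is hypothetical):

```python
# Sketch under simplifying assumptions: an expensive "system analysis" is
# sampled on a small design grid *outside* the optimization loop, replaced
# by a cheap polynomial approximation, and the optimizer then works on the
# approximation only.
import numpy as np
from scipy.optimize import minimize_scalar

def expensive_system_analysis(x):
    # Stand-in for coupled system analysis equations (hypothetical).
    return (x - 1.7) ** 2 + 3.0

# 1) Sample the expensive model once, up front.
xs = np.linspace(0.0, 3.0, 7)
ys = np.array([expensive_system_analysis(x) for x in xs])

# 2) Fit a cheap quadratic surrogate.
coeffs = np.polyfit(xs, ys, deg=2)
surrogate = np.poly1d(coeffs)

# 3) Optimize the surrogate instead of the expensive model.
res = minimize_scalar(surrogate, bounds=(0.0, 3.0), method="bounded")
print(f"surrogate minimizer: {res.x:.3f}")   # near the true optimum 1.7
```

Because the toy model is itself quadratic, the surrogate is exact here; in general the approximation would be refreshed as the design moves.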
Code of Federal Regulations, 2014 CFR
2014-07-01
... PM2.5 violations”) must be based on quantitative analysis using the applicable air quality models... either: (i) Quantitative methods that represent reasonable and common professional practice; or (ii) A...) The hot-spot demonstration required by § 93.116 must be based on quantitative analysis methods for the...
Beyond Time Out and Table Time: Today's Applied Behavior Analysis for Students with Autism
ERIC Educational Resources Information Center
Boutot, E. Amanda; Hume, Kara
2010-01-01
Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and are able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…
77 FR 66149 - Significant New Use Rules on Certain Chemical Substances
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-02
... ecological structural activity relationship (EcoSAR) analysis of test data on analogous esters, EPA predicts... milligram/cubic meter (mg/m\\3\\) as an 8-hour time-weighted average. In addition, based on EcoSAR analysis of... the PMN substance via the inhalation route. In addition, based on EcoSAR analysis of test data on...
Beyond Time out and Table Time: Today's Applied Behavior Analysis for Students with Autism
ERIC Educational Resources Information Center
Boutot, E. Amanda; Hume, Kara
2012-01-01
Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and are able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…
NASA Technical Reports Server (NTRS)
Grissom, D. S.; Schneider, W. C.
1971-01-01
The determination of a base line (minimum weight) design for the primary structure of the living quarters modules in an earth-orbiting space base was investigated. Although the design is preliminary in nature, the supporting analysis is sufficiently thorough to provide a reasonably accurate weight estimate of the major components that are considered to comprise the structural weight of the space base.
A Visual Analytics Approach for Station-Based Air Quality Data
Du, Yi; Ma, Cuixia; Wu, Chao; Xu, Xiaowei; Guo, Yike; Zhou, Yuanchun; Li, Jianhui
2016-01-01
With the deployment of multi-modality and large-scale sensor networks for monitoring air quality, we are now able to collect large and multi-dimensional spatio-temporal datasets. For these sensed data, we present a comprehensive visual analysis approach for air quality analysis. This approach integrates several visual methods, such as map-based views, calendar views, and trends views, to assist the analysis. Among those visual methods, map-based visual methods are used to display the locations of interest, and the calendar and the trends views are used to discover the linear and periodical patterns. The system also provides various interaction tools to combine the map-based visualization, trends view, calendar view and multi-dimensional view. In addition, we propose a self-adaptive calendar-based controller that can flexibly adapt the changes of data size and granularity in trends view. Such a visual analytics system would facilitate big-data analysis in real applications, especially for decision making support.
Metrological analysis of a virtual flowmeter-based transducer for cryogenic helium
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arpaia, P., E-mail: pasquale.arpaia@unina.it; Technology Department, European Organization for Nuclear Research; Girone, M., E-mail: mario.girone@cern.ch
2015-12-15
The metrological performance of a virtual flowmeter-based transducer for monitoring helium under cryogenic conditions is assessed. To this end, an uncertainty model of the transducer is presented, based mainly on a finite-element valve model and a virtual flowmeter model that uses the Sereg-Schlumberger method. The models are validated experimentally in a case study of helium monitoring in cryogenic systems at the European Organization for Nuclear Research (CERN). The impact of uncertainty sources on the transducer's metrological performance is assessed by a sensitivity analysis based on statistical experiment design and analysis of variance. In this way, the uncertainty sources most influencing the metrological performance of the transducer are singled out over the whole input range, at varying operating and setting conditions. This analysis is important for CERN cryogenics operation because it validates the metrological design of the transducer and identifies the components and working conditions with critical specifications for future improvements.
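The sensitivity-analysis step, a statistical experiment design followed by effect estimation, can be illustrated with a two-level full factorial over three invented uncertainty sources (the response function and factor names below are assumptions, not the paper's model):

```python
# Hedged sketch of design-of-experiments sensitivity analysis: a two-level
# full factorial over three hypothetical uncertainty sources, with main
# effects estimated from the factorial contrasts.
import itertools
import numpy as np

def transducer_error(valve_cd, pressure_bias, temp_bias):
    # Toy response: flow-measurement error vs. coded factors (+/-1).
    return 2.0 * valve_cd + 0.5 * pressure_bias + 0.1 * temp_bias

levels = [-1.0, 1.0]
design = np.array(list(itertools.product(levels, repeat=3)))  # 2^3 runs
response = np.array([transducer_error(*run) for run in design])

# Main effect of factor j = mean(response | +1) - mean(response | -1).
effects = {}
for j, name in enumerate(["valve_cd", "pressure_bias", "temp_bias"]):
    hi = response[design[:, j] > 0].mean()
    lo = response[design[:, j] < 0].mean()
    effects[name] = hi - lo

print(effects)  # the valve discharge coefficient dominates in this toy model
```

An ANOVA on the same factorial runs would then attach significance levels to these effects, which is the paper's route to singling out the dominant uncertainty sources.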
Success Probability Analysis for Shuttle Based Microgravity Experiments
NASA Technical Reports Server (NTRS)
Liou, Ying-Hsin Andrew
1996-01-01
Presented in this report are the results of data analysis of shuttle-based microgravity flight experiments. Potential factors were identified in the previous grant period, and in this period 26 factors were selected for data analysis. In this project, a degree-of-success rating was developed and used as the performance measure. 293 of the 391 experiments in the Lewis Research Center Microgravity Database were assigned degrees of success. Frequency analysis and analysis of variance were conducted to determine the significance of the factors that affect experiment success.
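The analysis-of-variance step can be sketched for a single hypothetical factor (the factor, group sizes, and success scores below are simulated, not taken from the database):

```python
# Illustrative one-way ANOVA of a "degree of success" measure across the
# levels of one candidate factor (synthetic data; the real study screened
# 26 factors from the Lewis Research Center Microgravity Database).
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(42)

# Hypothetical factor with three levels, e.g. crew-involvement level.
level_a = rng.normal(0.80, 0.10, 40)   # degrees of success
level_b = rng.normal(0.75, 0.10, 35)
level_c = rng.normal(0.55, 0.10, 30)   # clearly lower mean

stat, p_value = f_oneway(level_a, level_b, level_c)
print(f"F = {stat:.2f}, p = {p_value:.2e}")
significant = p_value < 0.05           # factor affects experiment success
```

Repeating this test per factor (with a multiple-comparison correction) mirrors the screening described in the abstract.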
Analysis of swimming performance: perceptions and practices of US-based swimming coaches.
Mooney, Robert; Corley, Gavin; Godfrey, Alan; Osborough, Conor; Newell, John; Quinlan, Leo Richard; ÓLaighin, Gearóid
2016-01-01
In elite swimming, a broad range of methods are used to assess performance, inform coaching practices and monitor athletic progression. The aim of this paper was to examine the performance analysis practices of swimming coaches and to explore the reasons behind the decisions that coaches take when analysing performance. Survey data were analysed from 298 Level 3 competitive swimming coaches (245 male, 53 female) based in the United States. Results were compiled to provide a generalised picture of practices and perceptions and to examine key emerging themes. It was found that a disparity exists between the importance swim coaches place on biomechanical analysis of swimming performance and the types of analyses that are actually conducted. Video-based methods are most frequently employed, with over 70% of coaches using these methods at least monthly, with analyses being mainly qualitative in nature rather than quantitative. Barriers to the more widespread use of quantitative biomechanical analysis in elite swimming environments were explored. Constraints include time, cost and availability of resources, but other factors such as sources of information on swimming performance and analysis and control over service provision are also discussed, with particular emphasis on video-based methods and emerging sensor-based technologies.
NASA Technical Reports Server (NTRS)
Goldman, A.; Murcray, F. J.; Rinsland, C. P.; Blatherwick, R. D.; Murcray, F. H.; Murcray, D. G.
1991-01-01
Recent results and ongoing studies of high resolution solar absorption spectra will be presented. The analysis of these spectra is aimed at the identification and quantification of trace constituents important in the atmospheric chemistry of the stratosphere and upper troposphere. Analysis of balloon-borne and ground-based spectra obtained at 0.0025/cm covering the 700-2200/cm interval will be presented. Results from ground-based 0.02/cm solar spectra from several locations, such as Denver, the South Pole, Mauna Loa, and New Zealand, will also be shown. The 0.0025/cm spectra show many new spectroscopic features. The analysis of these spectra, along with corresponding laboratory spectra, improves the spectral line parameters and thus the accuracy of trace constituent quantification. The combination of the recent balloon flights with earlier flight data since 1978 at 0.02/cm resolution provides trend analyses of several stratospheric trace species. Results for COF2, F22, SF6, and other species will be presented. Analysis of several ground-based solar spectra provides trends for HCl, HF, and other species. The retrieval methods used for total column density and altitude distribution for both ground-based and balloon-borne spectra will be presented. These are extended to the analysis of the ground-based spectra to be obtained by the high resolution interferometers of the Network for Detection of Stratospheric Change (NDSC). Progress of the University of Denver studies for the NDSC will be presented, including an intercomparison of solar spectra and trace gas retrievals obtained from simultaneous scans by the high resolution (0.0025/cm) interferometers of BRUKER and BOMEM.
Sports Stars: Analyzing the Performance of Astronomers at Visualization-based Discovery
NASA Astrophysics Data System (ADS)
Fluke, C. J.; Parrington, L.; Hegarty, S.; MacMahon, C.; Morgan, S.; Hassan, A. H.; Kilborn, V. A.
2017-05-01
In this data-rich era of astronomy, there is a growing reliance on automated techniques to discover new knowledge. The role of the astronomer may change from being a discoverer to being a confirmer. But what do astronomers actually look at when they distinguish between “sources” and “noise?” What are the differences between novice and expert astronomers when it comes to visual-based discovery? Can we identify elite talent or coach astronomers to maximize their potential for discovery? By looking to the field of sports performance analysis, we consider an established, domain-wide approach, where the expertise of the viewer (i.e., a member of the coaching team) plays a crucial role in identifying and determining the subtle features of gameplay that provide a winning advantage. As an initial case study, we investigate whether the SportsCode performance analysis software can be used to understand and document how an experienced H I astronomer makes discoveries in spectral data cubes. We find that the process of timeline-based coding can be applied to spectral cube data by mapping spectral channels to frames within a movie. SportsCode provides a range of easy-to-use methods for annotation, including feature-based codes and labels, text annotations associated with codes, and image-based drawing. The outputs, including instance movies that are uniquely associated with coded events, provide the basis for a training program or team-based analysis that could be used in unison with discipline-specific analysis software. In this coordinated approach to visualization and analysis, SportsCode can act as a visual notebook, recording insight and decisions in partnership with established analysis methods. Alternatively, in situ annotation and coding of features would be a valuable addition to existing and future visualization and analysis packages.
Biomedical image analysis and processing in clouds
NASA Astrophysics Data System (ADS)
Bednarz, Tomasz; Szul, Piotr; Arzhaeva, Yulia; Wang, Dadong; Burdett, Neil; Khassapov, Alex; Chen, Shiping; Vallotton, Pascal; Lagerstrom, Ryan; Gureyev, Tim; Taylor, John
2013-10-01
The Cloud-based Image Analysis and Processing Toolbox project runs on the Australian National eResearch Collaboration Tools and Resources (NeCTAR) cloud infrastructure and gives researchers access to biomedical image processing and analysis services via remotely accessible user interfaces. By providing user-friendly access to cloud computing resources and new workflow-based interfaces, our solution enables researchers to carry out various challenging image analysis and reconstruction tasks. Several case studies will be presented during the conference.
Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives
NASA Technical Reports Server (NTRS)
Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.
2016-01-01
A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
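The benefit of analytic derivatives can be illustrated on a toy problem (PyCycle itself propagates derivatives through the OpenMDAO framework; the function below is a standard optimization test case, not an engine model):

```python
# Sketch of why analytic derivatives help gradient-based optimization:
# scipy's BFGS converges on the Rosenbrock function using an exact,
# hand-coded gradient instead of finite-difference approximations.
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def rosenbrock_grad(x):
    # Analytic gradient, term-by-term derivative of the expression above.
    dfdx0 = -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2)
    dfdx1 = 200 * (x[1] - x[0] ** 2)
    return np.array([dfdx0, dfdx1])

res = minimize(rosenbrock, x0=[-1.2, 1.0], jac=rosenbrock_grad, method="BFGS")
print(res.x)   # converges to the known minimum at (1, 1)
```

Passing `jac=` avoids the extra function evaluations and truncation error of finite differencing, which is the same stability and efficiency argument the abstract makes for analytic derivatives in cycle models.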
Knowledge-based low-level image analysis for computer vision systems
NASA Technical Reports Server (NTRS)
Dhawan, Atam P.; Baxi, Himanshu; Ranganath, M. V.
1988-01-01
Two algorithms for entry-level image analysis and preliminary segmentation are proposed which are flexible enough to incorporate local properties of the image. The first algorithm involves pyramid-based multiresolution processing and a strategy to define and use interlevel and intralevel link strengths. The second algorithm, which is designed for selected window processing, extracts regions adaptively using local histograms. The preliminary segmentation and a set of features are employed as the input to an efficient rule-based low-level analysis system, resulting in suboptimal meaningful segmentation.
Causality Analysis of fMRI Data Based on the Directed Information Theory Framework.
Wang, Zhe; Alahmadi, Ahmed; Zhu, David C; Li, Tongtong
2016-05-01
This paper aims to conduct fMRI-based causality analysis in brain connectivity by exploiting the directed information (DI) theory framework. Unlike the well-known Granger causality (GC) analysis, which relies on the linear prediction technique, the DI theory framework does not have any modeling constraints on the sequences to be evaluated and ensures estimation convergence. Moreover, it can be used to generate the GC graphs. In this paper, first, we introduce the core concepts in the DI framework. Second, we present how to conduct causality analysis using DI measures between two time series. We provide the detailed procedure on how to calculate the DI for two finite time series. The two major steps involved here are optimal bin size selection for data digitization and probability estimation. Finally, we demonstrate the applicability of DI-based causality analysis using both simulated data and experimental fMRI data, and compare the results with those of the GC analysis. Our analysis indicates that GC analysis is effective in detecting linear or nearly linear causal relationships, but may have difficulty in capturing nonlinear causal relationships. On the other hand, DI-based causality analysis is more effective in capturing both linear and nonlinear causal relationships. Moreover, it is observed that brain connectivity among different regions generally involves dynamic two-way information transmissions between them. Our results show that when bidirectional information flow is present, DI is more effective than GC at quantifying the overall causal relationship.
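A first-order flavor of the DI computation can be sketched for binary time series: a plug-in estimate of the one-step conditional mutual information I(X_{t-1}; Y_t | Y_{t-1}), which is the per-step building block of directed information. The paper's optimal bin-size selection and convergence machinery are not reproduced; this is a simplified sketch:

```python
# Minimal plug-in estimate of I(X_{t-1}; Y_t | Y_{t-1}) for binary
# sequences -- a first-order proxy for the directed information rate
# DI(X -> Y), computed from empirical joint counts.
from collections import Counter
from math import log2

import numpy as np

def one_step_cmi(x, y):
    """Plug-in I(X_{t-1}; Y_t | Y_{t-1}) in bits from empirical counts."""
    triples = list(zip(x[:-1], y[1:], y[:-1]))   # (x_prev, y_now, y_prev)
    n = len(triples)
    p_xyz = Counter(triples)
    p_xz = Counter((a, c) for a, _, c in triples)
    p_yz = Counter((b, c) for _, b, c in triples)
    p_z = Counter(c for _, _, c in triples)
    cmi = 0.0
    for (a, b, c), cnt in p_xyz.items():
        cmi += (cnt / n) * log2(cnt * p_z[c] / (p_xz[(a, c)] * p_yz[(b, c)]))
    return cmi

rng = np.random.default_rng(7)
x = rng.integers(0, 2, 5000)
y_coupled = np.roll(x, 1)            # Y_t copies X_{t-1}: strong causal flow
y_indep = rng.integers(0, 2, 5000)   # no causal relation

di_coupled = one_step_cmi(x, y_coupled)
di_indep = one_step_cmi(x, y_indep)
print(f"coupled: {di_coupled:.3f} bits, independent: {di_indep:.4f} bits")
```

For the deterministic coupling the estimate approaches 1 bit per step, while for independent sequences it stays near zero (up to a small positive plug-in bias), mirroring the directional asymmetry DI is meant to capture.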
An Android malware detection system based on machine learning
NASA Astrophysics Data System (ADS)
Wen, Long; Yu, Haiyang
2017-08-01
The Android smartphone, with its open source character and excellent performance, has attracted many users. However, the convenience of the Android platform has also motivated the development of malware. Traditional signature-based detection is unable to identify unknown applications. This article proposes a machine learning-based lightweight system that is capable of identifying malware on Android devices. In this system we extract features based on static analysis and dynamic analysis; a new feature selection approach based on principal component analysis (PCA) and Relief is then presented to reduce the dimensionality of the features. A model is then constructed with a support vector machine (SVM) for classification. Experimental results show that our system provides an effective method for Android malware detection.
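The feature-reduction-plus-SVM stage can be sketched as follows (the binary "permission" features and class structure are simulated; the paper's actual feature set and the Relief step are not reproduced):

```python
# Hedged sketch of the detection pipeline: binary feature vectors (e.g.,
# requested permissions / API calls from static analysis), reduced with
# PCA and classified with an SVM. Features and labels are simulated.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(3)
n_apps, n_features = 400, 60   # arbitrary sizes

# Assumption: malware samples request a few "risky" features more often.
benign = (rng.random((n_apps // 2, n_features)) < 0.2).astype(float)
malware = (rng.random((n_apps // 2, n_features)) < 0.2).astype(float)
malware[:, :5] = (rng.random((n_apps // 2, 5)) < 0.8).astype(float)

X = np.vstack([benign, malware])
y = np.array([0] * (n_apps // 2) + [1] * (n_apps // 2))
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

clf = make_pipeline(PCA(n_components=10), SVC(kernel="rbf")).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"held-out accuracy: {acc:.2f}")
```

The PCA step plays the dimensionality-reduction role the abstract assigns to PCA-plus-Relief; a production system would also weigh false-positive cost, not just accuracy.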
Using Android-Based Educational Game for Learning Colloid Material
NASA Astrophysics Data System (ADS)
Sari, S.; Anjani, R.; Farida, I.; Ramdhani, M. A.
2017-09-01
This research is based on the importance of developing students' chemical literacy on Colloid material using Android-based educational game media. The educational game products are developed through a research-and-development design. In the analysis phase, material analysis is performed to generate concept maps, determine chemical literacy indicators, define game strategies, and set game paths. In the design phase, the product is packaged, and then validation and feasibility tests are performed. The research produces an Android-based educational game with the following characteristics: the Colloid material is presented across 12 game levels in the form of questions and challenges, with contextual visualizations of discourse, images, and animation to develop thinking processes and attitudes. Based on the analysis of validation and trial results, the product is considered feasible for use.
John, Majnu; Lencz, Todd; Malhotra, Anil K; Correll, Christoph U; Zhang, Jian-Ping
2018-06-01
Meta-analysis of genetic association studies is being increasingly used to assess phenotypic differences between genotype groups. When the underlying genetic model is assumed to be dominant or recessive, assessing the phenotype differences based on summary statistics, reported for individual studies in a meta-analysis, is a valid strategy. However, when the genetic model is additive, a similar strategy based on summary statistics will lead to biased results. This fact about the additive model is one of the things that we establish in this paper, using simulations. The main goal of this paper is to present an alternate strategy for the additive model based on simulating data for the individual studies. We show that the alternate strategy is far superior to the strategy based on summary statistics.
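The proposed simulation strategy can be sketched as follows: generate individual-level data for each study under an additive genetic model, estimate each study's per-allele effect, and pool the estimates by inverse-variance weighting (all parameters below are invented for illustration):

```python
# Sketch of the simulation strategy: per-study genotype/phenotype data
# are generated under an additive model (genotype coded 0/1/2), each
# study's per-allele effect is estimated by linear regression, and the
# estimates are pooled by inverse-variance weighting.
import numpy as np

rng = np.random.default_rng(11)
true_beta, allele_freq = 0.30, 0.25   # per-allele effect on the phenotype

estimates, variances = [], []
for n in [300, 500, 800, 400, 600]:   # five simulated studies
    g = rng.binomial(2, allele_freq, n).astype(float)   # additive coding
    pheno = true_beta * g + rng.standard_normal(n)
    # Simple least-squares slope and its sampling variance.
    gc = g - g.mean()
    beta_hat = (gc @ pheno) / (gc @ gc)
    resid = pheno - pheno.mean() - beta_hat * gc
    var_hat = (resid @ resid) / (n - 2) / (gc @ gc)
    estimates.append(beta_hat)
    variances.append(var_hat)

w = 1.0 / np.array(variances)
est = np.array(estimates)
pooled = float(np.sum(w * est) / np.sum(w))
pooled_se = float(np.sqrt(1.0 / np.sum(w)))
print(f"pooled per-allele effect: {pooled:.3f} (SE {pooled_se:.3f})")
```

Because the regression is run on simulated individual-level data rather than on per-genotype-group summary statistics, the additive coding is handled directly, which is the bias the paper's alternate strategy is designed to avoid.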
Fire flame detection based on GICA and target tracking
NASA Astrophysics Data System (ADS)
Rong, Jianzhong; Zhou, Dechuang; Yao, Wei; Gao, Wei; Chen, Juan; Wang, Jian
2013-04-01
To improve the video fire detection rate, a robust fire detection algorithm based on the color, motion, and pattern characteristics of fire targets was proposed, which achieved a satisfactory detection rate across different fire scenes. In this fire detection algorithm: (a) a rule-based generic color model was developed based on analysis of a large quantity of flame pixels; (b) starting from the traditional GICA (Geometrical Independent Component Analysis) model, a Cumulative Geometrical Independent Component Analysis (C-GICA) model was developed for motion detection without a static background; and (c) a BP neural network fire recognition model based on multiple features of the fire pattern was developed. Fire detection tests on benchmark fire video clips of different scenes have shown the robustness, accuracy, and fast response of the algorithm.
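Component (a), a rule-based color model, can be illustrated with widely used RGB heuristics (the paper's actual generic color model is not given in the abstract; the rules and threshold below are a stand-in):

```python
# Illustrative rule-based color test for candidate flame pixels. Common
# RGB heuristics (R above a threshold and R >= G > B) stand in for the
# paper's learned generic color model.
import numpy as np

def flame_color_mask(rgb, r_thresh=190):
    """rgb: (H, W, 3) uint8 image; returns boolean mask of flame-like pixels."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > r_thresh) & (r >= g) & (g > b)

# Tiny synthetic frame: one orange "flame" pixel, one gray, one blue.
frame = np.array([[[255, 160, 40],    # orange: passes all rules
                   [120, 120, 120],   # gray: fails the R threshold
                   [30, 60, 220]]],   # blue: fails the ordering rules
                 dtype=np.uint8)
mask = flame_color_mask(frame)
print(mask)   # [[ True False False]]
```

In the full algorithm, pixels passing the color rules would then be filtered by the C-GICA motion test and the BP-network pattern classifier.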
Terluin, Berend; Brouwers, Evelien P M; Marchand, Miquelle A G; de Vet, Henrica C W
2018-05-01
Many paper-and-pencil (P&P) questionnaires have been migrated to electronic platforms. Differential item and test functioning (DIF and DTF) analysis constitutes a superior research design to assess measurement equivalence across modes of administration. The purpose of this study was to demonstrate an item response theory (IRT)-based DIF and DTF analysis to assess the measurement equivalence of a Web-based version and the original P&P format of the Four-Dimensional Symptom Questionnaire (4DSQ), measuring distress, depression, anxiety, and somatization. The P&P group (n = 2031) and the Web group (n = 958) consisted of primary care psychology clients. Unidimensionality and local independence of the 4DSQ scales were examined using IRT and Yen's Q3. Bifactor modeling was used to assess the scales' essential unidimensionality. Measurement equivalence was assessed using IRT-based DIF analysis using a 3-stage approach: linking on the latent mean and variance, selection of anchor items, and DIF testing using the Wald test. DTF was evaluated by comparing expected scale scores as a function of the latent trait. The 4DSQ scales proved to be essentially unidimensional in both modalities. Five items, belonging to the distress and somatization scales, displayed small amounts of DIF. DTF analysis revealed that the impact of DIF on the scale level was negligible. IRT-based DIF and DTF analysis is demonstrated as a way to assess the equivalence of Web-based and P&P questionnaire modalities. Data obtained with the Web-based 4DSQ are equivalent to data obtained with the P&P version.
Pathway analysis with next-generation sequencing data.
Zhao, Jinying; Zhu, Yun; Boerwinkle, Eric; Xiong, Momiao
2015-04-01
Although pathway analysis methods have been developed and successfully applied to association studies of common variants, the statistical methods for pathway-based association analysis of rare variants have not been well developed. Many investigators observed highly inflated false-positive rates and low power in pathway-based tests of association of rare variants. The inflated false-positive rates and low true-positive rates of the current methods are mainly due to their lack of ability to account for gametic phase disequilibrium. To overcome these serious limitations, we develop a novel statistic that is based on the smoothed functional principal component analysis (SFPCA) for pathway association tests with next-generation sequencing data. The developed statistic has the ability to capture position-level variant information and account for gametic phase disequilibrium. By intensive simulations, we demonstrate that the SFPCA-based statistic for testing pathway association with either rare or common or both rare and common variants has the correct type 1 error rates. Also the power of the SFPCA-based statistic and 22 additional existing statistics are evaluated. We found that the SFPCA-based statistic has a much higher power than other existing statistics in all the scenarios considered. To further evaluate its performance, the SFPCA-based statistic is applied to pathway analysis of exome sequencing data in the early-onset myocardial infarction (EOMI) project. We identify three pathways significantly associated with EOMI after the Bonferroni correction. In addition, our preliminary results show that the SFPCA-based statistic has much smaller P-values to identify pathway association than other existing methods.
UK Parents' Beliefs about Applied Behaviour Analysis as an Approach to Autism Education
ERIC Educational Resources Information Center
Denne, Louise D.; Hastings, Richard P.; Hughes, J. Carl
2017-01-01
Research into factors underlying the dissemination of evidence-based practice is limited within the field of Applied Behaviour Analysis (ABA). This is pertinent, particularly in the UK where national policies and guidelines do not reflect the emerging ABA evidence base, or policies and practices elsewhere. Theories of evidence-based practice in…
ERIC Educational Resources Information Center
Dimitrov, Dimiter M.
2017-01-01
This article offers an approach to examining differential item functioning (DIF) under its item response theory (IRT) treatment in the framework of confirmatory factor analysis (CFA). The approach is based on integrating IRT- and CFA-based testing of DIF and using bias-corrected bootstrap confidence intervals with a syntax code in Mplus.
Effect of Internet-Based Learning in Public Health Training: An Exploratory Meta-Analysis
ERIC Educational Resources Information Center
Peng, Ying; Yan, Weirong
2017-01-01
Internet-based learning is increasingly applied in medical education, but its effect in the field of public health training is still unclear. This meta-analysis was undertaken to explore the impact of Internet-based learning on students'/professionals' knowledge of public health compared with no intervention and with traditional face-to-face (FTF)…
ERIC Educational Resources Information Center
Mate, Karen; Sim, Alistair; Weidenhofer, Judith; Milward, Liz; Scott, Judith
2013-01-01
A blended approach encompassing problem-based learning (PBL) and structured inquiry was used in this laboratory exercise based on the congenital disease Osteogenesis imperfecta (OI), to introduce commonly used techniques in biomolecular analysis within a clinical context. During a series of PBL sessions students were presented with several…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-09
... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration 21 CFR Parts 1, 16, 106, 110... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of...-Based Preventive Controls for Human Food,'' that appeared in the Federal Register of January 16, 2013...
ERIC Educational Resources Information Center
Granner, Michelle L.; Sharpe, Patricia A.; Burroughs, Ericka L.; Fields, Regina; Hallenbeck, Joyce
2010-01-01
This study conducted a newspaper content analysis as part of an evaluation of a community-based participatory research project focused on increasing physical activity through policy and environmental changes, which included activities related to media advocacy and media-based community education. Daily papers (May 2003 to December 2005) from both…
A Constraint-Based Approach to Acquisition of Word-Final Consonant Clusters in Turkish Children
ERIC Educational Resources Information Center
Gokgoz-Kurt, Burcu
2017-01-01
The current study provides a constraint-based analysis of L1 word-final consonant cluster acquisition in Turkish child language, based on the data originally presented by Topbas and Kopkalli-Yavuz (2008). The present analysis was done using [?]+obstruent consonant cluster acquisition. A comparison of Gradual Learning Algorithm (GLA) under…
Web-Based Tools for Modelling and Analysis of Multivariate Data: California Ozone Pollution Activity
ERIC Educational Resources Information Center
Dinov, Ivo D.; Christou, Nicolas
2011-01-01
This article presents a hands-on web-based activity motivated by the relation between human health and ozone pollution in California. This case study is based on multivariate data collected monthly at 20 locations in California between 1980 and 2006. Several strategies and tools for data interrogation and exploratory data analysis, model fitting…
Latent Semantic Analysis as a Method of Content-Based Image Retrieval in Medical Applications
ERIC Educational Resources Information Center
Makovoz, Gennadiy
2010-01-01
The research investigated whether a Latent Semantic Analysis (LSA)-based approach to image retrieval can map pixel intensity into a smaller concept space with good accuracy and reasonable computational cost. From a large set of M computed tomography (CT) images, a retrieval query found all images for a particular patient based on semantic…
ERIC Educational Resources Information Center
Williams, Suzanne Ellen; Greene, Leon; Satinsky, Sonya; Neuberger, John
2016-01-01
Purpose: The purposes of this study were to explore PE in higher education through the offering of traditional activity- and skills-based physical education (ASPE) and conceptually-based physical education (CPE) courses, and to conduct an exploratory content analysis on the CPE available to students in randomized colleges and universities in the…
Groen-Blokhuis, Maria M.; Pourcain, Beate St.; Greven, Corina U.; Pappa, Irene; Tiesler, Carla M.T.; Ang, Wei; Nolte, Ilja M.; Vilor-Tejedor, Natalia; Bacelis, Jonas; Ebejer, Jane L.; Zhao, Huiying; Davies, Gareth E.; Ehli, Erik A.; Evans, David M.; Fedko, Iryna O.; Guxens, Mònica; Hottenga, Jouke-Jan; Hudziak, James J.; Jugessur, Astanand; Kemp, John P.; Krapohl, Eva; Martin, Nicholas G.; Murcia, Mario; Myhre, Ronny; Ormel, Johan; Ring, Susan M.; Standl, Marie; Stergiakouli, Evie; Stoltenberg, Camilla; Thiering, Elisabeth; Timpson, Nicholas J.; Trzaskowski, Maciej; van der Most, Peter J.; Wang, Carol; Nyholt, Dale R.; Medland, Sarah E.; Neale, Benjamin; Jacobsson, Bo; Sunyer, Jordi; Hartman, Catharina A.; Whitehouse, Andrew J.O.; Pennell, Craig E.; Heinrich, Joachim; Plomin, Robert; Smith, George Davey; Tiemeier, Henning; Posthuma, Danielle; Boomsma, Dorret I.
2016-01-01
Objective To elucidate the influence of common genetic variants on childhood attention-deficit/hyperactivity disorder (ADHD) symptoms, to identify genetic variants that explain its high heritability, and to investigate the genetic overlap of ADHD symptom scores with ADHD diagnosis. Method Within the EArly Genetics and Lifecourse Epidemiology (EAGLE) consortium, genome-wide single nucleotide polymorphisms (SNPs) and ADHD symptom scores were available for 17,666 children (< 13 years) from nine population-based cohorts. SNP-based heritability was estimated in data from the three largest cohorts. Meta-analysis based on genome-wide association (GWA) analyses with SNPs was followed by gene-based association tests, and the overlap in results with a meta-analysis in the Psychiatric Genomics Consortium (PGC) case-control ADHD study was investigated. Results SNP-based heritability ranged from 5% to 34%, indicating that variation in common genetic variants influences ADHD symptom scores. The meta-analysis did not detect genome-wide significant SNPs, but three genes, lying close to each other with SNPs in high linkage disequilibrium (LD), showed a gene-wide significant association (p values between 1.46×10^-6 and 2.66×10^-6). One gene, WASL, is involved in neuronal development. Both SNP- and gene-based analyses indicated overlap with the PGC meta-analysis results, with the genetic correlation estimated at 0.96. Conclusion The SNP-based heritability for ADHD symptom scores indicates a polygenic architecture, and genes involved in neurite outgrowth are possibly involved. Continuous and dichotomous measures of ADHD appear to assess a genetically common phenotype. A next step is to combine data from population-based and case-control cohorts in genetic association studies to increase sample size and improve statistical power for identifying genetic variants.
Haystack, a web-based tool for metabolomics research.
Grace, Stephen C; Embry, Stephen; Luo, Heng
2014-01-01
Background: Liquid chromatography coupled to mass spectrometry (LCMS) has become a widely used technique in metabolomics research for differential profiling, the broad screening of biomolecular constituents across multiple samples to diagnose phenotypic differences and elucidate relevant features. However, a significant limitation in LCMS-based metabolomics is the high-throughput data processing required for robust statistical analysis and data modeling for large numbers of samples with hundreds of unique chemical species. Results: To address this problem, we developed Haystack, a web-based tool designed to visualize, parse, filter, and extract significant features from LCMS datasets rapidly and efficiently. Haystack runs in a browser environment with an intuitive graphical user interface that provides both display and data processing options. Total ion chromatograms (TICs) and base peak chromatograms (BPCs) are automatically displayed, along with time-resolved mass spectra and extracted ion chromatograms (EICs) over any mass range. Output files in the common .csv format can be saved for further statistical analysis or customized graphing. Haystack's core function is a flexible binning procedure that converts the mass dimension of the chromatogram into a set of interval variables that can uniquely identify a sample. Binned mass data can be analyzed by exploratory methods such as principal component analysis (PCA) to model class assignment and identify discriminatory features. The validity of this approach is demonstrated by comparison of a dataset from plants grown at two light conditions with manual and automated peak detection methods. Haystack successfully predicted class assignment based on PCA and cluster analysis, and identified discriminatory features based on analysis of EICs of significant bins. Conclusion: Haystack, a new online tool for rapid processing and analysis of LCMS-based metabolomics data is described. 
It offers users a range of data visualization options and supports non-biased differential profiling studies through a unique and flexible binning function that provides an alternative to conventional peak deconvolution analysis methods. PMID:25350247
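Haystack's core binning idea, converting the mass dimension into fixed-width interval variables, can be sketched in a few lines. This is an illustrative reconstruction, not Haystack's actual code; the bin width, mass range, and sample peaks are invented for the example:

```python
# Illustrative sketch of m/z binning: collapse (mz, intensity) pairs from a
# chromatogram scan into fixed-width interval variables, as in differential
# profiling. Bin width and mass range are arbitrary choices for this example.
def bin_spectrum(peaks, mz_min=100.0, mz_max=1000.0, width=1.0):
    """Sum intensities of (mz, intensity) pairs into fixed-width bins."""
    n_bins = int((mz_max - mz_min) / width)
    bins = [0.0] * n_bins
    for mz, intensity in peaks:
        if mz_min <= mz < mz_max:
            bins[int((mz - mz_min) / width)] += intensity
    return bins

scan = [(150.2, 40.0), (150.7, 10.0), (420.1, 5.0)]
binned = bin_spectrum(scan)
# peaks at 150.2 and 150.7 fall into the same 1-Da bin
```

Each sample's bin vector can then feed PCA or cluster analysis, as the abstract describes.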
Safety and efficacy of cell-based therapy on critical limb ischemia: A meta-analysis.
Ai, Min; Yan, Chang-Fu; Xia, Fu-Chun; Zhou, Shuang-Lu; He, Jian; Li, Cui-Ping
2016-06-01
Critical limb ischemia (CLI) is a major health problem worldwide, affecting approximately 500-1000 people per million per annum. Cell-based therapy has given new hope for the treatment of limb ischemia. This study assessed the safety and efficacy of cell-based therapy for CLI treatment. We searched the PubMed, Embase and Cochrane databases through October 20, 2015, and selected controlled trials of cell-based therapy for CLI treatment compared with cell-free treatment. We assessed the results by meta-analysis using a variety of outcome measures, as well as the association of mononuclear cell dosage with treatment effect by dose-response meta-analysis. Twenty-five trials were included. For the primary evaluation index, cell-based therapy significantly reduced the rate of major amputation (odds ratio [OR] 0.44, 95% confidence interval [CI] 0.32-0.60, P = 0.000) and significantly increased the rate of amputation-free survival (OR 2.80, 95% CI 1.70-4.61, P = 0.000). Trial sequential analysis indicated that an optimal sample size (n = 3374) is needed to detect a plausible treatment effect on all-cause mortality. Cell-based therapy significantly improves the ankle-brachial index, increases the rate of ulcer healing, increases the transcutaneous pressure of oxygen, reduces limb pain and improves movement ability. Subgroup analysis indicated that heterogeneity was caused by type of control, design bias and transplant route. In the dose-response analysis, there was no significant correlation between cell dosage and therapeutic effect. Cell-based therapy has a significant therapeutic effect on CLI, but randomized double-blind placebo-controlled trials are needed to improve the credibility of this conclusion. Assessment of all-cause mortality also requires a larger sample size to arrive at a strong conclusion. In the dose-response analysis, increasing the dosage of cell injections did not significantly improve the therapeutic effects of cell-based therapy. 
Copyright © 2016 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.
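The summary odds ratios quoted above come from standard meta-analytic pooling. As a hedged illustration (not the authors' actual computation; the 2x2 counts below are invented, not trial data), a fixed-effect inverse-variance pooled OR can be computed like this:

```python
import math

# Sketch of fixed-effect (inverse-variance) pooling of odds ratios, the kind
# of calculation behind a summary OR such as 0.44 (95% CI 0.32-0.60).
# The 2x2 counts are invented for illustration only.
def pooled_or(tables):
    """tables: list of (a, b, c, d) = events/non-events in treated/control."""
    num = den = 0.0
    for a, b, c, d in tables:
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d   # variance of the log OR
        w = 1 / var                           # inverse-variance weight
        num += w * log_or
        den += w
    pooled = math.exp(num / den)
    se = math.sqrt(1 / den)
    ci = (pooled * math.exp(-1.96 * se), pooled * math.exp(1.96 * se))
    return pooled, ci

or_hat, (lo, hi) = pooled_or([(5, 45, 12, 38), (3, 47, 9, 41)])
# or_hat < 1 here: fewer events (e.g. amputations) in the treated arms
```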
Bismuth-based electrochemical stripping analysis
Wang, Joseph
2004-01-27
Method and apparatus for trace metal detection and analysis using bismuth-coated electrodes and electrochemical stripping analysis. Both anodic stripping voltammetry and adsorptive stripping analysis may be employed.
Inquiry-based Laboratory Activities on Drugs Analysis for High School Chemistry Learning
NASA Astrophysics Data System (ADS)
Rahmawati, I.; Sholichin, H.; Arifin, M.
2017-09-01
Laboratory activity is an important part of chemistry learning, but cookbook-style instructions are still commonly used. Activities run that way do not improve students' thinking skills, especially their creativity. This study aims to improve high school students' creativity through an inquiry-based laboratory activity on drug analysis. Acid-base titration with a color-changing indicator was used as the method for drug analysis. The following tools were used to assess the activity's achievement: a creative thinking test on acid-base titration, creative attitude and action observation sheets, a questionnaire on inquiry-based lab activities, and interviews. The results showed that the inquiry-based laboratory activity improved students' creative thinking, creative attitudes and creative actions. The students reacted positively to this teaching strategy, as demonstrated by questionnaire responses and interviews. This result is expected to help teachers overcome the shortcomings of other laboratory learning approaches.
NASA Technical Reports Server (NTRS)
Love, Eugene S.
1957-01-01
An analysis has been made of available experimental data to show the effects of most of the variables that are more predominant in determining base pressure at supersonic speeds. The analysis covers base pressures for two-dimensional airfoils and for bodies of revolution with and without stabilizing fins and is restricted to turbulent boundary layers. The present status of available experimental information is summarized as are the existing methods for predicting base pressure. A simple semiempirical method is presented for estimating base pressure. For two-dimensional bases, this method stems from an analogy established between the base-pressure phenomena and the peak pressure rise associated with the separation of the boundary layer. An analysis made for axially symmetric flow indicates that the base pressure for bodies of revolution is subject to the same analogy. Based upon the methods presented, estimations are made of such effects as Mach number, angle of attack, boattailing, fineness ratio, and fins. These estimations give fair predictions of experimental results. (author)
[Construction and application of special analysis database of geoherbs based on 3S technology].
Guo, Lan-ping; Huang, Lu-qi; Lv, Dong-mei; Shao, Ai-juan; Wang, Jian
2007-09-01
In this paper, the structure, data sources and data codes of "the spatial analysis database of geoherbs" based on 3S technology are introduced, and the essential functions of the database, such as data management, remote sensing, spatial interpolation, spatial statistics, spatial analysis and development, are described. Finally, two examples of database usage are given: one is the classification and calculation of the NDVI index of remote sensing images in the geoherbal area of Atractylodes lancea; the other is an adaptation analysis of A. lancea. These indicate that "the spatial analysis database of geoherbs" has bright prospects for the spatial analysis of geoherbs.
Analyses of S-Box in Image Encryption Applications Based on Fuzzy Decision Making Criterion
NASA Astrophysics Data System (ADS)
Rehman, Inayatur; Shah, Tariq; Hussain, Iqtadar
2014-06-01
In this manuscript, we put forward a standard based on a fuzzy decision-making criterion to examine current substitution boxes and study their strengths and weaknesses in order to decide their appropriateness for image encryption applications. The proposed standard utilizes the results of correlation analysis, entropy analysis, contrast analysis, homogeneity analysis, energy analysis, and mean of absolute deviation analysis. These analyses are applied to well-known substitution boxes. The outcomes of these analyses are then examined further, and a fuzzy soft set decision-making criterion is used to decide the suitability of an S-box for image encryption applications.
NASA Astrophysics Data System (ADS)
Hsiao, Y. R.; Tsai, C.
2017-12-01
As the WHO Air Quality Guideline indicates, ambient air pollution exposes world populations to the threat of fatal illnesses (e.g. heart disease, lung cancer, asthma), raising concerns about air pollution sources and related factors. This study presents a novel approach to investigating the multiscale variations of PM2.5 in southern Taiwan over the past decade, together with four meteorological influencing factors (temperature, relative humidity, precipitation and wind speed), based on the Noise-Assisted Multivariate Empirical Mode Decomposition (NAMEMD) algorithm, Hilbert Spectral Analysis (HSA) and the Time-Dependent Intrinsic Correlation (TDIC) method. The NAMEMD algorithm is a fully data-driven approach designed for nonlinear and nonstationary multivariate signals, and is used to decompose multivariate signals into a collection of channels of Intrinsic Mode Functions (IMFs). The TDIC method is an EMD-based method that uses a set of sliding window sizes to quantify localized correlation coefficients for multiscale signals. With the alignment property and quasi-dyadic filter bank of the NAMEMD algorithm, one is able to produce the same number of IMFs for all variables and estimate the cross-correlation more accurately. The performance of the spectral representation of the NAMEMD-HSA method is compared with Complementary Ensemble Empirical Mode Decomposition/Hilbert Spectral Analysis (CEEMD-HSA) and wavelet analysis. The nature of NAMEMD-based TDIC analysis is then compared with CEEMD-based TDIC analysis and traditional correlation analysis.
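The sliding-window idea behind time-dependent intrinsic correlation can be illustrated with a minimal sketch. This is a simplification of TDIC, which operates on matched IMF pairs produced by NAMEMD and adapts window sizes to the local instantaneous period; here plain lists and a fixed window stand in for both:

```python
import math

# Minimal sketch of localized (sliding-window) Pearson correlation, the core
# idea behind TDIC. Real TDIC uses IMF pairs and period-adaptive windows;
# the fixed window and toy series here are assumptions for illustration.
def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def sliding_correlation(x, y, window):
    """Local correlation at each position where the window fits."""
    return [pearson(x[i:i + window], y[i:i + window])
            for i in range(len(x) - window + 1)]

x = [1, 2, 3, 4, 3, 2, 1, 2, 3, 4]
y = [2, 4, 6, 8, 6, 4, 2, 4, 6, 8]   # exactly 2*x, perfectly correlated
local_r = sliding_correlation(x, y, window=4)
```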
ERIC Educational Resources Information Center
Kucan, Linda; Palincsar, Annemarie Sullivan
2018-01-01
This investigation focuses on a tool used in a reading methods course to introduce reading specialist candidates to text analysis as a critical component of planning for text-based discussions. Unlike planning that focuses mainly on important text content or information, a text analysis approach focuses both on content and how that content is…
Analysis of Biaxial Stress Fields in Plates Cracking at Elevated Temperatures
1989-10-19
…because of limitations in the commercial thermographic equipment used. Based on the enhanced theory, it is predicted that the minimum resolvable stress amplitude using thermographic stress analysis will be approximately independent of temperature, provided relevant thermal and mechanical material…
Applying Model Analysis to a Resource-Based Analysis of the Force and Motion Conceptual Evaluation
ERIC Educational Resources Information Center
Smith, Trevor I.; Wittmann, Michael C.; Carter, Tom
2014-01-01
Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information…
Mitochondrial DNA diagnosis for taeniasis and cysticercosis.
Yamasaki, Hiroshi; Nakao, Minoru; Sako, Yasuhito; Nakaya, Kazuhiro; Sato, Marcello Otake; Ito, Akira
2006-01-01
Molecular diagnosis for taeniasis and cysticercosis in humans on the basis of mitochondrial DNA analysis was reviewed. Development and application of three different methods, including restriction fragment length polymorphism analysis, base excision sequence scanning thymine-base analysis and multiplex PCR, were described. Moreover, molecular diagnosis of cysticerci found in specimens submitted for histopathology and the molecular detection of taeniasis using copro-DNA were discussed.
Knowledge-Based Information Management for Watershed Analysis in the Pacific Northwest U.S.
Keith Reynolds; Richard Olson; Michael Saunders; Donald Latham; Michael Foster; Bruce Miller; Lawrence Bednar; Daniel Schmoldt; Patrick Cunningham; John Steffenson
1996-01-01
We are developing a knowledge-based information management system to provide decision support for watershed analysis in the Pacific Northwest region of the U.S. The system includes: (1) a GIS interface that allows users to graphically navigate to specific provinces and watersheds and display a variety of themes and other area-specific information, (2) an analysis...
A Corpus-Based Discourse Analysis of the Vision and Mission Statements of Universities in Turkey
ERIC Educational Resources Information Center
Efe, Ibrahim; Ozer, Omer
2015-01-01
This article presents findings from a corpus-assisted discourse analysis of mission and vision statements of 105 state and 66 private/foundation universities in Turkey. The paper combines a corpus-based approach with critical discourse analysis to interpret the data in relation to its institutional as well as socio-political context. It argues…
ERIC Educational Resources Information Center
Velastegui, Pamela J.
2013-01-01
This hypothesis-generating case study investigates the naturally emerging roles of technology brokers and technology leaders in three independent schools in New York involving 92 school educators. A multiple and mixed method design utilizing Social Network Analysis (SNA) and fuzzy set Qualitative Comparative Analysis (FSQCA) involved gathering…
Formal Foundations for the Specification of Software Architecture.
1995-03-01
"Architectures Formally: A Case-Study Using KWIC." Kestrel Institute, Palo Alto, CA 94304, April 1994. 58. Kang, Kyo C. Feature-Oriented Domain Analysis (FODA)… 6.3.5 Constraint-Based Architectures; 6.4 Summary; VII. Analysis of Process-Based… …between these architecture theories were investigated. A feasibility analysis on an image processing application demonstrated that architecture theories…
Bruce G. Marcot; Peter H. Singleton; Nathan H. Schumaker
2015-01-01
Sensitivity analyses (determining how prediction variables affect response variables) of individual-based models (IBMs) are few but important to the interpretation of model output. We present a sensitivity analysis of a spatially explicit IBM (HexSim) of a threatened species, the Northern Spotted Owl (NSO; Strix occidentalis caurina), in Washington...
ERIC Educational Resources Information Center
Oplatka, Izhar
2010-01-01
Purpose: Based on a qualitative content analysis of 57 curricula vitae of authors who published their work in the major journals of the educational administration (EA) field, this paper seeks to display the career of EA authors and to suggest some epistemological implications for the field. Design/methodology/approach: The analysis is based on…
NEAT: an efficient network enrichment analysis test.
Signorelli, Mirko; Vinciotti, Veronica; Wit, Ernst C
2016-09-05
Network enrichment analysis is a powerful method that integrates gene enrichment analysis with the information on relationships between genes provided by gene networks. Existing tests for network enrichment analysis deal only with undirected networks, can be computationally slow and are based on normality assumptions. We propose NEAT, a test for network enrichment analysis. The test is based on the hypergeometric distribution, which naturally arises as the null distribution in this context. NEAT can be applied not only to undirected, but also to directed and partially directed networks. Our simulations indicate that NEAT is considerably faster than alternative resampling-based methods, and that its capacity to detect enrichments is at least as good as that of alternative tests. We discuss applications of NEAT to network analyses in yeast by testing for enrichment of the Environmental Stress Response target gene set with GO Slim and KEGG functional gene sets, and also by inspecting associations between functional sets themselves. NEAT is a flexible and efficient test for network enrichment analysis that aims to overcome some limitations of existing resampling-based tests. The method is implemented in the R package neat, which can be freely downloaded from CRAN ( https://cran.r-project.org/package=neat ).
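The hypergeometric reasoning behind NEAT can be sketched as follows. This is only the shape of the argument, not the neat package's exact parameterization, and the counts are invented: out of all arc endpoints attached to a gene set A, how surprising is it that a given number land in functional set B?

```python
from math import comb

# Simplified sketch of a hypergeometric enrichment test in the spirit of NEAT.
# w: total arc endpoints in the network; d_a: endpoints attached to set A
# (the draws); d_b: endpoints attached to set B (the "successes" in the urn).
def hypergeom_upper_tail(n_ab, d_a, d_b, w):
    """P(X >= n_ab) for X ~ Hypergeometric(w, d_b, d_a)."""
    total = comb(w, d_a)
    return sum(comb(d_b, k) * comb(w - d_b, d_a - k)
               for k in range(n_ab, min(d_a, d_b) + 1)) / total

p = hypergeom_upper_tail(n_ab=8, d_a=10, d_b=20, w=100)
# a small p suggests more A-to-B links than expected by chance
```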
An Evidential Reasoning-Based CREAM to Human Reliability Analysis in Maritime Accident Process.
Wu, Bing; Yan, Xinping; Wang, Yang; Soares, C Guedes
2017-10-01
This article proposes a modified cognitive reliability and error analysis method (CREAM) for estimating the human error probability in the maritime accident process on the basis of an evidential reasoning approach. This modified CREAM is developed to precisely quantify the linguistic variables of the common performance conditions and to overcome the problem of ignoring the uncertainty caused by incomplete information in the existing CREAM models. Moreover, this article views maritime accident development from the sequential perspective, where a scenario- and barrier-based framework is proposed to describe the maritime accident process. This evidential reasoning-based CREAM approach together with the proposed accident development framework are applied to human reliability analysis of a ship capsizing accident. It will facilitate subjective human reliability analysis in different engineering systems where uncertainty exists in practice. © 2017 Society for Risk Analysis.
Joint source based analysis of multiple brain structures in studying major depressive disorder
NASA Astrophysics Data System (ADS)
Ramezani, Mahdi; Rasoulian, Abtin; Hollenstein, Tom; Harkness, Kate; Johnsrude, Ingrid; Abolmaesumi, Purang
2014-03-01
We propose a joint Source-Based Analysis (jSBA) framework to identify brain structural variations in patients with Major Depressive Disorder (MDD). In this framework, features representing position, orientation and size (i.e. pose), shape, and local tissue composition are extracted. Subsequently, simultaneous analysis of these features within a joint analysis method is performed to generate the basis sources that show significant differences between subjects with MDD and those in healthy control. Moreover, in a cross-validation leave-one-out experiment, we use a Fisher Linear Discriminant (FLD) classifier to identify individuals within the MDD group. Results show that we can classify the MDD subjects with an accuracy of 76% solely based on the information gathered from the joint analysis of pose, shape, and tissue composition in multiple brain structures.
Integrated approach to multimodal media content analysis
NASA Astrophysics Data System (ADS)
Zhang, Tong; Kuo, C.-C. Jay
1999-12-01
In this work, we present a system for the automatic segmentation, indexing and retrieval of audiovisual data based on the combination of audio, visual and textual content analysis. The video stream is demultiplexed into audio, image and caption components. Then, a semantic segmentation of the audio signal based on audio content analysis is conducted, and each segment is indexed as one of the basic audio types. The image sequence is segmented into shots based on visual information analysis, and keyframes are extracted from each shot. Meanwhile, keywords are detected from the closed captions. Index tables are designed for both linear and non-linear access to the video. Experiments show that the proposed methods for multimodal media content analysis are effective and that the integrated framework achieves satisfactory results for video information filtering and retrieval.
Sentiment Analysis Using Common-Sense and Context Information
Agarwal, Basant; Mittal, Namita; Bansal, Pooja; Garg, Sonal
2015-01-01
Sentiment analysis research has been increasing tremendously in recent times due to the wide range of business and social applications. Sentiment analysis from unstructured natural language text has recently received considerable attention from the research community. In this paper, we propose a novel sentiment analysis model based on common-sense knowledge extracted from ConceptNet based ontology and context information. ConceptNet based ontology is used to determine the domain specific concepts which in turn produced the domain specific important features. Further, the polarities of the extracted concepts are determined using the contextual polarity lexicon which we developed by considering the context information of a word. Finally, semantic orientations of domain specific features of the review document are aggregated based on the importance of a feature with respect to the domain. The importance of the feature is determined by the depth of the feature in the ontology. Experimental results show the effectiveness of the proposed methods. PMID:25866505
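The depth-weighted aggregation step described in the abstract can be sketched as follows. This is not the authors' implementation; the feature polarities and ontology depths are invented for illustration, and the weighting rule (deeper in the ontology = more important) is a plain reading of the abstract:

```python
# Hedged sketch of aggregating semantic orientations of domain features,
# weighting each feature by its depth in a ConceptNet-style ontology.
# Polarities and depths below are invented example values.
def aggregate_sentiment(features):
    """features: list of (polarity in [-1, 1], depth in the ontology)."""
    weighted = sum(pol * depth for pol, depth in features)
    total = sum(depth for _, depth in features)
    return weighted / total if total else 0.0

review = [(+0.8, 4),   # a deep, specific concept, e.g. "battery life"
          (-0.5, 2),   # a shallower concept, e.g. "price"
          (+0.3, 3)]
score = aggregate_sentiment(review)   # > 0 means overall positive
```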
Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R.; Guerra-Hernandez, Erick I.; Almanza-Ojeda, Dora L.; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J.; Ibarra-Manzano, Mario A.
2016-03-05
Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states. PMID:26959029
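The core QSA idea of packing four EEG channel samples into one quaternion per time step can be illustrated with a minimal sketch. The features shown (quaternion norm and the angle between consecutive quaternions) are assumptions chosen for the example, not necessarily the paper's actual feature set:

```python
import math

# Illustrative sketch: four EEG channels -> one quaternion per time step,
# with simple norm/angle features. Feature choice is an assumption here.
def to_quaternions(ch1, ch2, ch3, ch4):
    """Zip four channel sample lists into (w, x, y, z) quaternions."""
    return list(zip(ch1, ch2, ch3, ch4))

def qnorm(q):
    return math.sqrt(sum(c * c for c in q))

def angle_between(q1, q2):
    """Angle between two quaternions viewed as 4-D vectors."""
    dot = sum(a * b for a, b in zip(q1, q2)) / (qnorm(q1) * qnorm(q2))
    return math.acos(max(-1.0, min(1.0, dot)))

qs = to_quaternions([1, 0], [0, 1], [0, 0], [0, 0])
theta = angle_between(qs[0], qs[1])   # orthogonal quaternions
```

Sequences of such features could then feed the DT, SVM or KNN classifiers mentioned in the abstract.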
NASA Astrophysics Data System (ADS)
Liu, Feng; Wu, Chuanhai; Xu, Xinquan; Li, Hao; Wang, Zhixiang
2018-01-01
To characterize the strain response of a semi-rigid asphalt pavement structure under FWD loading and to provide a reliable theoretical and practical basis for pavement structure design, the test section of the Guangdong Yunluo expressway was studied, with an FWD as the loading tool. Using the finite element analysis software ANSYS, the variation rules within each pavement structural layer were obtained. Based on the results of the theoretical analysis, strain sensors were installed in the corresponding layers of the pavement structure, and the strain test plan was determined. Analysis of the strain data obtained from several structural layers and from field monitoring verified the rationality of this type of pavement structure and of the strain test scheme, providing useful support for the design and maintenance of the pavement structure.
NASA Astrophysics Data System (ADS)
Ruan, Qingsong; Zhang, Shuhua; Lv, Dayong; Lu, Xinsheng
2018-02-01
Based on the implementation of Shanghai-Hong Kong Stock Connect in China, this paper examines the effects of financial liberalization on stock market comovement using both multifractal detrended fluctuation analysis (MF-DFA) and multifractal detrended cross-correlation analysis (MF-DCCA) methods. Results based on MF-DFA confirm the multifractality of Shanghai and Hong Kong stock markets, and the market efficiency of Shanghai stock market increased after the implementation of this connect program. Besides, analysis based on MF-DCCA has verified the existence of persistent cross-correlation between Shanghai and Hong Kong stock markets, and the cross-correlation gets stronger after the launch of this liberalization program. Finally, we find that fat-tail distribution is the main source of multifractality in the cross-correlations before the stock connect program, while long-range correlation contributes to the multifractality after this program.
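The DFA core underlying MF-DFA can be sketched in a few lines (the q = 2 case only): integrate the mean-removed series, split it into windows, remove a linear trend in each window, and average the residual variance. Full MF-DFA repeats this over a range of moments q; the white-noise input here is a stand-in for the return series, purely for illustration:

```python
import math, random

# Compact sketch of the DFA fluctuation function F(s) for q = 2, the building
# block of MF-DFA. The test series is synthetic white noise, not market data.
def linear_residual_var(y):
    """Variance of residuals after a least-squares linear detrend."""
    n = len(y)
    xs = range(n)
    mx, my = (n - 1) / 2, sum(y) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (v - my) for x, v in zip(xs, y))
    b = sxy / sxx
    a = my - b * mx
    return sum((v - (a + b * x)) ** 2 for x, v in zip(xs, y)) / n

def dfa_fluctuation(series, s):
    """Root-mean-square detrended fluctuation at window size s."""
    mean = sum(series) / len(series)
    profile, acc = [], 0.0
    for v in series:            # integrate the mean-removed series
        acc += v - mean
        profile.append(acc)
    n_win = len(profile) // s
    variances = [linear_residual_var(profile[i * s:(i + 1) * s])
                 for i in range(n_win)]
    return math.sqrt(sum(variances) / n_win)

random.seed(0)
noise = [random.gauss(0, 1) for _ in range(1024)]
f16, f64 = dfa_fluctuation(noise, 16), dfa_fluctuation(noise, 64)
# for white noise F(s) grows roughly like s^0.5, so F(64) exceeds F(16)
```

Fitting log F(s) against log s over many window sizes yields the scaling exponent; MF-DCCA applies the same machinery to the detrended covariance of two series.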
Distefano, Gaetano; Caruso, Marco; La Malfa, Stefano; Gentile, Alessandra; Wu, Shu-Biao
2012-01-01
High resolution melting curve analysis (HRM) has been used as an efficient, accurate and cost-effective tool to detect single nucleotide polymorphisms (SNPs) and insertions or deletions (INDELs). However, its efficiency, accuracy and applicability for discriminating microsatellite polymorphisms have not been extensively assessed. Traditional protocols for SSR genotyping include PCR amplification of the DNA fragment and separation of the fragments on an electrophoresis-based platform. However, post-PCR handling processes are laborious and costly. Furthermore, SNPs present in the sequences flanking the repeat motif cannot be detected by polyacrylamide-gel-electrophoresis-based methods. In the present study, we compared the discriminating power of HRM with traditional electrophoresis-based methods and provided a panel of primers for HRM genotyping in Citrus. The results showed that sixteen SSR markers produced distinct polymorphic melting curves among the Citrus spp. investigated through HRM analysis. Among those, 10 showed more genotypes by HRM analysis than by capillary electrophoresis owing to the presence of SNPs in the amplicons. For the SSR markers without SNPs in the flanking region, HRM also gave distinct melting curves that detected the same genotypes as capillary electrophoresis (CE) analysis. Moreover, HRM analysis allowed the discrimination of most of the 15 citrus genotypes, and the resulting genetic distance analysis clustered them into three main branches. In conclusion, HRM has proved to be not only an efficient and cost-effective alternative to electrophoresis-based methods for SSR markers, but also a method to uncover additional polymorphisms contributed by SNPs present in SSR amplicons. It was therefore suggested that this panel of SSR markers could be used in a variety of applications in citrus biodiversity and breeding programs using HRM analysis. 
Furthermore, we speculate that HRM analysis can be employed to analyse SSR markers in a wide range of applications in other species.
Dairy Analytics and Nutrient Analysis (DANA) Prototype System User Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sam Alessi; Dennis Keiser
2012-10-01
This document is a user manual for the Dairy Analytics and Nutrient Analysis (DANA) model. DANA provides an analysis of dairy anaerobic digestion technology and allows users to calculate biogas production, co-product valuation, capital costs, expenses, revenue and financial metrics for user-customizable scenarios, dairy types and digester types. The model provides results for three anaerobic digester types (covered lagoon, modified plug flow, and complete mix) and three main energy production technologies (electricity generation, renewable natural gas generation, and compressed natural gas generation). Additional options include different dairy types, bedding types, and backend treatment types, as well as numerous production and economic parameters. DANA's goal is to extend the National Market Value of Anaerobic Digester Products analysis (Informa Economics, 2012; Innovation Center, 2011) to include a greater and more flexible set of regional digester scenarios and to provide a modular framework for a tool that supports farmer and investor needs. Users can set up scenarios from combinations of existing parameters or add new parameters, run the model, and view a variety of reports, charts and tables that are automatically produced and delivered over the web interface. DANA is built on INL's analysis architecture, the Generalized Environment for Modeling Systems (GEMS), which offers extensive collaboration, analysis, and integration opportunities and greatly speeds the construction of highly scalable, web-delivered, user-oriented decision tools. DANA uses server-based data processing and web-based user interfaces rather than a client-based spreadsheet approach, which offers a number of benefits. Server processing and storage can scale up to handle a very large number of scenarios, so that analysis at the county or even field level across the whole U.S. can be performed.
Server-based databases allow dairy and digester parameters to be held and managed in a single data repository, while allowing users to customize standard values and perform individual analyses. Server-based calculations can be easily extended, versions and upgrades can be managed centrally, and any changes are immediately available to all users. This user manual describes how to use and/or modify input database tables, run DANA, and view and modify reports.
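The kind of scenario calculation DANA automates (biogas production through to revenue) can be sketched in a few lines. Every factor below, including herd size, biogas yield per cow, methane content, conversion efficiency, and electricity price, is an illustrative assumption, not a DANA parameter value.

```python
# Hypothetical electricity-generation scenario; all factors are invented
cows = 1000
biogas_per_cow_m3_day = 1.5          # biogas yield per cow (m3/day)
ch4_fraction = 0.6                   # methane content of biogas
energy_mj_per_m3_ch4 = 35.8          # lower heating value of methane
electrical_efficiency = 0.35         # engine-generator efficiency
price_per_kwh = 0.08                 # USD

biogas = cows * biogas_per_cow_m3_day                     # m3/day
energy_mj = biogas * ch4_fraction * energy_mj_per_m3_ch4  # MJ/day
kwh = energy_mj * electrical_efficiency / 3.6             # MJ -> kWh
revenue = kwh * price_per_kwh                             # USD/day
print(round(revenue, 2))
```

A real DANA run layers digester type, bedding, backend treatment and capital costs on top of this core arithmetic.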
NASA Astrophysics Data System (ADS)
Argyropoulou, Evangelia
2015-04-01
The current study focused on the seafloor morphology of the North Aegean Basin in Greece, through Object-Based Image Analysis (OBIA) using a Digital Elevation Model. The goal was the automatic extraction of morphologic and morphotectonic features, culminating in fault surface extraction. An Object-Based Image Analysis approach was developed based on the bathymetric data, and the features extracted on morphological criteria were compared with the corresponding landforms derived through tectonic analysis. A digital elevation model of 150 m spatial resolution was used. First, slope, profile curvature, and percentile were extracted from this bathymetry grid. The OBIA approach was developed within the eCognition environment. Four segmentation levels were created, with "level 4" as the target. At level 4, the final classes of geomorphological features were classified: discontinuities, fault-like features, and fault surfaces. On previous levels, additional landforms were also classified, such as the continental platform and continental slope. The results of the developed approach were evaluated by two methods. First, classification stability measures were computed within eCognition. Then, the results were compared qualitatively and quantitatively with a reference tectonic map created manually from the analysis of seismic profiles. The results of this comparison were satisfactory, which supports the validity of the developed OBIA approach.
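The first derivative layer mentioned above (slope from the bathymetry grid) can be sketched with central differences. The ramp DEM below is invented; only the 150 m cell size follows the text.

```python
import numpy as np

def slope_degrees(dem, cell=150.0):
    """Slope (degrees) from a bathymetry grid via central differences."""
    dzdy, dzdx = np.gradient(dem, cell)          # cell size in metres
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

# Hypothetical ramp: depth increases 1 m per 150 m cell along x,
# so the slope is uniform at arctan(1/150)
dem = np.tile(np.arange(50, dtype=float), (50, 1))   # metres
s = slope_degrees(dem, cell=150.0)
print(round(float(s[25, 25]), 3))
```

Profile curvature would follow the same pattern with second derivatives along the gradient direction.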
Paper-Plastic Hybrid Microfluidic Device for Smartphone-Based Colorimetric Analysis of Urine.
Jalal, Uddin M; Jin, Gyeong Jun; Shim, Joon S
2017-12-19
In this work, a disposable paper-plastic hybrid microfluidic lab-on-a-chip (LOC) was developed and successfully applied to the colorimetric measurement of urine on a smartphone-based optical platform using a "UrineAnalysis" Android app. The device was cost-effectively implemented as a stand-alone hybrid LOC by incorporating a conventional paper-based reagent test strip inside the plastic LOC microchannel. The LOC device quantitatively analyzed a small volume (40 μL) of urine for the colorimetric reactions of glucose, protein, pH, and red blood cells (RBCs), in integration with a finger-actuated micropump. In our experiments, the conventional urine strip showed large deviations as reaction time elapsed, because dipping the strip sensor in a bottle of urine cannot control the reaction volume. By integrating the strip sensor into the LOC device for urine analysis, our device significantly reduces the time-dependent inconsistency of the conventional dipstick-based urine strip, and the smartphone app used for image analysis enhances the visual assessment of the test strip, a major user concern for colorimetric analysis in point-of-care (POC) applications. As a result, the user-friendly LOC, successfully implemented in a disposable format with a smartphone-based optical platform, may be applicable as an effective tool for rapid and qualitative POC urinalysis.
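The core of a smartphone colorimetric readout is matching a pad's mean colour against a reference chart. The sketch below assumes a hypothetical chart; the RGB values and concentration bins are invented, not the app's calibration.

```python
import numpy as np

# Hypothetical reference chart: mean RGB of the glucose pad at known levels
REFERENCE = {
    "negative":  (95, 160, 190),
    "100 mg/dL": (90, 150, 120),
    "250 mg/dL": (110, 120, 70),
    "500 mg/dL": (120, 80, 60),
}

def classify_pad(mean_rgb):
    """Nearest-neighbour match of a pad's mean colour against the chart."""
    rgb = np.asarray(mean_rgb, dtype=float)
    return min(REFERENCE, key=lambda k: np.linalg.norm(rgb - np.array(REFERENCE[k])))

print(classify_pad((112, 118, 72)))
```

A production app would first correct for illumination (e.g., with white-balance patches) before this matching step.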
Jacewicz, R; Lewandowski, K; Rupa-Matysek, J; Jędrzejczyk, M; Berent, J
The study documents the risk that comes with DNA analysis of materials derived from patients after allogeneic hematopoietic stem cell transplantation (allo-HSCT) in forensic genetics. DNA chimerism was studied in 30 patients after allo-HSCT, based on techniques applied in contemporary forensic genetics, i.e. real-time PCR and multiplex PCR-STR with the use of autosomal DNA as well as Y-DNA markers. The results revealed that the DNA profile of the recipient's blood was identical with the donor's in the majority of cases. Therefore, blood analysis can lead to false conclusions in personal identification as well as kinship analysis. An investigation of buccal swabs revealed a mixture of DNA in the majority of recipients. Consequently, personal identification on the basis of stain analysis of the same origin may be impossible. The safest (but not ideal) material turned out to be the hair root. Its analysis based on autosomal DNA revealed 100% of the recipient's profile. However, an analysis based on Y-chromosome markers performed in female allo-HSCT recipients with male donors demonstrated the presence of donor DNA in hair cells, similarly to the blood and buccal swabs. In light of the potential risks arising from DNA profiling of biological materials derived from persons after allotransplantation in judicial contexts, certain procedures were proposed to eliminate such dangers. The basic procedures include abandoning the approach based exclusively on blood collection, both for kinship analysis and personal identification; asking persons who are to be tested about their history of allo-HSCT before sample collection and profile entry in the DNA database; and verifying DNA profiling based on hair follicles in uncertain cases.
Forkert, N D; Cheng, B; Kemmling, A; Thomalla, G; Fiehler, J
2014-01-01
The objective of this work is to present the software tool ANTONIA, which has been developed to facilitate a quantitative analysis of perfusion-weighted MRI (PWI) datasets in general as well as the subsequent multi-parametric analysis of additional datasets for the specific purpose of acute ischemic stroke patient dataset evaluation. Three different methods for the analysis of DSC or DCE PWI datasets are currently implemented in ANTONIA, which can be case-specifically selected based on the study protocol. These methods comprise a curve fitting method as well as a deconvolution-based and deconvolution-free method integrating a previously defined arterial input function. The perfusion analysis is extended for the purpose of acute ischemic stroke analysis by additional methods that enable an automatic atlas-based selection of the arterial input function, an analysis of the perfusion-diffusion and DWI-FLAIR mismatch as well as segmentation-based volumetric analyses. For reliability evaluation, the described software tool was used by two observers for quantitative analysis of 15 datasets from acute ischemic stroke patients to extract the acute lesion core volume, FLAIR ratio, perfusion-diffusion mismatch volume with manually as well as automatically selected arterial input functions, and follow-up lesion volume. The results of this evaluation revealed that the described software tool leads to highly reproducible results for all parameters if the automatic arterial input function selection method is used. Due to the broad selection of processing methods that are available in the software tool, ANTONIA is especially helpful to support image-based perfusion and acute ischemic stroke research projects.
Bossier, Han; Seurinck, Ruth; Kühn, Simone; Banaschewski, Tobias; Barker, Gareth J.; Bokde, Arun L. W.; Martinot, Jean-Luc; Lemaitre, Herve; Paus, Tomáš; Millenet, Sabina; Moerkerke, Beatrijs
2018-01-01
Given the increasing number of neuroimaging studies, there is a growing need to summarize published results. Coordinate-based meta-analyses use the locations of statistically significant local maxima, possibly with the associated effect sizes, to aggregate studies. In this paper, we investigate the influence of key characteristics of a coordinate-based meta-analysis on (1) the balance between false and true positives and (2) the activation reliability of the outcome of a coordinate-based meta-analysis. More particularly, we consider the influence of the chosen group-level model at the study level [fixed effects, ordinary least squares (OLS), or mixed effects models], the type of coordinate-based meta-analysis [Activation Likelihood Estimation (ALE), which uses only peak locations, versus fixed effects and random effects meta-analyses, which take into account both peak location and height], and the number of studies included in the analysis (from 10 to 35). To do this, we apply a resampling scheme on a large dataset (N = 1,400) to create a test condition and compare this with an independent evaluation condition. The test condition corresponds to subsampling participants into studies and combining these using meta-analyses. The evaluation condition corresponds to a high-powered group analysis. We observe the best performance when using mixed effects models in individual studies combined with a random effects meta-analysis. Moreover, performance increases with the number of studies included in the meta-analysis. When peak height is not taken into consideration, we show that the popular ALE procedure is a good alternative in terms of the balance between type I and II errors. However, it requires more studies than other procedures to achieve comparable activation reliability. Finally, we discuss the differences, interpretations, and limitations of our results. PMID:29403344
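The random effects pooling referred to above can be illustrated with the standard DerSimonian-Laird estimator. The effect sizes and variances below are invented, and the paper's actual implementation may differ in detail.

```python
import numpy as np

def random_effects(y, v):
    """DerSimonian-Laird random-effects pooling of effects y with variances v."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)              # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)         # between-study variance
    w_star = 1.0 / (v + tau2)
    mu = np.sum(w_star * y) / np.sum(w_star)        # pooled effect
    se = np.sqrt(1.0 / np.sum(w_star))              # its standard error
    return mu, se, tau2

mu, se, tau2 = random_effects([0.30, 0.10, 0.50, 0.20], [0.02, 0.03, 0.02, 0.05])
print(round(mu, 3))
```

When tau2 is forced to zero this reduces to the fixed effects model, which is the other peak-height-aware variant the paper compares.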
SaaS Platform for Time Series Data Handling
NASA Astrophysics Data System (ADS)
Oplachko, Ekaterina; Rykunov, Stanislav; Ustinin, Mikhail
2018-02-01
The paper is devoted to the description of MathBrain, a cloud-based resource, which works as a "Software as a Service" model. It is designed to maximize the efficiency of the current technology and to provide a tool for time series data handling. The resource provides access to the following analysis methods: direct and inverse Fourier transforms, Principal component analysis and Independent component analysis decompositions, quantitative analysis, magnetoencephalography inverse problem solution in a single dipole model based on multichannel spectral data.
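Two of the analysis methods the resource exposes (the direct Fourier transform and PCA) can be sketched on a synthetic multichannel recording. The sampling rate, channel gains, and the shared 10 Hz rhythm below are invented test data, not anything from MathBrain.

```python
import numpy as np

# Hypothetical 3-channel recording: a shared 10 Hz rhythm plus channel noise
rng = np.random.default_rng(1)
fs = 250.0
t = np.arange(0, 2, 1 / fs)
source = np.sin(2 * np.pi * 10 * t)
channels = np.vstack([a * source + 0.1 * rng.standard_normal(t.size)
                      for a in (1.0, 0.5, -0.8)])

# Direct Fourier transform: locate the dominant frequency of channel 0
spectrum = np.abs(np.fft.rfft(channels[0]))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
print(freqs[spectrum.argmax()])            # dominant frequency (Hz)

# PCA via SVD: the first component should capture the shared rhythm
x = channels - channels.mean(axis=1, keepdims=True)
_, s, _ = np.linalg.svd(x, full_matrices=False)
explained = s[0] ** 2 / np.sum(s ** 2)     # variance fraction of component 1
```

ICA and the single-dipole inverse problem would build on the same centered data matrix.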
FASEA: A FPGA Acquisition System and Software Event Analysis for liquid scintillation counting
NASA Astrophysics Data System (ADS)
Steele, T.; Mo, L.; Bignell, L.; Smith, M.; Alexiev, D.
2009-10-01
The FASEA (FPGA based Acquisition and Software Event Analysis) system has been developed to replace the MAC3 for coincidence pulse processing. The system uses a National Instruments Virtex 5 FPGA card (PXI-7842R) for data acquisition and purpose-developed software for event analysis. Initial comparisons with the MAC3 unit, based on measurements of 89Sr and 3H, confirm that the system is able to accurately emulate the behaviour of the MAC3 unit.
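At its simplest, software event analysis for coincidence counting reduces to matching timestamps between channels within a resolving window. The channel data and the ±1 µs window below are invented, not the system's actual parameters.

```python
import numpy as np

def coincidences(ts_a, ts_b, window=1e-6):
    """Count events in channel A with a matching event in B within ±window (s)."""
    ts_b = np.sort(ts_b)
    idx = np.searchsorted(ts_b, ts_a)
    # Distance to the nearest neighbour on either side of each A event
    left = np.abs(ts_a - ts_b[np.clip(idx - 1, 0, len(ts_b) - 1)])
    right = np.abs(ts_a - ts_b[np.clip(idx, 0, len(ts_b) - 1)])
    return int(np.sum(np.minimum(left, right) <= window))

a = np.array([1.0, 2.0, 3.0, 4.0])            # channel A timestamps (s)
b = np.array([1.0000004, 2.5, 3.0000020, 5.0])  # channel B timestamps (s)
print(coincidences(a, b))
```

Real coincidence electronics also handle dead time and accidental-coincidence correction, which this sketch omits.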
Use-related risk analysis for medical devices based on improved FMEA.
Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping
2012-01-01
In order to effectively analyze and control use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on fuzzy mathematics and grey relational theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method is described for a particular medical device (a C-arm X-ray machine).
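One way grey relational theory enters an FMEA is by grading each failure mode against an ideal lowest-risk reference instead of multiplying ratings into an RPN. The sketch below is a generic grey relational grade, not the authors' exact formulation; the failure modes, ratings, and fixed global bounds are invented.

```python
import numpy as np

# Hypothetical FMEA ratings (1-10) for severity, occurrence, detectability
failure_modes = {
    "wrong dose entered": (9, 4, 6),
    "cable wear":         (5, 6, 3),
    "display glare":      (3, 7, 2),
}

def grey_relational_grade(ratings, reference=(1, 1, 1), rho=0.5):
    """Grade against an ideal lowest-risk reference; lower grade = higher risk."""
    x = np.abs(np.asarray(ratings, float) - np.asarray(reference, float))
    # A full analysis takes min/max deviations over all modes; for 1-10 scales
    # against a (1,1,1) reference the global bounds are 0 and 9.
    d_min, d_max = 0.0, 9.0
    coeff = (d_min + rho * d_max) / (x + rho * d_max)
    return float(coeff.mean())

grades = {k: grey_relational_grade(v) for k, v in failure_modes.items()}
riskiest = min(grades, key=grades.get)
print(riskiest)
```

The fuzzy-mathematics half of the improved FMEA would replace the crisp 1-10 ratings with membership functions before this grading step.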
Frequency-Specific Fractal Analysis of Postural Control Accounts for Control Strategies
Gilfriche, Pierre; Deschodt-Arsac, Véronique; Blons, Estelle; Arsac, Laurent M.
2018-01-01
Diverse indicators of postural control in humans have been explored for decades, mostly based on the trajectory of the center of pressure. Classical approaches focus on variability, based on the notion that if a posture is too variable, the subject is not stable. Going deeper, an improved understanding of the underlying physiology has been gained from studying variability in different frequency ranges, pointing to specific short loops (proprioception) and long loops (visuo-vestibular) in neural control. More recently, fractal analyses have proliferated and become useful additional metrics of postural control. They allowed the identification of two scaling phenomena, in short and long timescales respectively. Here, we show that one of the most widely used methods for fractal analysis, Detrended Fluctuation Analysis, can be enhanced to account for scaling in specific frequency ranges. By computing and filtering a bank of synthetic fractal signals, we established how scaling analysis can be focused on specific frequency components. We called the obtained method Frequency-specific Fractal Analysis (FsFA) and used it to associate the two scaling phenomena of postural control with a proprioception-based control loop and a visuo-vestibular-based control loop. Convincing arguments for the method's validity then came from an application to the study of unaltered vs. altered postural control in athletes. Overall, the analysis suggests that at least two timescales contribute to postural control: velocity-based control on short timescales relying on proprioceptive sensors, and position-based control on longer timescales relying on visuo-vestibular sensors, which is a new view of postural control. Frequency-specific scaling exponents are promising markers of control strategies in humans. PMID:29643816
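Classic first-order DFA, the method the paper builds on before adding frequency specificity, can be sketched directly. The scales and the white-noise test signal below are illustrative; white noise should yield a scaling exponent near 0.5.

```python
import numpy as np

def dfa_alpha(x, scales):
    """Scaling exponent from classic first-order Detrended Fluctuation Analysis."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    fluct = []
    for n in scales:
        m = len(y) // n
        segments = y[: m * n].reshape(m, n)
        t = np.arange(n)
        sq = [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2)
              for seg in segments]             # linear detrend per window
        fluct.append(np.sqrt(np.mean(sq)))
    slope, _ = np.polyfit(np.log(scales), np.log(fluct), 1)
    return slope

rng = np.random.default_rng(42)
alpha = dfa_alpha(rng.standard_normal(4096), scales=[16, 32, 64, 128, 256])
print(round(alpha, 2))  # white noise: expected near 0.5
```

The frequency-specific variant described in the paper would restrict which scales (or filtered signal components) enter the final log-log fit.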
2009-01-01
Background Expressed sequence tags (ESTs) are an important source of gene-based markers such as those based on insertion-deletions (Indels) or single-nucleotide polymorphisms (SNPs). Several gel-based methods have been reported for the detection of sequence variants; however, they have not been widely exploited in common bean, an important legume crop of the developing world. The objectives of this project were to develop and map EST-based markers using analysis of single strand conformation polymorphisms (SSCPs), to create a transcript map for common bean, and to compare synteny of the common bean map with sequenced chromosomes of other legumes. Results A set of 418 EST-based amplicons were evaluated for parental polymorphisms using the SSCP technique, and 26% of these presented a clear conformational or size polymorphism between Andean and Mesoamerican genotypes. The amplicon-based markers were then used for genetic mapping, with segregation analysis performed in the DOR364 × G19833 recombinant inbred line (RIL) population. A total of 118 new marker loci were placed into an integrated molecular map for common bean consisting of 288 markers. Of these, 218 were used for synteny analysis and 186 presented homology with segments of the soybean genome with an e-value lower than 7 × 10^-12. The synteny analysis with soybean showed a mosaic pattern of syntenic blocks, with most segments of any one common bean linkage group associated with two soybean chromosomes. The analysis with Medicago truncatula and Lotus japonicus presented fewer syntenic regions, consistent with the more distant phylogenetic relationship between the galegoid and phaseoloid legumes. Conclusion The SSCP technique is a useful and inexpensive alternative to other SNP or Indel detection techniques for saturating the common bean genetic map with functional markers that may be useful in marker-assisted selection.
In addition, the genetic markers based on ESTs allowed the construction of a transcript map and, given their high conservation between species, allowed synteny comparisons to be made with sequenced genomes. This synteny analysis may support positional cloning of target genes in common bean through the use of genomic information from these other legumes. PMID:20030833
Spreadsheet-based engine data analysis tool - user's guide.
DOT National Transportation Integrated Search
2016-07-01
This record refers to both the spreadsheet tool - Fleet Equipment Performance Measurement Preventive Maintenance Model: Spreadsheet-Based Engine Data Analysis Tool, http://ntl.bts.gov/lib/60000/60000/60007/0-6626-P1_Final.xlsm - and its accompanying ...
Instantiating the art of war for effects-based operations
NASA Astrophysics Data System (ADS)
Burns, Carla L.
2002-07-01
Effects-Based Operations (EBO) is a mindset, a philosophy, and an approach for planning, executing, and assessing military operations for the effects they produce rather than the targets or even objectives they deal with. An EBO approach strives to provide economy of force, dynamic tasking, and reduced collateral damage. The notion of EBO is not new; military commanders have certainly had desired effects in mind when conducting military operations. To date, however, EBO has been an art of war lacking the automated techniques and tools that would enable effects-based analysis and assessment. Modeling and simulation is at the heart of this challenge. The Air Force Research Laboratory (AFRL) EBO Program is developing modeling techniques and corresponding tool capabilities that can be brought to bear against the challenges presented by effects-based analysis and assessment. Effects-based course-of-action development, center of gravity/target system analysis, and wargaming capabilities are being developed and integrated to help give Commanders the information and decision support required to achieve desired national security objectives. This paper presents an introduction to effects-based operations, discusses the benefits of an EBO approach, and focuses on modeling and analysis for effects-based strategy development. An overview of modeling and simulation challenges for EBO is presented, setting the stage for the detailed technical papers in the subject session.
NASA Astrophysics Data System (ADS)
Feng, Shangyuan; Lin, Juqiang; Huang, Zufang; Chen, Guannan; Chen, Weisheng; Wang, Yue; Chen, Rong; Zeng, Haishan
2013-01-01
The capability of using silver nanoparticle based near-infrared surface enhanced Raman scattering (SERS) spectroscopy combined with principal component analysis (PCA) and linear discriminate analysis (LDA) to differentiate esophageal cancer tissue from normal tissue was presented. Significant differences in Raman intensities of prominent SERS bands were observed between normal and cancer tissues. PCA-LDA multivariate analysis of the measured tissue SERS spectra achieved diagnostic sensitivity of 90.9% and specificity of 97.8%. This exploratory study demonstrated great potential for developing label-free tissue SERS analysis into a clinical tool for esophageal cancer detection.
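A PCA-LDA classification pipeline of the kind described above can be sketched with scikit-learn on synthetic "spectra". The band shape, class sizes, intensity difference, and noise level below are invented, not the measured SERS data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic spectra: 40 "normal" and 40 "cancer", 500 wavenumber bins,
# differing only in the intensity of one Gaussian band plus noise
rng = np.random.default_rng(0)
band = np.exp(-0.5 * ((np.arange(500) - 250) / 10.0) ** 2)
normal = 1.0 * band + 0.3 * rng.standard_normal((40, 500))
cancer = 1.8 * band + 0.3 * rng.standard_normal((40, 500))
X = np.vstack([normal, cancer])
y = np.array([0] * 40 + [1] * 40)

# PCA compresses the spectra; LDA then makes the two-class decision
model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
acc = cross_val_score(model, X, y, cv=5).mean()
```

PCA before LDA is the usual remedy for having far fewer spectra than spectral channels, which would otherwise make the LDA scatter matrices singular.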
NASA Technical Reports Server (NTRS)
Sopher, R.; Hallock, D. W.
1985-01-01
A time history analysis for rotorcraft dynamics, based on dynamical substructures and on nonstructural mathematical and aerodynamic components, is described. The analysis is applied to predict helicopter ground resonance and response to rotor damage. Other applications illustrate the stability and steady vibratory response of stopped and gimballed rotors, representative of new technology. Desirable attributes expected from modern codes are realized, although the analysis does not employ a complete set of techniques identified for advanced software. The analysis is able to handle a comprehensive set of steady state and stability problems with a small library of components.
SPIRE1.03. Spatial Paradigm for Information Retrieval and Exploration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, K.J.; Bohn, S.; Crow, V.
The SPIRE system consists of software for visual analysis of primarily text based information sources. This technology enables the content analysis of text documents without reading all the documents. It employs several algorithms for text and word proximity analysis. It identifies the key themes within the text documents. From this analysis, it projects the results onto a visual spatial proximity display (Galaxies or Themescape) where items (documents and/or themes) visually close to each other are known to have content which is close to each other. Innovative interaction techniques then allow for dynamic visual analysis of large text based information spaces.
Product Support Manager Guidebook
2011-04-01
package is being developed using supportability analysis concepts such as Failure Mode, Effects and Criticality Analysis (FMECA), Fault Tree Analysis (FTA), Level of Repair Analysis (LORA), Condition Based Maintenance + (CBM+), Maintenance Task Analysis (MTA), and Failure Reporting and Corrective Action System (FRACAS).
Neuromorphic Computing for Very Large Test and Evaluation Data Analysis
2014-05-01
analysis and utilization of newly available hardware-based artificial neural network chips. These two aspects of the program are complementary. The ... neuromorphic architectures research focused on long-term disruptive technologies with high risk but revolutionary potential. The hardware-based neural ... today. Overall, hardware-based neural processing research allows us to study the fundamental system and architectural issues relevant for employing
ERIC Educational Resources Information Center
Vanfretti, Luigi; Farrokhabadi, Mostafa
2015-01-01
This article presents the implementation of the constructive alignment theory (CAT) in a power system analysis course through a consensus-based course design process. The consensus-based design process involves both the instructor and graduate-level students and it aims to develop the CAT framework in a holistic manner with the goal of including…
External Aiding Methods for IMU-Based Navigation
2016-11-26
Carlo simulation and particle filtering. This approach allows for the utilization of highly complex systems in a black box configuration with minimal ... alternative method, which has the advantage of being less computationally demanding, is to use a Kalman filtering-based approach. The particular ... Kalman filtering-based approach used here is known as linear covariance analysis. In linear covariance analysis, the nonlinear systems describing the
ERIC Educational Resources Information Center
McPhilemy, Catherine; Dillenburger, Karola
2013-01-01
Applied behaviour analysis (ABA)-based programmes are endorsed as the gold standard for treatment of children with autistic spectrum disorder (ASD) in most of North America. This is not the case in most of Europe, where instead a non-specified "eclectic" approach is adopted. We explored the social validity of ABA-based interventions with…
Skeletal maturity determination from hand radiograph by model-based analysis
NASA Astrophysics Data System (ADS)
Vogelsang, Frank; Kohnen, Michael; Schneider, Hansgerd; Weiler, Frank; Kilbinger, Markus W.; Wein, Berthold B.; Guenther, Rolf W.
2000-06-01
Building on the model-based segmentation algorithm for hand radiographs proposed in our previous work, we now present a method to determine skeletal maturity by automated analysis of regions of interest (ROIs). These ROIs, which include the epiphyseal and carpal bones most important for skeletal maturity determination, can be extracted from the radiograph by knowledge-based algorithms.
ERIC Educational Resources Information Center
Lambert, Joseph M.; Lloyd, Blair P.; Staubitz, Johanna L.; Weaver, Emily S.; Jennings, Chelsea M.
2014-01-01
The trial-based functional analysis (FA) is a useful alternative to the traditional FA in contexts in which it is challenging to establish environmental control for extended periods of time. Previous researchers have demonstrated that others can be trained to conduct trial-based FAs with high procedural fidelity by providing a didactic…
Real-time image annotation by manifold-based biased Fisher discriminant analysis
NASA Astrophysics Data System (ADS)
Ji, Rongrong; Yao, Hongxun; Wang, Jicheng; Sun, Xiaoshuai; Liu, Xianming
2008-01-01
Automatic linguistic annotation is a promising solution to bridge the semantic gap in content-based image retrieval. However, two crucial issues are not well addressed in state-of-the-art annotation algorithms: 1. the Small Sample Size (3S) problem in keyword classifier/model learning; 2. most annotation algorithms cannot extend to real-time online usage due to their low computational efficiency. This paper presents a novel Manifold-based Biased Fisher Discriminant Analysis (MBFDA) algorithm to address these two issues by transductive semantic learning and keyword filtering. To address the 3S problem, co-training-based manifold learning is adopted for keyword model construction. To achieve real-time annotation, a Biased Fisher Discriminant Analysis (BFDA) based semantic feature reduction algorithm is presented for keyword confidence discrimination and semantic feature reduction. Different from all existing annotation methods, MBFDA views image annotation from a novel Eigen semantic feature (corresponding to keywords) selection aspect. As demonstrated in experiments, our manifold-based biased Fisher discriminant analysis annotation algorithm outperforms classical and state-of-the-art annotation methods (1. K-NN expansion; 2. one-to-all SVM; 3. PWC-SVM) in both computational time and annotation accuracy by a large margin.
Ganga, G M D; Esposto, K F; Braatz, D
2012-01-01
The occupational exposure limits of different risk factors for the development of low back disorders (LBDs) have not yet been established. One of the main problems in setting such guidelines is the limited understanding of how different risk factors for LBDs interact in causing injury, since the nature and mechanism of these disorders are relatively unknown. The industrial ergonomist's role is further complicated because the potential risk factors that may contribute to the onset of LBDs interact in a complex manner, which makes it difficult to discriminate in detail among the jobs that place workers at high or low risk of LBDs. The purpose of this paper was to compare predictions from the neural network-based model proposed by Zurada, Karwowski & Marras (1997) with those from a linear discriminant analysis model for classifying industrial jobs according to their potential risk of low back disorders due to workplace design. The results showed that the discriminant analysis-based model is as effective as the neural network-based model. Moreover, the discriminant analysis-based model proved more advantageous in terms of cost and time savings for future data gathering.
Rodríguez Chialanza, Mauricio; Sierra, Ignacio; Pérez Parada, Andrés; Fornaro, Laura
2018-06-01
There are several techniques used to analyze microplastics, often based on a combination of visual and spectroscopic techniques. Here we introduce an alternative workflow for identification and mass quantitation through a combination of optical microscopy with image analysis (IA) and differential scanning calorimetry (DSC). We studied four synthetic polymers of environmental concern: low- and high-density polyethylene (LDPE and HDPE, respectively), polypropylene (PP), and polyethylene terephthalate (PET). Selected experiments were conducted to investigate (i) particle characterization and counting procedures based on image analysis with open-source software, (ii) chemical identification of microplastics based on DSC signal processing, (iii) the dependence of the DSC signal on particle size, and (iv) quantitation of microplastic mass based on the DSC signal. We describe the potential and limitations of these techniques to increase reliability in microplastic analysis. Particle size was shown to have a particular influence on the qualitative and quantitative performance of DSC signals. Both identification (based on characteristic onset temperature) and mass quantitation (based on heat flow) were affected by particle size. As a result, proper sample treatment, including sieving of suspended particles, is particularly required for this analytical approach.
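Mass quantitation from a DSC heat-flow signal reduces to integrating the melting endotherm and dividing by an enthalpy of fusion. The triangular peak below is invented, and the enthalpy values are only literature-typical placeholders for fully crystalline material, not the calibration values used in the study.

```python
import numpy as np

# Indicative enthalpies of fusion (J/g); placeholders, not calibrated values
DELTA_H_FUSION = {"HDPE": 293.0, "PP": 207.0, "PET": 140.0}

def mass_from_dsc(time_s, heat_flow_mw, polymer):
    """Mass (mg) of a polymer from the integrated melting endotherm."""
    dt = np.diff(time_s)
    area_mj = np.sum(0.5 * (heat_flow_mw[1:] + heat_flow_mw[:-1]) * dt)  # mW*s = mJ
    return area_mj / DELTA_H_FUSION[polymer]   # mJ / (J/g) = mg

# Hypothetical triangular endotherm: 60 s wide, 2 mW peak -> 60 mJ
t = np.linspace(0, 60, 601)
hf = np.interp(t, [0, 30, 60], [0, 2, 0])
print(round(mass_from_dsc(t, hf, "HDPE"), 4))
```

In practice the sample's degree of crystallinity scales the effective enthalpy, which is one reason particle size and sample preparation affect the quantitation.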
Java Source Code Analysis for API Migration to Embedded Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winter, Victor; McCoy, James A.; Guerrero, Jonathan
Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java's APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.
Ma, Chuang; Xin, Mingming; Feldmann, Kenneth A.; Wang, Xiangfeng
2014-01-01
Machine learning (ML) is an intelligent data mining technique that builds a prediction model based on the learning of prior knowledge to recognize patterns in large-scale data sets. We present an ML-based methodology for transcriptome analysis via comparison of gene coexpression networks, implemented as an R package called machine learning–based differential network analysis (mlDNA) and apply this method to reanalyze a set of abiotic stress expression data in Arabidopsis thaliana. The mlDNA first used a ML-based filtering process to remove nonexpressed, constitutively expressed, or non-stress-responsive “noninformative” genes prior to network construction, through learning the patterns of 32 expression characteristics of known stress-related genes. The retained “informative” genes were subsequently analyzed by ML-based network comparison to predict candidate stress-related genes showing expression and network differences between control and stress networks, based on 33 network topological characteristics. Comparative evaluation of the network-centric and gene-centric analytic methods showed that mlDNA substantially outperformed traditional statistical testing–based differential expression analysis at identifying stress-related genes, with markedly improved prediction accuracy. To experimentally validate the mlDNA predictions, we selected 89 candidates out of the 1784 predicted salt stress–related genes with available SALK T-DNA mutagenesis lines for phenotypic screening and identified two previously unreported genes, mutants of which showed salt-sensitive phenotypes. PMID:24520154
A Query Expansion Framework in Image Retrieval Domain Based on Local and Global Analysis
Rahman, M. M.; Antani, S. K.; Thoma, G. R.
2011-01-01
We present an image retrieval framework based on automatic query expansion in a concept feature space by generalizing the vector space model of information retrieval. In this framework, images are represented by vectors of weighted concepts similar to the keyword-based representation used in text retrieval. To generate the concept vocabularies, a statistical model is built by utilizing Support Vector Machine (SVM)-based classification techniques. The images are represented as “bag of concepts” that comprise perceptually and/or semantically distinguishable color and texture patches from local image regions in a multi-dimensional feature space. To explore the correlation between the concepts and overcome the assumption of feature independence in this model, we propose query expansion techniques in the image domain from a new perspective based on both local and global analysis. For the local analysis, the correlations between the concepts based on the co-occurrence pattern, and the metrical constraints based on the neighborhood proximity between the concepts in encoded images, are analyzed by considering local feedback information. We also analyze the concept similarities in the collection as a whole in the form of a similarity thesaurus and propose an efficient query expansion based on the global analysis. The experimental results on a photographic collection of natural scenes and a biomedical database of different imaging modalities demonstrate the effectiveness of the proposed framework in terms of precision and recall. PMID:21822350
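The local-analysis step can be illustrated with a toy co-occurrence-based expansion: concepts that frequently co-occur with query concepts receive a damped weight. This is a schematic sketch under our own simplifying assumptions, not the authors' implementation; all names are hypothetical:

```python
# Toy sketch of co-occurrence-based query expansion (names are hypothetical).
def expand_query(query, cooc, top_k=2, damp=0.5):
    """query: {concept: weight}; cooc: {(c1, c2): co-occurrence count}.
    Adds the top_k most strongly co-occurring new concepts with damped weights."""
    scores = {}
    for (a, b), n in cooc.items():
        for src, other in ((a, b), (b, a)):
            if src in query and other not in query:
                scores[other] = scores.get(other, 0.0) + query[src] * n
    expanded = dict(query)
    if scores:
        top = sorted(scores.items(), key=lambda kv: -kv[1])[:top_k]
        peak = top[0][1]
        for concept, s in top:
            expanded[concept] = damp * s / peak  # damped, peak-normalized weight
    return expanded
```

For a query containing only "sky", a strong ("sky", "cloud") co-occurrence adds "cloud" at half the normalized score while unrelated concepts stay out of the expanded vector.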
NASA Technical Reports Server (NTRS)
Johnson, S. C.
1982-01-01
An interface system for passing data between a Relational Information Management (RIM) data base complex and the Engineering Analysis Language (EAL), a finite element structural analysis program, is documented. The interface system, implemented on a CDC Cyber computer, is composed of two FORTRAN programs called RIM2EAL and EAL2RIM. RIM2EAL reads model definition data from RIM and creates a file of EAL commands to define the model. EAL2RIM reads model definition and EAL-generated analysis data from EAL's data library and stores these data directly in a RIM data base. These two interface programs and the format for the RIM data complex are described.
Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria
NASA Astrophysics Data System (ADS)
Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong
2017-08-01
In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability and sensitivity analysis of complex components with arbitrary distribution parameters is investigated using the perturbation method, the response surface method, the Edgeworth series, and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparison with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability analysis in finite element-based engineering practice.
NASA Astrophysics Data System (ADS)
Donner, Reik; Balasis, Georgios; Stolbova, Veronika; Wiedermann, Marc; Georgiou, Marina; Kurths, Jürgen
2016-04-01
Magnetic storms are the most prominent global manifestations of out-of-equilibrium magnetospheric dynamics. Investigating the dynamical complexity exhibited by geomagnetic observables can provide valuable insights into relevant physical processes as well as the temporal scales associated with this phenomenon. In this work, we introduce several innovative data analysis techniques enabling a quantitative analysis of the non-stationary behavior of the Dst index. Using recurrence quantification analysis (RQA) and recurrence network analysis (RNA), we obtain a variety of complexity measures serving as markers of quiet- and storm-time magnetospheric dynamics. We additionally apply these techniques to the main driver of Dst index variations, the VBsouth coupling function, and to the interplanetary medium parameters Bz and Pdyn, in order to discriminate internal magnetospheric processes from the response directly induced by external solar wind forcing. The derived recurrence-based measures allow us to improve the accuracy with which magnetospheric storms can be classified based on ground-based observations. The new methodology presented here could be of significant interest for the space weather research community working on time series analysis for magnetic storm forecasts.
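Two of the recurrence-based measures mentioned, recurrence rate and determinism, can be computed directly from a thresholded distance matrix. A minimal sketch for a scalar series, assuming a simple absolute-difference metric rather than the embedded phase-space distances used in full RQA:

```python
def recurrence_matrix(x, eps):
    """R[i][j] = 1 if |x[i] - x[j]| <= eps (scalar-series simplification)."""
    n = len(x)
    return [[1 if abs(x[i] - x[j]) <= eps else 0 for j in range(n)]
            for i in range(n)]

def recurrence_rate(R):
    """Density of recurrent points in the matrix."""
    n = len(R)
    return sum(map(sum, R)) / (n * n)

def determinism(R, lmin=2):
    """Fraction of recurrent points lying on diagonal lines of length >= lmin,
    excluding the main diagonal (line of identity); uses the upper triangle
    since R is symmetric."""
    n = len(R)
    rec_pts = line_pts = 0
    for k in range(1, n):
        run = 0
        for i in range(n - k):
            v = R[i][i + k]
            rec_pts += v
            if v:
                run += 1
                if i == n - k - 1 and run >= lmin:
                    line_pts += run
            else:
                if run >= lmin:
                    line_pts += run
                run = 0
    return line_pts / rec_pts if rec_pts else 0.0
```

A short alternating series yields a high determinism because its recurrent points align on diagonals, while an uncorrelated step pattern does not.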
Functional Interaction Network Construction and Analysis for Disease Discovery.
Wu, Guanming; Haw, Robin
2017-01-01
Network-based approaches project seemingly unrelated genes or proteins onto a large-scale network context, providing a holistic visualization and analysis platform for genomic data generated from high-throughput experiments, reducing the dimensionality of the data via network modules and increasing statistical analysis power. Based on the Reactome database, the most popular and comprehensive open-source biological pathway knowledgebase, we have developed a highly reliable protein functional interaction network covering around 60% of human genes, and an app called ReactomeFIViz for Cytoscape, the most popular biological network visualization and analysis platform. In this chapter, we describe the detailed procedures by which this functional interaction network is constructed: integrating multiple external data sources, extracting functional interactions from human-curated pathway databases, training a Naïve Bayes classifier, predicting interactions with the trained classifier, and finally constructing the functional interaction database. We also provide an example of how to use ReactomeFIViz to perform network-based data analysis for a list of genes.
Liu, Fang; Shen, Changqing; He, Qingbo; Zhang, Ao; Liu, Yongbin; Kong, Fanrang
2014-01-01
A fault diagnosis strategy based on the wayside acoustic monitoring technique is investigated for locomotive bearing fault diagnosis. Inspired by the transient modeling analysis method based on correlation filtering analysis, a so-called Parametric-Mother-Doppler-Wavelet (PMDW) is constructed with six parameters, including a center characteristic frequency and five kinematic model parameters. A Doppler effect eliminator containing a PMDW generator, a correlation filtering analysis module, and a signal resampler is designed to eliminate the Doppler effect embedded in the recorded acoustic signal of the bearing. Through the Doppler effect eliminator, the five kinematic model parameters can be identified from the signal itself. Then, the signal resampler is applied to eliminate the Doppler effect using the identified parameters. Owing to its ability to detect early bearing faults, the transient model analysis method is employed to detect localized bearing faults after the embedded Doppler effect is eliminated. The effectiveness of the proposed fault diagnosis strategy is verified via simulation studies and applications to the diagnosis of locomotive roller bearing defects. PMID:24803197
García-Sevillano, M A; García-Barrera, T; Navarro, F; Montero-Lobato, Z; Gómez-Ariza, J L
2015-04-01
Mass spectrometry (MS)-based toxicometabolomics requires analytical approaches that yield unbiased metabolic profiles. The present work explores the general application of direct-infusion MS using a high-mass-resolution analyzer (a hybrid triple quadrupole-time-of-flight system) and a complementary gas chromatography-MS analysis to mitochondrial extracts from mouse hepatic cells, with emphasis on mitochondria isolation from hepatic cells with a commercial kit, sample treatment after cell lysis, comprehensive metabolomic analysis, and pattern recognition from metabolic profiles. Finally, the metabolomic platform was successfully checked in a case study based on the exposure of mice (Mus musculus) to inorganic arsenic for 12 days. Alterations in endogenous metabolites were recognized by partial least squares-discriminant analysis. Subsequently, metabolites were identified by combining MS/MS analysis and metabolomics databases. This work reports for the first time the effects of As exposure on hepatic mitochondrial metabolic pathways based on MS, and reveals disturbances in the Krebs cycle, the β-oxidation pathway, amino acid degradation, and creatine levels. This non-targeted analysis provides extensive metabolic information on the mitochondrial organelle, which could be applied to toxicology, pharmacology and clinical studies.
NASA Astrophysics Data System (ADS)
Sethuramalingam, Prabhu; Vinayagam, Babu Kupusamy
2016-07-01
A carbon nanotube mixed grinding wheel is used in the grinding process to analyze the surface characteristics of AISI D2 tool steel. To date, no work has been carried out using a carbon nanotube-based grinding wheel. Such a wheel has excellent thermal conductivity and good mechanical properties, which help improve the surface finish of the workpiece. In the present study, multi-response optimization of process parameters, namely surface roughness and metal removal rate, in grinding with single-wall carbon nanotube (CNT) mixed cutting fluids is undertaken using an orthogonal array with grey relational analysis. Experiments are performed under the grinding conditions designated by the L9 orthogonal array. Based on the results of the grey relational analysis, a set of optimum grinding parameters is obtained. The significant machining parameters are found using the analysis of variance approach. An empirical model for the prediction of the output parameters has been developed using regression analysis, and the results are compared for grinding with and without the CNT wheel.
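Grey relational analysis combines multiple responses (e.g. surface roughness, smaller-is-better; metal removal rate, larger-is-better) into a single grade per experiment. A generic sketch of the standard computation with made-up data, not the paper's L9 results:

```python
def grey_relational_grades(rows, larger_better, zeta=0.5):
    """Grey relational analysis.
    rows: experiments x responses; larger_better: per-column flags;
    zeta: distinguishing coefficient (0.5 is the conventional choice).
    Returns one grade per experiment (higher = better overall)."""
    cols = list(zip(*rows))
    norm = []
    for col, lb in zip(cols, larger_better):
        lo, hi = min(col), max(col)
        span = (hi - lo) or 1.0
        # larger-better: (x-lo)/span; smaller-better: (hi-x)/span
        norm.append([(x - lo) / span if lb else (hi - x) / span for x in col])
    dev = [[1.0 - v for v in col] for col in norm]   # deviation from ideal (1.0)
    flat = [d for col in dev for d in col]
    dmin, dmax = min(flat), max(flat)
    coef = [[(dmin + zeta * dmax) / (d + zeta * dmax) for d in col] for col in dev]
    n_resp = len(coef)
    return [sum(coef[j][i] for j in range(n_resp)) / n_resp
            for i in range(len(rows))]
```

With three hypothetical experiments measured on (metal removal rate, roughness), the grade correctly ranks the run that balances high removal with low roughness first.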
López-Pacheco, María G; Sánchez-Fernández, Luis P; Molina-Lozano, Herón
2014-01-15
Noise from common sources such as vehicles, whistles, sirens, car horns and crowds is mixed in urban soundscapes. Nowadays, environmental acoustic analysis is performed on mixture signals recorded by monitoring systems. These mixed signals make individual analysis difficult, although such analysis is useful in taking action to reduce and control environmental noise. This paper aims to separate individual noise sources from recorded mixtures in order to evaluate the noise level of each estimated source. A method based on blind deconvolution and blind source separation in the wavelet domain is proposed. This approach provides a basis to improve results obtained in the monitoring and analysis of common noise sources in urban areas. The method is validated through experiments based on knowledge of the predominant noise sources in urban soundscapes. Actual recordings of common noise sources are used to acquire mixture signals with a microphone array in semi-controlled environments. The developed method demonstrates great performance improvements in the identification, analysis and evaluation of common urban sources. © 2013 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Ruf, Joseph; Holt, James B.; Canabal, Francisco
1999-01-01
This paper presents the status of analyses on three Rocket Based Combined Cycle configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes code for ejector mode fluid dynamics. The Draco engine analysis is a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.
White Matter Fiber-based Analysis of T1w/T2w Ratio Map.
Chen, Haiwei; Budin, Francois; Noel, Jean; Prieto, Juan Carlos; Gilmore, John; Rasmussen, Jerod; Wadhwa, Pathik D; Entringer, Sonja; Buss, Claudia; Styner, Martin
2017-02-01
To develop, test, evaluate and apply a novel tool for the white matter fiber-based analysis of T1w/T2w ratio maps quantifying myelin content. The cerebral white matter in the human brain develops from a mostly non-myelinated state to a nearly fully mature white matter myelination within the first few years of life. High resolution T1w/T2w ratio maps are believed to be effective in quantitatively estimating myelin content on a voxel-wise basis. We propose the use of a fiber-tract-based analysis of such T1w/T2w ratio data, as it allows us to separate fiber bundles that a common regional analysis imprecisely groups together, and to associate effects to specific tracts rather than large, broad regions. We developed an intuitive, open source tool to facilitate such fiber-based studies of T1w/T2w ratio maps. Via its Graphical User Interface (GUI) the tool is accessible to non-technical users. The framework uses calibrated T1w/T2w ratio maps and a prior fiber atlas as an input to generate profiles of T1w/T2w values. The resulting fiber profiles are used in a statistical analysis that performs along-tract functional statistical analysis. We applied this approach to a preliminary study of early brain development in neonates. We developed an open-source tool for the fiber based analysis of T1w/T2w ratio maps and tested it in a study of brain development.
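The core of an along-tract analysis is reducing samples along each fiber to a positional profile. A schematic sketch, assuming each sample is an (arc-length fraction, T1w/T2w value) pair; the binning choice here is our own illustration, not the tool's actual profile generation:

```python
def tract_profile(samples, n_bins=5):
    """Average (arc_pos in [0,1], value) samples into an along-tract profile.
    Returns one mean value per positional bin (NaN for empty bins)."""
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    for pos, val in samples:
        b = min(int(pos * n_bins), n_bins - 1)  # clamp pos == 1.0 into last bin
        sums[b] += val
        counts[b] += 1
    return [s / c if c else float("nan") for s, c in zip(sums, counts)]
```

The resulting per-bin profiles are what an along-tract statistical analysis would then compare across subjects.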
Werner, René; Ehrhardt, Jan; Schmidt-Richberg, Alexander; Heiss, Anabell; Handels, Heinz
2010-11-01
Motivated by radiotherapy of lung cancer, non-linear registration is applied to estimate 3D motion fields for local lung motion analysis in thoracic 4D CT images. The reliability of the analysis results depends on the registration accuracy. Therefore, our study consists of two parts: optimization and evaluation of a non-linear registration scheme for motion field estimation, followed by a registration-based analysis of lung motion patterns. The study is based on 4D CT data of 17 patients. Different distance measures and force terms for thoracic CT registration are implemented and compared: sum of squared differences versus a force term related to Thirion's demons registration; masked versus unmasked force computation. The most accurate approach is applied to local lung motion analysis. Masked Thirion forces outperform the other force terms. The mean target registration error is 1.3 ± 0.2 mm, which is on the order of the voxel size. Based on the resulting motion fields and inter-patient normalization of inner-lung coordinates and breathing depths, a non-linear dependency between inner-lung position and the corresponding strength of motion is identified. The dependency is observed for all patients without tumors or with only small tumors. Quantitative evaluation of the estimated motion fields indicates high spatial registration accuracy and allows for reliable registration-based local lung motion analysis. The large amount of information encoded in the motion fields makes it possible to draw detailed conclusions, e.g., to identify the dependency between inner-lung localization and motion. Our examinations illustrate the potential of registration-based motion analysis.
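The compared distance measures include the sum of squared differences, optionally restricted to a lung mask. A minimal sketch of masked versus unmasked SSD on flattened intensity lists, for illustration only:

```python
def ssd(fixed, moving, mask=None):
    """Sum of squared intensity differences between two images (flattened).
    If a binary mask is given, only voxels where mask is truthy contribute."""
    total = 0.0
    for i, (f, m) in enumerate(zip(fixed, moving)):
        if mask is None or mask[i]:
            total += (f - m) ** 2
    return total
```

Restricting the evaluation to a lung mask, as in the masked force computation the study favors, keeps mismatches outside the region of interest from dominating the measure.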
Identifying novel glioma associated pathways based on systems biology level meta-analysis.
Hu, Yangfan; Li, Jinquan; Yan, Wenying; Chen, Jiajia; Li, Yin; Hu, Guang; Shen, Bairong
2013-01-01
Recent advances in high-throughput "-omics" technologies, including genomics, proteomics, and metabolomics, bring a great challenge: integrating these data to analyze complex diseases. Glioma is an extremely aggressive and lethal form of brain tumor, so the study of the molecular mechanisms underlying glioma remains very important. To date, most studies focus on detecting differentially expressed genes in glioma; meta-analysis for pathway analysis based on multiple microarray datasets has not been systematically pursued. In this study, we therefore developed a systems biology-based approach that integrates three types of omics data to identify common pathways in glioma. First, a meta-analysis was performed to study the overlap of signatures at different levels based on microarray gene expression data of glioma. Among these gene expression datasets, 12 pathways in the GeneGO database were found to be shared by four stages. Then, microRNA expression profiles and ChIP-seq data were integrated for further pathway enrichment analysis. As a result, we suggest that 5 of these pathways could serve as putative glioma-associated pathways. Among them, the pathway of TGF-beta-dependent induction of EMT via SMAD is of particular importance. Our results demonstrate that meta-analysis at the systems biology level provides a more useful approach to study the molecular mechanisms of complex disease. The integration of different types of omics data, including gene expression microarrays, microRNA and ChIP-seq data, suggests some common pathways correlated with glioma. These findings offer potential candidates for targeted therapeutic intervention in glioma.
Xu, Ning; Zhou, Guofu; Li, Xiaojuan; Lu, Heng; Meng, Fanyun; Zhai, Huaqiang
2017-05-01
A reliable and comprehensive method for identifying the origin and assessing the quality of Epimedium has been developed. The method is based on analysis of HPLC fingerprints, combined with similarity analysis, hierarchical cluster analysis (HCA), principal component analysis (PCA) and multi-ingredient quantitative analysis. Nineteen batches of Epimedium, collected from different areas in the western regions of China, were used to establish the fingerprints and 18 peaks were selected for the analysis. Similarity analysis, HCA and PCA all classified the 19 areas into three groups. Simultaneous quantification of the five major bioactive ingredients in the Epimedium samples was also carried out to confirm the consistency of the quality tests. These methods were successfully used to identify the geographical origin of the Epimedium samples and to evaluate their quality. Copyright © 2016 John Wiley & Sons, Ltd.
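The similarity-analysis step typically scores each batch fingerprint against a reference chromatogram, commonly with the cosine (congruence) coefficient over shared peak areas; one common convention takes the mean fingerprint as the reference. A generic sketch, not tied to the paper's 18-peak data:

```python
import math

def cosine_similarity(a, b):
    """Congruence coefficient between two fingerprint vectors (peak areas)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def similarity_to_reference(fingerprints):
    """Score each batch against the mean (reference) fingerprint."""
    n = len(fingerprints)
    ref = [sum(col) / n for col in zip(*fingerprints)]
    return [cosine_similarity(fp, ref) for fp in fingerprints]
```

Batches whose peak-area patterns are proportional to the reference score near 1.0; divergent origins would score lower and separate in the subsequent HCA/PCA grouping.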
Digital microarray analysis for digital artifact genomics
NASA Astrophysics Data System (ADS)
Jaenisch, Holger; Handley, James; Williams, Deborah
2013-06-01
We implement a Spatial Voting (SV) based analogy of microarray analysis for digital gene marker identification in malware code sections. We examine a famous set of malware formally analyzed by Mandiant and code-named Advanced Persistent Threat 1 (APT1). APT1 is a Chinese organization formed with the specific intent to infiltrate and exploit US resources. Mandiant provided a detailed behavior and string analysis report for the 288 malware samples available. We performed an independent analysis using a new alternative to traditional dynamic analysis and static analysis that we call Spatial Analysis (SA). We perform unsupervised SA on the APT1 malware code sections and report our findings. We also show the results of SA performed on some members of the families associated by Mandiant. We conclude that SV-based SA is a practical, fast alternative to dynamic analysis and static analysis.
ERIC Educational Resources Information Center
Zheng, Lanqin
2016-01-01
This meta-analysis examined research on the effects of self-regulated learning scaffolds on academic performance in computer-based learning environments from 2004 to 2015. A total of 29 articles met inclusion criteria and were included in the final analysis with a total sample size of 2,648 students. Moderator analyses were performed using a…
Process for computing geometric perturbations for probabilistic analysis
Fitch, Simeon H. K. [Charlottesville, VA; Riha, David S [San Antonio, TX; Thacker, Ben H [San Antonio, TX
2012-04-10
A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.
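The perturbation idea can be sketched as applying a scaled displacement vector to each node of the region of interest; the mean-value-coordinate computation of the displacement vectors themselves is beyond this sketch, which simply shows how a perturbation variable shifts the nominal geometry:

```python
def perturb_nodes(nodes, displacements, scale):
    """Shift each (x, y) node by scale * (dx, dy).
    In a probabilistic run, scale would be a random perturbation variable;
    here it is a fixed value for illustration."""
    return [(x + scale * dx, y + scale * dy)
            for (x, y), (dx, dy) in zip(nodes, displacements)]
```

Each sampled value of the perturbation variable yields a new geometry for the finite element model, which is how geometric uncertainty enters the probabilistic analysis.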
A Classroom Research Skills Development Emphasizing Data Analysis and Result of SSRU Students by RBL
ERIC Educational Resources Information Center
Waree, Chaiwat
2017-01-01
The purpose of the study is learning using research as a base (research-based learning, RBL): to strengthen classroom research skills emphasizing data analysis and results, and to study the development of research skills in the class emphasizing data analysis and results of SSRU students through research-based learning. The target group are students in the 2nd semester…
ERIC Educational Resources Information Center
Zhu, Wenzhong; Liu, Dan
2014-01-01
Based on a review of the literature on ESP and needs analysis, this paper is intended to offer some theoretical support and inspiration for BE instructors developing BE curricula for business contexts. It discusses how the theory of needs analysis can be used in Business English curriculum design, and proposes some principles of BE curriculum…
Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Noonan, Nicholas James
2015-07-01
This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).
Wheeze sound analysis using computer-based techniques: a systematic review.
Ghulam Nabi, Fizza; Sundaraj, Kenneth; Chee Kiang, Lam; Palaniappan, Rajkumar; Sundaraj, Sebastian
2017-10-31
Wheezes are high-pitched continuous respiratory acoustic sounds produced as a result of airway obstruction. Computer-based analyses of wheeze signals have been extensively used for parametric analysis, spectral analysis, identification of airway obstruction, feature extraction and disease or pathology classification. While this area is currently an active field of research, the available literature has not yet been reviewed. This systematic review identified articles describing wheeze analyses using computer-based techniques in the SCOPUS, IEEE Xplore, ACM, PubMed, Springer and Elsevier electronic databases. After a set of selection criteria was applied, 41 articles were selected for detailed analysis. The findings reveal that 1) computerized wheeze analysis can be used to identify disease severity level or pathology, 2) further research is required to achieve acceptable rates of identification of the degree of airway obstruction with normal breathing, and 3) analysis using combinations of features and subgroups of the respiratory cycle has provided a pathway to classify various diseases or pathologies that stem from airway obstruction.
A Technical Analysis Information Fusion Approach for Stock Price Analysis and Modeling
NASA Astrophysics Data System (ADS)
Lahmiri, Salim
In this paper, we address the problem of technical analysis information fusion to improve stock market index-level prediction. We present an approach for analyzing stock market price behavior based on different categories of technical analysis metrics and a multiple predictive system. Each category of technical analysis measures is used to characterize stock market price movements. The predictive system is based on an ensemble of neural networks (NN) coupled with particle swarm intelligence for parameter optimization, where each neural network is trained with a specific category of technical analysis measures. The experimental evaluation on three international stock market indices and three individual stocks shows that the presented ensemble-based technical indicator fusion system significantly improves forecasting accuracy in comparison with a single NN. It also outperforms a classical neural network trained with index-level lagged values and an NN trained with stationary wavelet transform detail and approximation coefficients. As a result, technical information fusion in an NN ensemble architecture helps improve prediction accuracy.
Choi, Young Hae; Sertic, Sarah; Kim, Hye Kyong; Wilson, Erica G; Michopoulos, Filippos; Lefeber, Alfons W M; Erkelens, Cornelis; Prat Kricun, Sergio D; Verpoorte, Robert
2005-02-23
The metabolomic analysis of 11 Ilex species, I. argentina, I. brasiliensis, I. brevicuspis, I. dumosa var. dumosa, I. dumosa var. guaranina, I. integerrima, I. microdonta, I. paraguariensis var. paraguariensis, I. pseudobuxus, I. taubertiana, and I. theezans, was carried out by NMR spectroscopy and multivariate data analysis. Principal component analysis and classification of the (1)H NMR spectra showed a clear discrimination of the samples based on the metabolites present in the organic and aqueous fractions. The major metabolites contributing to the discrimination are arbutin, caffeine, phenylpropanoids, and theobromine. Among these metabolites, arbutin, which had not previously been reported as a constituent of Ilex species, was found to be a biomarker for I. argentina, I. brasiliensis, I. brevicuspis, I. integerrima, I. microdonta, I. pseudobuxus, I. taubertiana, and I. theezans. This reliable method, based on the determination of a large number of metabolites, makes the chemotaxonomic analysis of Ilex species possible.
NASA Technical Reports Server (NTRS)
Starnes, James H., Jr.; Newman, James C., Jr.; Harris, Charles E.; Piascik, Robert S.; Young, Richard D.; Rose, Cheryl A.
2003-01-01
Analysis methodologies for predicting fatigue-crack growth from rivet holes in panels subjected to cyclic loads and for predicting the residual strength of aluminum fuselage structures with cracks and subjected to combined internal pressure and mechanical loads are described. The fatigue-crack growth analysis methodology is based on small-crack theory and a plasticity induced crack-closure model, and the effect of a corrosive environment on crack-growth rate is included. The residual strength analysis methodology is based on the critical crack-tip-opening-angle fracture criterion that characterizes the fracture behavior of a material of interest, and a geometric and material nonlinear finite element shell analysis code that performs the structural analysis of the fuselage structure of interest. The methodologies have been verified experimentally for structures ranging from laboratory coupons to full-scale structural components. Analytical and experimental results based on these methodologies are described and compared for laboratory coupons and flat panels, small-scale pressurized shells, and full-scale curved stiffened panels. The residual strength analysis methodology is sufficiently general to include the effects of multiple-site damage on structural behavior.
Information theory applications for biological sequence analysis.
Vinga, Susana
2014-05-01
Information theory (IT) addresses the analysis of communication systems and has been widely applied in molecular biology. In particular, alignment-free sequence analysis and comparison greatly benefited from concepts derived from IT, such as entropy and mutual information. This review covers several aspects of IT applications, ranging from genome global analysis and comparison, including block-entropy estimation and resolution-free metrics based on iterative maps, to local analysis, comprising the classification of motifs, prediction of transcription factor binding sites and sequence characterization based on linguistic complexity and entropic profiles. IT has also been applied to high-level correlations that combine DNA, RNA or protein features with sequence-independent properties, such as gene mapping and phenotype analysis, and has also provided models based on communication systems theory to describe information transmission channels at the cell level and also during evolutionary processes. While not exhaustive, this review attempts to categorize existing methods and to indicate their relation with broader transversal topics such as genomic signatures, data compression and complexity, time series analysis and phylogenetic classification, providing a resource for future developments in this promising area.
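Block-entropy estimation, one of the techniques listed, has a compact direct estimator: count length-k substrings and apply the Shannon formula. This is the naive plug-in estimator; as the review's context implies, it becomes biased when k grows relative to the sequence length:

```python
import math
from collections import Counter

def block_entropy(seq, k):
    """Naive plug-in estimate of the order-k block entropy (bits):
    H_k = -sum p(w) log2 p(w) over observed length-k substrings w."""
    blocks = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    n = len(blocks)
    return -sum((c / n) * math.log2(c / n) for c in Counter(blocks).values())
```

A constant sequence has zero entropy at every order, while an alternating one carries one bit per symbol at order 1; such profiles over k underlie entropy-based genome signatures.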
NASA Astrophysics Data System (ADS)
Wang, Xiao; Gao, Feng; Dong, Junyu; Qi, Qiang
2018-04-01
Synthetic aperture radar (SAR) imagery is independent of atmospheric conditions, which makes it an ideal image source for change detection. Existing methods directly analyze all regions of the speckle-noise-contaminated difference image, so their performance is easily degraded by small noisy regions. In this paper, we propose a novel saliency-guided change detection framework based on pattern and intensity distinctiveness analysis. The saliency analysis step removes small noisy regions and therefore makes the proposed method more robust to speckle noise. In the proposed method, the log-ratio operator is first used to obtain a difference image (DI). Then, saliency detection based on pattern and intensity distinctiveness analysis is used to obtain changed-region candidates. Finally, principal component analysis and k-means clustering are employed to analyze pixels in the changed-region candidates, and the final change map is obtained by classifying these pixels as changed or unchanged. Experimental results on two real SAR image datasets demonstrate the effectiveness of the proposed method.
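Two of the stages named above, the log-ratio difference image and the final two-class clustering, can be sketched on 1-D pixel lists (the saliency and PCA stages are omitted, and all names and data here are illustrative, not the paper's implementation):

```python
import math

def log_ratio(img1, img2, eps=1e-6):
    """Log-ratio difference image: |log((x2+eps)/(x1+eps))| per pixel."""
    return [abs(math.log((b + eps) / (a + eps))) for a, b in zip(img1, img2)]

def kmeans_1d(values, iters=20):
    """Two-cluster 1-D k-means; label 1 marks the high ('changed') cluster."""
    c = [min(values), max(values)]
    for _ in range(iters):
        groups = [[], []]
        for v in values:
            groups[0 if abs(v - c[0]) <= abs(v - c[1]) else 1].append(v)
        c = [sum(g) / len(g) if g else c[i] for i, g in enumerate(groups)]
    return [0 if abs(v - c[0]) <= abs(v - c[1]) else 1 for v in values]

before = [10, 11, 10, 12, 10, 11]
after_ = [10, 12, 55, 60, 10, 11]   # pixels 2-3 brightened between acquisitions
di = log_ratio(before, after_)
print(kmeans_1d(di))  # → [0, 0, 1, 1, 0, 0]
```

Pixels whose log-ratio falls in the high cluster are flagged as changed; the full method would apply this only inside the saliency-selected candidate regions.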
Tan, Ming T; Liu, Jian-ping; Lao, Lixing
2012-08-01
Recently, the proper use of statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis methods have been widely used in acupuncture and TCM clinical trials, while between-group quantitative analysis of clinical symptom scores is commonly used in the West. The evidence for and against these analytical differences was discussed based on data from RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.
Qu, Yongzhi; He, David; Yoon, Jae; Van Hecke, Brandon; Bechhoefer, Eric; Zhu, Junda
2014-01-01
In recent years, acoustic emission (AE) sensors and AE-based techniques have been developed and tested for gearbox fault diagnosis. In general, AE-based techniques require much higher sampling rates than vibration analysis-based techniques for gearbox fault diagnosis. Therefore, it is questionable whether an AE-based technique would give a better or at least the same performance as the vibration analysis-based techniques using the same sampling rate. To answer the question, this paper presents a comparative study for gearbox tooth damage level diagnostics using AE and vibration measurements, the first known attempt to compare the gearbox fault diagnostic performance of AE- and vibration analysis-based approaches using the same sampling rate. Partial tooth cut faults are seeded in a gearbox test rig and experimentally tested in a laboratory. Results have shown that the AE-based approach has the potential to differentiate gear tooth damage levels in comparison with the vibration-based approach. While vibration signals are easily affected by mechanical resonance, the AE signals show more stable performance. PMID:24424467
DOT National Transportation Integrated Search
2009-12-22
This document presents the University of Michigan Transportation Research Institute's plan to perform analysis of data collected from the light vehicle platform field operational test of the Integrated Vehicle-Based Safety Systems (IVBSS) program.
DOT National Transportation Integrated Search
2009-11-23
This document presents the University of Michigan Transportation Research Institute's plan to perform analysis of data collected from the heavy truck platform field operational test of the Integrated Vehicle-Based Safety Systems (IVBSS) program.
A Cyber-Attack Detection Model Based on Multivariate Analyses
NASA Astrophysics Data System (ADS)
Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi
In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and cluster analysis. We quantify the observed qualitative audit event sequence via quantification method IV, and group similar audit event sequences via cluster analysis. Simulation experiments show that our model can improve cyber-attack detection accuracy in realistic cases where both normal and attack activities are intermingled.
NASA Technical Reports Server (NTRS)
Hague, D. S.; Vanderberg, J. D.; Woodbury, N. W.
1974-01-01
A method for rapidly examining the probable applicability of weight estimating formulae to a specific aerospace vehicle design is presented. The Multivariate Analysis Retrieval and Storage System (MARS) is comprised of three computer programs which sequentially operate on the weight and geometry characteristics of past aerospace vehicle designs. Weight and geometric characteristics are stored in a set of data bases which are fully computerized. Additional data bases are readily added to the MARS system, and/or the existing data bases may be easily expanded to include additional vehicles or vehicle characteristics.
Mazumdar, Debapriya; Liu, Juewen; Lu, Yi
2010-09-21
An analytical test for an analyte comprises (a) a base, having a reaction area and a visualization area, (b) a capture species, on the base in the visualization area, comprising nucleic acid, and (c) analysis chemistry reagents, on the base in the reaction area. The analysis chemistry reagents comprise (i) a substrate comprising nucleic acid and a first label, and (ii) a reactor comprising nucleic acid. The analysis chemistry reagents can react with a sample comprising the analyte and water, to produce a visualization species comprising nucleic acid and the first label, and the capture species can bind the visualization species.
A Government/Industry Summary of the Design Analysis Methods for Vibrations (DAMVIBS) Program
NASA Technical Reports Server (NTRS)
Kvaternik, Raymond G. (Compiler)
1993-01-01
The NASA Langley Research Center in 1984 initiated a rotorcraft structural dynamics program, designated DAMVIBS (Design Analysis Methods for VIBrationS), with the objective of establishing the technology base needed by the rotorcraft industry for developing an advanced finite-element-based dynamics design analysis capability for vibrations. An assessment of the program showed that the DAMVIBS Program has resulted in notable technical achievements and major changes in industrial design practice, all of which have significantly advanced the industry's capability to use and rely on finite-element-based dynamics analyses during the design process.
NASA Astrophysics Data System (ADS)
Rajshekhar, G.; Gorthi, Sai Siva; Rastogi, Pramod
2010-04-01
For phase estimation in digital holographic interferometry, a high-order instantaneous moments (HIM) based method was recently developed which relies on piecewise polynomial approximation of phase and subsequent evaluation of the polynomial coefficients using the HIM operator. A crucial step in the method is mapping the polynomial coefficient estimation to single-tone frequency determination for which various techniques exist. The paper presents a comparative analysis of the performance of the HIM operator based method in using different single-tone frequency estimation techniques for phase estimation. The analysis is supplemented by simulation results.
Paper-based analytical devices for environmental analysis.
Meredith, Nathan A; Quinn, Casey; Cate, David M; Reilly, Thomas H; Volckens, John; Henry, Charles S
2016-03-21
The field of paper-based microfluidics has experienced rapid growth over the past decade. Microfluidic paper-based analytical devices (μPADs), originally developed for point-of-care medical diagnostics in resource-limited settings, are now being applied in new areas, such as environmental analysis. Low-cost paper sensors show great promise for on-site environmental analysis; ongoing research in this area complements existing instrumental techniques by providing high spatial and temporal resolution for environmental monitoring. This review highlights recent applications of μPADs for environmental analysis along with technical advances that may enable μPADs to be more widely implemented in field testing.
NASA Astrophysics Data System (ADS)
Didenko, A. N.; Nosyrev, M. Yu.; Shevchenko, B. F.; Gilmanova, G. Z.
2017-11-01
The depth of the base of the magnetoactive layer and the geothermal gradient in the Sikhote Alin crust are estimated using a method that determines the Curie depth of magnetoactive masses by spectral analysis of the anomalous magnetic field. A detailed map of the geothermal gradient is constructed for the first time for the Sikhote Alin and adjacent areas of the Central Asian belt. Analysis of this map shows that the zones with a higher geothermal gradient geographically coincide with the areas with a higher level of seismicity.
Graph-based urban scene analysis using symbolic data
NASA Astrophysics Data System (ADS)
Moissinac, Henri; Maitre, Henri; Bloch, Isabelle
1995-07-01
A framework is presented for the interpretation of an urban landscape based on the analysis of aerial pictures. The method uses a priori knowledge provided by a geographic map to improve the image analysis stage, and a coherent final interpretation of the studied area is proposed. It relies on a graph-based data structure to model the urban landscape, and on global uncertainty management to evaluate the final confidence we can have in the presented results. This structure and uncertainty management tend to reflect the hierarchy of the available data and the interpretation levels.
Systems and methods for analyzing building operations sensor data
Mezic, Igor; Eisenhower, Bryan A.
2015-05-26
Systems and methods are disclosed for analyzing building sensor information and decomposing the information therein to a more manageable and more useful form. Certain embodiments integrate energy-based and spectral-based analysis methods with parameter sampling and uncertainty/sensitivity analysis to achieve a more comprehensive perspective of building behavior. The results of this analysis may be presented to a user via a plurality of visualizations and/or used to automatically adjust certain building operations. In certain embodiments, advanced spectral techniques, including Koopman-based operations, are employed to discern features from the collected building sensor data.
Taoka, Masato; Yamauchi, Yoshio; Nobe, Yuko; Masaki, Shunpei; Nakayama, Hiroshi; Ishikawa, Hideaki; Takahashi, Nobuhiro; Isobe, Toshiaki
2009-11-01
We describe here a mass spectrometry (MS)-based analytical platform for RNA, which combines direct nano-flow reversed-phase liquid chromatography (RPLC) on a spray tip column and a high-resolution LTQ-Orbitrap mass spectrometer. Operating RPLC at a very low flow rate with volatile solvents and MS in the negative mode, we could estimate mass values sufficiently accurate to predict the nucleotide composition of an approximately 21-nucleotide small interfering RNA, detect post-transcriptional modifications in yeast tRNA, and perform collision-induced dissociation/tandem MS-based structural analysis of nucleolytic fragments of RNA at a sub-femtomole level. Importantly, the method allowed the identification and chemical analysis of small RNAs in ribonucleoprotein (RNP) complexes, such as the pre-spliceosomal RNP complex, which was pulled down from cultured cells with a tagged protein cofactor as bait. We have recently developed a unique genome-oriented database search engine, Ariadne, which allows tandem MS-based identification of RNAs in biological samples. Thus, the method presented here has broad potential for automated analysis of RNA; it complements conventional molecular biology-based techniques and is particularly suited for simultaneous analysis of the composition, structure, interaction, and dynamics of the RNA and protein components in various cellular RNP complexes.
Finite Element Analysis of a Copper Single Crystal Shape Memory Alloy-Based Endodontic Instruments
NASA Astrophysics Data System (ADS)
Vincent, Marin; Thiebaud, Frédéric; Bel Haj Khalifa, Saifeddine; Engels-Deutsch, Marc; Ben Zineb, Tarak
2015-10-01
The aim of the present paper is the development of endodontic Cu-based single crystal Shape Memory Alloy (SMA) instruments in order to eliminate the antimicrobial and mechanical deficiencies observed with conventional Nickel-Titanium (NiTi) SMA files. A thermomechanical constitutive law, already developed and implemented in a finite element code by our research group, is adopted for the simulation of the single crystal SMA behavior. The corresponding material parameters were identified from experimental results of a tensile test at room temperature. A computer-aided design geometry was produced and used for a finite element structural analysis of the endodontic Cu-based single crystal SMA files. The files are meshed with tetrahedral continuum elements to improve the computation time and the accuracy of the results. The geometric parameters tested in this study are the length of the active blade, the rod length, the pitch, the taper, the tip diameter, and the rod diameter. For each set of adopted parameters, a finite element model is built and tested in combined bending-torsion loading in accordance with the ISO 3630-1 standard. The numerical analysis based on the finite element procedure allowed us to propose an optimal geometry suitable for Cu-based single crystal SMA endodontic files. The same analysis was carried out for the classical NiTi SMA files and a comparison was made between the two kinds of files. It showed that Cu-based single crystal SMA files are less stiff than the NiTi files. The Cu-based endodontic files could be used to improve root canal treatments. However, the finite element analysis brought out the need for further investigation based on experiments.
Wavelet-based image analysis system for soil texture analysis
NASA Astrophysics Data System (ADS)
Sun, Yun; Long, Zhiling; Jang, Ping-Rey; Plodinec, M. John
2003-05-01
Soil texture is defined as the relative proportion of clay, silt and sand found in a given soil sample. It is an important physical property of soil that affects such phenomena as plant growth and agricultural fertility. Traditional methods used to determine soil texture are either time consuming (hydrometer), or subjective and experience-demanding (field tactile evaluation). Considering that textural patterns observed at soil surfaces are uniquely associated with soil textures, we propose an innovative approach to soil texture analysis, in which wavelet frames-based features representing texture contents of soil images are extracted and categorized by applying a maximum likelihood criterion. The soil texture analysis system has been tested successfully with an accuracy of 91% in classifying soil samples into one of three general categories of soil textures. In comparison with the common methods, this wavelet-based image analysis approach is convenient, efficient, fast, and objective.
CASAS: Cancer Survival Analysis Suite, a web based application
Rupji, Manali; Zhang, Xinyan; Kowalski, Jeanne
2017-01-01
We present CASAS, a shiny R based tool for interactive survival analysis and visualization of results. The tool provides a web-based one stop shop to perform the following types of survival analysis: quantile, landmark and competing risks, in addition to standard survival analysis. The interface makes it easy to perform such survival analyses and obtain results using the interactive Kaplan-Meier and cumulative incidence plots. Univariate analysis can be performed on one or several user specified variable(s) simultaneously, the results of which are displayed in a single table that includes log rank p-values and hazard ratios along with their significance. For several quantile survival analyses from multiple cancer types, a single summary grid is constructed. The CASAS package has been implemented in R and is available via http://shinygispa.winship.emory.edu/CASAS/. The developmental repository is available at https://github.com/manalirupji/CASAS/. PMID:28928946
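The standard Kaplan-Meier estimate that underlies such interactive survival plots can be sketched from scratch (a generic illustration of the estimator, not CASAS code; CASAS itself is an R/shiny application, and the toy data below are ours):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate S(t) at each distinct event time.

    times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns a list of (time, S(t)) pairs at the event times.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)    # events at time t
        n_t = sum(1 for tt, _ in data if tt == t)  # subjects leaving at t
        if d > 0:
            s *= 1 - d / n_at_risk
            curve.append((t, s))
        n_at_risk -= n_t
        i += n_t
    return curve

times = [2, 3, 3, 5, 8]
events = [1, 1, 0, 1, 0]   # one subject censored at t=3, one at t=8
print(kaplan_meier(times, events))  # steps at t = 2, 3, 5
```

Censored subjects reduce the risk set without producing a step, which is why the curve only drops at observed event times.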
Priority of VHS Development Based in Potential Area using Principal Component Analysis
NASA Astrophysics Data System (ADS)
Meirawan, D.; Ana, A.; Saripudin, S.
2018-02-01
The current condition of VHS is still inadequate in quality, quantity and relevance. The purpose of this research is to analyze the development of VHS based on regional potential by using principal component analysis (PCA) in Bandung, Indonesia. This study used descriptive analysis of secondary data based on principal component reduction, performed with the Minitab statistical software. The results indicate that the areas with the lowest scores are the priorities for constructing VHS whose programs of majors accord with the development of regional potential. Based on the PCA scores, the main priorities for VHS development in Bandung are Saguling, which has the lowest PCA value (416.92) in area 1, Cihampelas, with the lowest PCA value in region 2, and Padalarang, with the lowest PCA value in its region.
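A generic first-principal-component scoring sketch in the spirit of the analysis above (the paper used Minitab; the two-indicator data below are hypothetical, and the function name is ours):

```python
def pca_first_component(rows, iters=200):
    """First principal component of 2-D data via power iteration on the
    2x2 sample covariance matrix (pure-Python sketch of generic PCA)."""
    n = len(rows)
    mx = sum(r[0] for r in rows) / n
    my = sum(r[1] for r in rows) / n
    centered = [(x - mx, y - my) for x, y in rows]
    sxx = sum(x * x for x, _ in centered) / (n - 1)
    syy = sum(y * y for _, y in centered) / (n - 1)
    sxy = sum(x * y for x, y in centered) / (n - 1)
    v = (1.0, 0.0)                      # power iteration for the top eigenvector
    for _ in range(iters):
        w = (sxx * v[0] + sxy * v[1], sxy * v[0] + syy * v[1])
        norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
        v = (w[0] / norm, w[1] / norm)
    scores = [x * v[0] + y * v[1] for x, y in centered]
    return v, scores

# Hypothetical two-indicator data for four areas (rows = areas).
data = [(1.0, 1.1), (2.0, 1.9), (3.0, 3.2), (4.0, 3.8)]
direction, scores = pca_first_component(data)
```

Areas would then be ranked by their first-component score, the lowest-scoring area being the development priority, analogous to how the abstract identifies priority districts.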
NASA Astrophysics Data System (ADS)
Vieira, Rodrigo Drumond; Kelly, Gregory J.
2014-11-01
In this paper, we present and apply a multi-level method for discourse analysis in science classrooms. This method is based on the structure of human activity (activity, actions, and operations) and it was applied to study a pre-service physics teacher methods course. We argue that such an approach, based on a cultural psychological perspective, affords opportunities for analysts to perform a theoretically based detailed analysis of discourse events. Along with the presentation of analysis, we show and discuss how the articulation of different levels offers interpretative criteria for analyzing instructional conversations. We synthesize the results into a model for a teacher's practice and discuss the implications and possibilities of this approach for the field of discourse analysis in science classrooms. Finally, we reflect on how the development of teachers' understanding of their activity structures can contribute to forms of progressive discourse of science education.
A Benefit-Risk Analysis Approach to Capture Regulatory Decision-Making: Multiple Myeloma.
Raju, G K; Gurumurthi, Karthik; Domike, Reuben; Kazandjian, Dickran; Landgren, Ola; Blumenthal, Gideon M; Farrell, Ann; Pazdur, Richard; Woodcock, Janet
2018-01-01
Drug regulators around the world make decisions about drug approvability based on qualitative benefit-risk analysis. In this work, a quantitative benefit-risk analysis approach captures regulatory decision-making about new drugs to treat multiple myeloma (MM). MM assessments have been based on endpoints such as time to progression (TTP), progression-free survival (PFS), and objective response rate (ORR), which differ from benefit-risk analysis based on overall survival (OS). Twenty-three FDA decisions on MM drugs submitted to the FDA between 2003 and 2016 were identified and analyzed. The benefits and risks were quantified relative to comparators (typically the control arm of the clinical trial) to estimate whether the median benefit-risk was positive or negative. A sensitivity analysis was demonstrated using ixazomib to explore the magnitude of uncertainty. FDA approval decision outcomes were consistent and logical using this benefit-risk framework. © 2017 American Society for Clinical Pharmacology and Therapeutics.
EBprot: Statistical analysis of labeling-based quantitative proteomics data.
Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon
2015-08-01
Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). The current data analysis platforms typically rely on protein-level ratios, which are obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework, EBprot, that directly models the peptide-protein hierarchy and rewards proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study showing that the peptide-level analysis of EBprot provides better receiver operating characteristics and more accurate estimation of false discovery rates than methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time-course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to existing statistical methods for DE analysis of labeling-based quantitative datasets. The software suite is freely available on the SourceForge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
General Framework for Meta-analysis of Rare Variants in Sequencing Association Studies
Lee, Seunggeun; Teslovich, Tanya M.; Boehnke, Michael; Lin, Xihong
2013-01-01
We propose a general statistical framework for meta-analysis of gene- or region-based multimarker rare variant association tests in sequencing association studies. In genome-wide association studies, single-marker meta-analysis has been widely used to increase statistical power by combining results via regression coefficients and standard errors from different studies. In analysis of rare variants in sequencing studies, region-based multimarker tests are often used to increase power. We propose meta-analysis methods for commonly used gene- or region-based rare variants tests, such as burden tests and variance component tests. Because estimation of regression coefficients of individual rare variants is often unstable or not feasible, the proposed method avoids this difficulty by calculating score statistics instead that only require fitting the null model for each study and then aggregating these score statistics across studies. Our proposed meta-analysis rare variant association tests are conducted based on study-specific summary statistics, specifically score statistics for each variant and between-variant covariance-type (linkage disequilibrium) relationship statistics for each gene or region. The proposed methods are able to incorporate different levels of heterogeneity of genetic effects across studies and are applicable to meta-analysis of multiple ancestry groups. We show that the proposed methods are essentially as powerful as joint analysis by directly pooling individual level genotype data. We conduct extensive simulations to evaluate the performance of our methods by varying levels of heterogeneity across studies, and we apply the proposed methods to meta-analysis of rare variant effects in a multicohort study of the genetics of blood lipid levels. PMID:23768515
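The core idea above, aggregating per-study score statistics rather than pooling individual-level genotypes, can be sketched for the simplest homogeneous single-burden case (a simplification of the framework: the full method also carries between-variant covariance matrices for each region, and the numbers below are illustrative):

```python
import math

def meta_score_z(scores, variances):
    """Fixed-effects (homogeneous) meta-analysis of score statistics:
        Z = (sum_j U_j) / sqrt(sum_j V_j)
    where U_j and V_j are the score statistic and its variance from
    fitting the null model in study j."""
    return sum(scores) / math.sqrt(sum(variances))

def two_sided_p(z):
    """Two-sided p-value under the standard normal."""
    return math.erfc(abs(z) / math.sqrt(2))

# Illustrative per-study score statistics and variances from three studies.
z = meta_score_z([4.0, 3.0, 5.0], [4.0, 3.0, 5.0])
print(round(z, 3))  # → 3.464, i.e. 12 / sqrt(12)
```

Because only the study-level summaries (U_j, V_j) enter the formula, the combined test matches joint analysis of pooled data under homogeneity without requiring per-variant effect estimates, which is what makes the approach stable for rare variants.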
Smitha, K A; Arun, K M; Rajesh, P G; Thomas, B; Kesavadas, C
2017-06-01
Language is a cardinal function that makes humans unique. Preservation of language function poses a great challenge for surgeons during resection. The aim of the study was to assess the efficacy of resting-state fMRI in the lateralization of language function in healthy subjects, to permit its further testing in patients who are unable to perform task-based fMRI. Eighteen healthy right-handed volunteers were prospectively evaluated with resting-state fMRI and task-based fMRI to assess language networks. The laterality indices of the Broca and Wernicke areas were calculated from task-based fMRI via a voxel-value approach. We adopted seed-based resting-state fMRI connectivity analysis together with parameters such as the amplitude of low-frequency fluctuation and the fractional amplitude of low-frequency fluctuation (fALFF). Resting-state fMRI connectivity maps for language networks were obtained from the Broca and Wernicke areas in both hemispheres. We performed correlation analysis between the laterality index and the z scores of functional connectivity, amplitude of low-frequency fluctuation, and fALFF. Pearson correlation analysis between the z score of fALFF and the laterality index yielded a correlation coefficient of 0.849 (P < .05). Regression analysis of the fALFF with the laterality index yielded an R^2 value of 0.721, indicating that 72.1% of the variance in the laterality index of task-based fMRI could be predicted from the fALFF of resting-state fMRI. The present study demonstrates that fALFF can be used as an alternative to task-based fMRI for assessing language laterality. There was a strong positive correlation between the fALFF of the Broca area on resting-state fMRI and the laterality index of task-based fMRI. Furthermore, we demonstrated the efficacy of fALFF for predicting the laterality of task-based fMRI. © 2017 by American Journal of Neuroradiology.
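The statistics reported above, the Pearson correlation coefficient and the regression R^2 (which equals r^2 when there is a single predictor), can be reproduced on hypothetical data (the values below are invented for illustration, not the study's measurements):

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical z(fALFF) values and task-fMRI laterality indices.
falff_z = [0.2, 0.5, 0.9, 1.1, 1.4]
li = [0.1, 0.4, 0.7, 0.8, 1.2]
r = pearson_r(falff_z, li)
print(round(r, 3), round(r * r, 3))  # r, and R^2 for the simple regression
```

In the simple one-predictor setting, squaring r gives the fraction of variance in the laterality index explainable from fALFF, which is how the 0.849 correlation and 0.721 R^2 in the abstract relate to each other (0.849^2 ≈ 0.721).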
Inertial navigation sensor integrated motion analysis for autonomous vehicle navigation
NASA Technical Reports Server (NTRS)
Roberts, Barry; Bhanu, Bir
1992-01-01
Recent work on INS integrated motion analysis is described. Results were obtained with a maximally passive system of obstacle detection (OD) for ground-based vehicles and rotorcraft. The OD approach involves motion analysis of imagery acquired by a passive sensor in the course of vehicle travel to generate range measurements to world points within the sensor FOV. INS data and scene analysis results are used to enhance interest point selection, the matching of the interest points, and the subsequent motion-based computations, tracking, and OD. The most important lesson learned from the research described here is that the incorporation of inertial data into the motion analysis program greatly improves the analysis and makes the process more robust.
Performance analysis of mini-propellers based on FlightGear
NASA Astrophysics Data System (ADS)
Vogeltanz, Tomáš
2016-06-01
This paper presents a performance analysis of three mini-propellers based on the FlightGear flight simulator. Although a basic propeller analysis has to be performed before the use of FlightGear, for a complex and more practical performance analysis it is advantageous to use a propeller model in cooperation with a particular aircraft model. This approach may determine whether the propeller has sufficient quality with respect to the aircraft requirements. The first section describes the software used for the analysis. Then, the parameters of the analyzed mini-propellers and the tested UAV are described. Finally, the main section shows and discusses the results of the performance analysis of the mini-propellers.
A study of microstructural characteristics and differential thermal analysis of Ni-based superalloys
NASA Technical Reports Server (NTRS)
Aggarwal, M. D.; Lal, R. B.; Oyekenu, Samuel A.; Parr, Richard; Gentz, Stephen
1989-01-01
The objective of this work is to correlate the mechanical properties of the Ni-based superalloy MAR-M246(Hf) used in the Space Shuttle Main Engine with its structural characteristics through a systematic study of optical photomicrographs and differential thermal analysis. The authors developed a method of predicting the liquidus and solidus temperatures of various nickel-based superalloys (MAR-M247, Waspaloy, Udimet-41, and polycrystalline and single crystals of CMSX-2 and CMSX-3) and compared the predictions with experimental differential thermal analysis (DTA) curves obtained using a Perkin-Elmer DTA 1700. The method of predicting these temperatures is based on the additive effect of the components dissolved in nickel. The results were compared with the experimental values.
Coarse-to-fine markerless gait analysis based on PCA and Gauss-Laguerre decomposition
NASA Astrophysics Data System (ADS)
Goffredo, Michela; Schmid, Maurizio; Conforto, Silvia; Carli, Marco; Neri, Alessandro; D'Alessio, Tommaso
2005-04-01
Human movement analysis is generally performed through the use of marker-based systems, which allow reconstructing, with high levels of accuracy, the trajectories of markers placed on specific points of the human body. Marker-based systems, however, show some drawbacks that can be overcome by the use of video systems applying markerless techniques. In this paper, a specifically designed computer vision technique for the detection and tracking of relevant body points is presented. It is based on the Gauss-Laguerre decomposition, and a principal component analysis (PCA) technique is used to circumscribe the region of interest. Results obtained on both synthetic and experimental tests show a significant reduction of the computational costs, with no significant reduction of the tracking accuracy.
Sub-pattern based multi-manifold discriminant analysis for face recognition
NASA Astrophysics Data System (ADS)
Dai, Jiangyan; Guo, Changlu; Zhou, Wei; Shi, Yanjiao; Cong, Lin; Yi, Yugen
2018-04-01
In this paper, we present a Sub-pattern based Multi-manifold Discriminant Analysis (SpMMDA) algorithm for face recognition. Unlike the existing Multi-manifold Discriminant Analysis (MMDA) approach, which is based on the holistic information of the face image, SpMMDA operates on sub-images partitioned from the original face image and then extracts the discriminative local features from the sub-images separately. Moreover, the structure information of different sub-images from the same face image is considered in the proposed method with the aim of further improving recognition performance. Extensive experiments on three standard face databases (Extended YaleB, CMU PIE and AR) demonstrate that the proposed method is effective and outperforms some other sub-pattern based face recognition methods.
Marsh, Terence L.; Saxman, Paul; Cole, James; Tiedje, James
2000-01-01
Rapid analysis of microbial communities has proven to be a difficult task. This is due, in part, to both the tremendous diversity of the microbial world and the high complexity of many microbial communities. Several techniques for community analysis have emerged over the past decade, and most take advantage of the molecular phylogeny derived from 16S rRNA comparative sequence analysis. We describe a web-based research tool located at the Ribosomal Database Project web site (http://www.cme.msu.edu/RDP/html/analyses.html) that facilitates microbial community analysis using terminal restriction fragment length polymorphism of 16S ribosomal DNA. The analysis function (designated TAP T-RFLP) permits the user to perform in silico restriction digestions of the entire 16S sequence database and derive terminal restriction fragment sizes, measured in base pairs, from the 5′ terminus of the user-specified primer to the 3′ terminus of the restriction endonuclease target site. The output can be sorted and viewed either phylogenetically or by size. It is anticipated that the site will guide experimental design as well as provide insight into interpreting results of community analysis with terminal restriction fragment length polymorphisms. PMID:10919828
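The in silico digestion at the heart of TAP T-RFLP can be sketched as follows. This is a toy illustration (a hypothetical short sequence, a 27F-like primer and the HhaI recognition site GCGC are assumptions, not the RDP implementation), and it ignores degenerate bases and reverse-strand sites:

```python
def trf_length(sequence, primer, site):
    """Terminal restriction fragment size in bp, measured from the 5'
    terminus of the primer to the 3' terminus of the first restriction
    target site downstream, as in an in silico digestion."""
    start = sequence.find(primer)
    if start == -1:
        return None  # primer does not match this sequence
    cut = sequence.find(site, start)
    if cut == -1:
        return None  # enzyme never cuts: no terminal fragment
    return cut + len(site) - start

# toy 16S-like fragment with a GCGC (HhaI) site downstream of the primer
seq = "AGAGTTTGATCCTGGCTCAGGACGAACGCGCGGCGGCGTGCTTAACACATGCAAG"
print(trf_length(seq, "AGAGTTTGATCCTGGCTCAG", "GCGC"))  # 31
```

Applying such a function to every sequence in the database and sorting the resulting lengths gives the size-ordered view the tool provides; grouping by taxonomy gives the phylogenetic view.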
[Procedural analysis of acid-base balance disorders: a case series of 4 patients].
Ma, Chunyuan; Wang, Guijie
2017-05-01
To establish a standardized process for acid-base balance analysis and to analyze cases of acid-base disorders with the aid of an acid-base balance coordinate graph. Recent research progress in acid-base balance theory was systematically reviewed, and the important concepts, definitions, formulas, parameters, regularities and inferences used in acid-base analysis were studied. The processes and steps for analyzing acid-base disorders were diagrammed, and the application of the acid-base balance coordinate graph to the cases was introduced. A "four parameters-four steps" method was put forward for the complete analysis of acid-base disorders. The "four parameters" are pH, arterial partial pressure of carbon dioxide (PaCO2), HCO3- and the anion gap (AG). The "four steps" are as follows: (1) according to pH, PaCO2 and HCO3-, the primary or main type of acid-base disorder is determined; (2) the primary or main type of disorder is used to choose the appropriate compensation formula and to determine whether a double mixed acid-base disorder is present; (3) when the primary disorder is respiratory acidosis or respiratory alkalosis, the potential HCO3- is calculated and substituted for the measured HCO3- to determine whether a triple mixed acid-base disorder is present; (4) cases judged by the above analysis to be simple increased-AG metabolic acidosis are analyzed further: the ratio of the AG increase to the HCO3- decrease is calculated to determine whether a normal-AG metabolic acidosis or a metabolic alkalosis is also present.
In clinical practice, PaCO2 (as the abscissa) and HCO3- (as the ordinate) are used to establish a rectangular coordinate system; the straight line through the origin (0, 0) and the point (40, 24) contains all points at which pH equals 7.40. The acid-base balance coordinate graph can be divided into seven areas by three straight lines [the pH = 7.40 isoline, the PaCO2 = 40 mmHg (1 mmHg = 0.133 kPa) line and the HCO3- = 24 mmol/L line]: a main respiratory alkalosis area, a main metabolic alkalosis area, a respiratory + metabolic alkalosis area, a main respiratory acidosis area, a main metabolic acidosis area, a respiratory + metabolic acidosis area and a normal area. The type of acid-base disorder is easier to determine by locating the (PaCO2, HCO3-) or (PaCO2, potential HCO3-) point on the coordinate graph. The "four parameters-four steps" method is systematic and comprehensive, and together with the acid-base balance coordinate graph it makes estimating the type of acid-base disorder simpler. It is worthy of popularization.
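Step (1) of the scheme and the AG parameter can be sketched in code. This is a simplified illustration: the acidaemia/alkalaemia cut-offs 7.35 and 7.45 and the AG formula are conventional reference values assumed here, not taken from the article, and the compensation formulas of steps (2)-(4) are omitted:

```python
def primary_disorder(ph, paco2, hco3):
    """Step 1: classify the primary acid-base disturbance from pH,
    PaCO2 (mmHg) and HCO3- (mmol/L). Reference point: (7.40, 40, 24)."""
    if ph < 7.35:  # acidaemia (conventional cut-off, assumed here)
        if paco2 > 40 and hco3 >= 24:
            return "respiratory acidosis"
        if hco3 < 24 and paco2 <= 40:
            return "metabolic acidosis"
        return "mixed respiratory + metabolic acidosis"
    if ph > 7.45:  # alkalaemia
        if paco2 < 40 and hco3 <= 24:
            return "respiratory alkalosis"
        if hco3 > 24 and paco2 >= 40:
            return "metabolic alkalosis"
        return "mixed respiratory + metabolic alkalosis"
    return "pH in reference range (normal, compensated or mixed)"

def anion_gap(na, cl, hco3):
    """AG = Na+ - (Cl- + HCO3-); reference value about 12 mmol/L."""
    return na - (cl + hco3)

print(primary_disorder(7.28, 60, 26))  # respiratory acidosis
print(anion_gap(140, 104, 24))         # 12
```

Locating the (PaCO2, HCO3-) point on the coordinate graph described above performs the same classification geometrically.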
Kovalska, M P; Bürki, E; Schoetzau, A; Orguel, S F; Orguel, S; Grieshaber, M C
2011-04-01
The distinction of real progression from test variability in visual field (VF) series may be based on clinical judgment, on trend analysis based on follow-up of test parameters over time, or on identification of a significant change related to the mean of baseline exams (event analysis). The aim of this study was to compare a new population-based method (Octopus field analysis, OFA) with classic regression analyses and clinical judgment for detecting glaucomatous VF changes. 240 VF series of 240 patients with at least 9 consecutive examinations available were included in this study. They were independently classified by two experienced investigators. The results of such a classification served as a reference for comparison for the following statistical tests: (a) t-test global, (b) r-test global, (c) regression analysis of 10 VF clusters and (d) point-wise linear regression analysis. 32.5 % of the VF series were classified as progressive by the investigators. The sensitivity and specificity were 89.7 % and 92.0 % for the r-test, and 73.1 % and 93.8 % for the t-test, respectively. In the point-wise linear regression analysis, the specificity was comparable (89.5 % versus 92 %), but the sensitivity was clearly lower than in the r-test (22.4 % versus 89.7 %) at a significance level of p = 0.01. A regression analysis for the 10 VF clusters showed a markedly higher sensitivity for the r-test (37.7 %) than the t-test (14.1 %) at a similar specificity (88.3 % versus 93.8 %) for a significant trend (p = 0.005). In regard to the cluster distribution, the paracentral clusters and the superior nasal hemifield progressed most frequently. The population-based regression analysis seems to be superior to the trend analysis in detecting VF progression in glaucoma, and may eliminate the drawbacks of the event analysis.
Further, it may assist the clinician in the evaluation of VF series and may allow better visualization of the correlation between function and structure owing to VF clusters. © Georg Thieme Verlag KG Stuttgart · New York.
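The point-wise linear regression compared in this study can be sketched for a single test location as follows (a minimal illustration assuming one sensitivity series in dB, one value per exam; the critical t of 3.499 corresponds to two-sided p = 0.01 with 9 exams, i.e. 7 degrees of freedom):

```python
import math

def pointwise_trend(series, t_crit=3.499):
    """Ordinary least-squares trend for one visual-field location.
    Returns (slope per exam, t statistic, progression flag): progression
    is flagged when the slope is negative and |t| exceeds t_crit."""
    n = len(series)
    xs = range(n)
    xbar = (n - 1) / 2.0
    ybar = sum(series) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, series))
    slope = sxy / sxx
    intercept = ybar - slope * xbar
    sse = sum((y - intercept - slope * x) ** 2 for x, y in zip(xs, series))
    se = math.sqrt(sse / (n - 2) / sxx)
    t = slope / se if se > 0 else float("inf")
    return slope, t, (slope < 0 and abs(t) >= t_crit)

# sensitivity falling 1 dB per exam over 9 consecutive exams
slope, t, progressing = pointwise_trend([30 - i for i in range(9)])
print(round(slope, 2), progressing)  # -1.0 True
```

The cluster-based variant applies the same fit to the mean sensitivity of each of the 10 VF clusters instead of each individual location.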
Handling nonnormality and variance heterogeneity for quantitative sublethal toxicity tests.
Ritz, Christian; Van der Vliet, Leana
2009-09-01
The advantages of using regression-based techniques to derive endpoints from environmental toxicity data are clear, and slowly, this superior analytical technique is gaining acceptance. As use of regression-based analysis becomes more widespread, some of the associated nuances and potential problems come into sharper focus. Looking at data sets that cover a broad spectrum of standard test species, we noticed that some model fits to data failed to meet two key assumptions-variance homogeneity and normality-that are necessary for correct statistical analysis via regression-based techniques. Failure to meet these assumptions often is caused by reduced variance at the concentrations showing severe adverse effects. Although commonly used with linear regression analysis, transformation of the response variable only is not appropriate when fitting data using nonlinear regression techniques. Through analysis of sample data sets, including Lemna minor, Eisenia andrei (terrestrial earthworm), and algae, we show that both the so-called Box-Cox transformation and use of the Poisson distribution can help to correct variance heterogeneity and nonnormality and so allow nonlinear regression analysis to be implemented. Both the Box-Cox transformation and the Poisson distribution can be readily implemented into existing protocols for statistical analysis. By correcting for nonnormality and variance heterogeneity, these two statistical tools can be used to encourage the transition to regression-based analysis and the depreciation of less-desirable and less-flexible analytical techniques, such as linear interpolation.
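A minimal sketch of the Box-Cox transformation and the usual profile-likelihood choice of its exponent (the toy responses below are illustrative, not the authors' datasets; a real analysis would apply the transform within the nonlinear regression fit):

```python
import math

def box_cox(y, lam):
    """Box-Cox power transformation of a positive response vector y."""
    if lam == 0:
        return [math.log(v) for v in y]
    return [(v ** lam - 1.0) / lam for v in y]

def profile_loglik(y, lam):
    """Profile log-likelihood (up to an additive constant) for choosing
    lambda: normality of the transformed data plus the Jacobian term."""
    z = box_cox(y, lam)
    n = len(z)
    mean = sum(z) / n
    var = sum((v - mean) ** 2 for v in z) / n
    return -n / 2.0 * math.log(var) + (lam - 1.0) * sum(math.log(v) for v in y)

# toy responses whose spread grows with the mean (heterogeneous variance)
y = [2.0, 3.5, 5.1, 9.8, 20.4, 41.0]
best = max((l / 10.0 for l in range(-20, 21)), key=lambda l: profile_loglik(y, l))
print(best)  # an exponent near 0 selects a log-like transform here
```

An exponent near 1 would leave the data essentially untransformed, while exponents near 0 compress large values, which is what corrects variance growing with the mean.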
Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain
NASA Technical Reports Server (NTRS)
Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem
2016-01-01
The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools, and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers, enables scalable computation and storage, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.
Analysis of DNA Sequences by an Optical Time-Integrating Correlator: Proposal
1991-11-01
Front matter recovered from the report (table of contents and figure/table lists, partially legible): 2.0 Current Technology; 3.0 Time-Integrating Correlator; 4.0 Representations of the DNA Bases; 5.0 DNA Analysis Strategy; 6.0 Strategy for Coarse... Figure 3: correlator output comprising (1) a correlation peak formed by the AxB term and (2) a pedestal formed by the A + B terms. Figure 4 and Table 1: short representations of the DNA bases where each base is represented by a 7-bit-long pseudorandom sequence.
Supporting Air and Space Expeditionary Forces: Analysis of Combat Support Basing Options
2004-01-01
Amouzegar, Mahyar A.; Tripp, Robert S.; McGarvey, Ronald G.; Chan, Edward W.; Roll, C. Robert, Jr. Project AIR FORCE, Supporting Air and Space Expeditionary Forces: Analysis of Combat Support Basing Options. MG-261. Includes bibliographical references. ISBN 0-8330-3675-0 (pbk.). 1. Air bases
A Comparative Analysis of User Preferences for Major Internet-Based Education Media in China
ERIC Educational Resources Information Center
Wan, Chunyang; Jiang, Yanqing
2014-01-01
Internet-based education media are developing at a remarkable rate and are seen as an upstart likely to take the place of traditional education media worldwide in the future. This paper presents the results of a comparative analysis of user preferences for four major categories of internet-based media used in China. In this paper, we first…
ERIC Educational Resources Information Center
Komatsu, Taro
2013-01-01
This paper presents a political analysis of school-based management reform in Bosnia and Herzegovina (BiH). School-based management (SBM), based on the principle of school autonomy and community participation, is a school governance system introduced in many parts of the world, including post-conflict nations. Such a phenomenon seems to follow the…
Code of Federal Regulations, 2011 CFR
2011-07-01
... (“Localized CO, PM10, and PM2.5 violations”) must be based on quantitative analysis using the applicable air... § 93.116 may be based on either: (i) Quantitative methods that represent reasonable and common... hot-spot analyses. (1) The hot-spot demonstration required by § 93.116 must be based on quantitative...
Code of Federal Regulations, 2013 CFR
2013-07-01
... (“Localized CO, PM10, and PM2.5 violations”) must be based on quantitative analysis using the applicable air... § 93.116 may be based on either: (i) Quantitative methods that represent reasonable and common... hot-spot analyses. (1) The hot-spot demonstration required by § 93.116 must be based on quantitative...
Code of Federal Regulations, 2012 CFR
2012-07-01
... (“Localized CO, PM10, and PM2.5 violations”) must be based on quantitative analysis using the applicable air... § 93.116 may be based on either: (i) Quantitative methods that represent reasonable and common... hot-spot analyses. (1) The hot-spot demonstration required by § 93.116 must be based on quantitative...
Code of Federal Regulations, 2010 CFR
2010-07-01
... (“Localized CO, PM10, and PM2.5 violations”) must be based on quantitative analysis using the applicable air... § 93.116 may be based on either: (i) Quantitative methods that represent reasonable and common... hot-spot analyses. (1) The hot-spot demonstration required by § 93.116 must be based on quantitative...
Science-based Framework for Environmental Benefits Assessment
2013-03-01
ERDC/EL TR-13-4, Environmental Benefits Analysis Program, March 2013: Science-based Framework for Environmental Benefits Assessment. Available via acwc.sdp.sirsi.net/client/default. The framework addresses evaluating ecosystem restoration benefits within the context of the USACE Civil Works planning process. An emphasis is placed on knowledge gained from
Using the DOE Knowledge Base for Special Event Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Armstrong, H.M.; Harris, J.M.; Young, C.J.
1998-10-20
The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well-suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically-derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanoes), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations.
Relevant historic events can be identified either by spatial proximity searches or through waveform correlation processing. The locations and waveforms of these events can then be made available for side-by-side comparison and processing. If synthetic modeling is thought to be warranted, a wide variety of relevant contextual information (e.g. crustal thickness and layering, seismic velocities, attenuation factors) can be retrieved and sent to the appropriate applications. Once formed, the synthetics can then be brought in for side-by-side comparison and further processing. Based on our study, we make two general recommendations. First, proper inter-process communication between sensor data analysis software and contextual data analysis software should be developed. Second, some of the Knowledge Base data sets should be prioritized or winnowed to streamline comparison with observed quantities.
UNCERTAINTY ANALYSIS OF TCE USING THE DOSE EXPOSURE ESTIMATING MODEL (DEEM) IN ACSL
The ACSL-based Dose Exposure Estimating Model (DEEM) under development by EPA is used to perform an uncertainty analysis of a physiologically based pharmacokinetic (PBPK) model of trichloroethylene (TCE). This model involves several circulating metabolites such as trichloroacet...
Nested PCR and RFLP analysis based on the 16S rRNA gene
USDA-ARS?s Scientific Manuscript database
The current phytoplasma detection and identification method is primarily based on nested PCR followed by restriction fragment length polymorphism analysis and gel electrophoresis. This method can potentially detect and differentiate all phytoplasmas, including those not previously described. The present ...
HISTORICAL ANALYSIS, A VALUABLE TOOL IN COMMUNITY-BASED ENVIRONMENTAL PROTECTION
A historical analysis of the ecological consequences of development can be a valuable tool in community-based environmental protection. These studies can engage the public in environmental issues and lead to informed decision making. Historical studies provide an understanding of...
CHAPTER 10: CURRENT TECHNICAL PROBLEMS IN EMERGY ANALYSIS
Technical problems related to the determination of the emergy base for self-organization in environmental systems are considered in this paper. The comparability of emergy analysis results depends on emergy analysts making similar choices in determining the emergy base for a part...
Analysis of Aerospike Plume Induced Base-Heating Environment
NASA Technical Reports Server (NTRS)
Wang, Ten-See
1998-01-01
Computational analysis is conducted to study the effect of an aerospike engine plume on the X-33 base-heating environment during ascent flight. To properly account for forebody and aftbody flowfield effects such as shocks, and to allow for potential plume-induced flow separation, the thermo-flowfield at selected trajectory points is computed. The computational methodology is based on a three-dimensional, finite-difference, viscous-flow, chemically reacting, pressure-based computational fluid dynamics formulation, and a three-dimensional, finite-volume, spectral-line-based weighted-sum-of-gray-gases radiation absorption model for the computational heat transfer formulation. The predicted convective and radiative base-heat fluxes are presented.
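The weighted-sum-of-gray-gases idea behind the radiation model can be sketched as follows; the three gray-gas weights and absorption coefficients below are illustrative placeholders, not the spectral-line-based fit used in the paper:

```python
import math

def wsgg_emissivity(weights, ks, pL):
    """Total emissivity as a weighted sum of gray gases: gas i has weight
    a_i and absorption coefficient k_i (1/(atm*m)); pL is the pressure
    path length (atm*m). Any leftover weight is the transparent window."""
    return sum(a * (1.0 - math.exp(-k * pL)) for a, k in zip(weights, ks))

a = [0.30, 0.25, 0.15]  # placeholder weights (temperature-dependent in practice)
k = [0.5, 5.0, 50.0]    # placeholder absorption coefficients
print(round(wsgg_emissivity(a, k, 1.0), 3))  # 0.516
```

Each gray gas saturates at its own path length, so a handful of terms approximates the strongly non-gray absorption of real combustion gases.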
A Statistical Discrimination Experiment for Eurasian Events Using a Twenty-Seven-Station Network
1980-07-08
to test the effectiveness of a multivariate method of analysis for distinguishing earthquakes from explosions. The data base for the experiment...the weight assigned to each variable whenever a new one is added. Jennrich, R. I. (1977). Stepwise discriminant analysis, in Statistical Methods for
ERIC Educational Resources Information Center
Hao, Jiangang; Smith, Lawrence; Mislevy, Robert; von Davier, Alina; Bauer, Malcolm
2016-01-01
Extracting information efficiently from game/simulation-based assessment (G/SBA) logs requires two things: a well-structured log file and a set of analysis methods. In this report, we propose a generic data model specified as an extensible markup language (XML) schema for the log files of G/SBAs. We also propose a set of analysis methods for…
Mirapeix, J; Cobo, A; González, D A; López-Higuera, J M
2007-02-19
A new plasma spectroscopy analysis technique based on the generation of synthetic spectra by means of optimization processes is presented in this paper. The technique has been developed for its application in arc-welding quality assurance. The new approach has been checked through several experimental tests, yielding results in reasonably good agreement with the ones offered by the traditional spectroscopic analysis technique.
Stochastic Game Analysis and Latency Awareness for Self-Adaptation
2014-01-01
In this paper, we introduce a formal analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to quantify the... Additional Key Words and Phrases: proactive adaptation, stochastic multiplayer games, latency. ...When planning how to adapt, self-adaptive... The contribution of this paper is twofold: (1) a novel analysis technique based on model checking of stochastic multiplayer games (SMGs) that enables us to
1996-06-01
for Software Synthesis." KBSE. IEEE, 1993. 51. Kang, Kyo C., et al. Feature-Oriented Domain Analysis (FODA) Feasibility Study. Technical Report...and usefulness in domain analysis and modeling. Rumbaugh uses three distinct views to describe a domain: (1) the object model describes structural...Gibbons describe a methodology where Structured Analysis is used to build a hierarchical system structure chart. This structure chart is then translated
Zhang, L; Price, R; Aweeka, F; Bellibas, S E; Sheiner, L B
2001-02-01
A small-scale clinical investigation was done to quantify the penetration of stavudine (D4T) into cerebrospinal fluid (CSF). A model-based analysis estimates the steady-state ratio of AUCs of CSF and plasma concentrations (R(AUC)) to be 0.270, and the mean residence time of drug in the CSF to be 7.04 h. The analysis illustrates the advantages of a causal (scientific, predictive) model-based approach to analysis over a noncausal (empirical, descriptive) approach when the data, as here, demonstrate certain problematic features commonly encountered in clinical data, namely (i) few subjects, (ii) sparse sampling, (iii) repeated measures, (iv) imbalance, and (v) individual design variation. These features generally require special attention in data analysis. The causal-model-based analysis deals with features (i) and (ii), both of which reduce efficiency, by combining data from different studies and adding subject-matter prior information. It deals with features (iii)--(v), all of which prevent 'averaging' individual data points directly, first, by adjusting in the model for interindividual data differences due to design differences, secondly, by explicitly differentiating between interpatient, interoccasion, and measurement error variation, and lastly, by defining a scientifically meaningful estimand (R(AUC)) that is independent of design.
FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses.
Desai, Trunil S; Srivastava, Shireesh
2018-01-01
13C-Metabolic flux analysis (MFA) is a powerful approach to estimate intracellular reaction rates which could be used in strain analysis and design. Processing and analysis of labeling data for calculation of fluxes and associated statistics is an essential part of MFA. However, various software currently available for data analysis employ proprietary platforms and thus limit accessibility. We developed FluxPyt, a Python-based truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using the Monte-Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models: a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model, were found to be in good agreement with those calculated by a previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of a free and open 13C-MFA software that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package.
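The Monte-Carlo estimation of flux standard deviations can be sketched generically as below. The `fit_flux` stand-in is hypothetical (a plain average); FluxPyt itself refits the elementary-metabolite-unit flux model to each perturbed data set:

```python
import random
import statistics

def fit_flux(measurements):
    """Hypothetical stand-in for the least-squares flux fit: here just
    the mean of replicate labeling measurements."""
    return sum(measurements) / len(measurements)

def monte_carlo_sd(measurements, sigma, n_iter=2000, seed=1):
    """Estimate the standard deviation of a fitted flux by repeatedly
    perturbing the measurements with Gaussian noise and refitting."""
    rng = random.Random(seed)
    fits = [
        fit_flux([m + rng.gauss(0.0, sigma) for m in measurements])
        for _ in range(n_iter)
    ]
    return statistics.stdev(fits)

sd = monte_carlo_sd([0.42, 0.40, 0.44, 0.41], sigma=0.02)
print(round(sd, 3))  # close to 0.02 / sqrt(4) = 0.01
```

Because the refit is repeated on resampled data rather than linearized, this approach captures nonlinear error propagation that analytic covariance estimates can miss.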
[Value-based medicine in ophthalmology].
Hirneiss, C; Neubauer, A S; Tribus, C; Kampik, A
2006-06-01
Value-based medicine (VBM) unifies costs and patient-perceived value (improvement in quality of life, length of life, or both) of an intervention. Value-based ophthalmology is of increasing importance for decisions in eye care. The methods of VBM are explained and definitions for a specific terminology in this field are given. The cost-utility analysis as part of health care economic analyses is explained. VBM exceeds evidence-based medicine by incorporating parameters of cost and benefits from an ophthalmological intervention. The benefit of the intervention is defined as an increase or maintenance of visual quality of life and can be determined by utility analysis. The time trade-off method is valid and reliable for utility analysis. The resources expended for the value gained in VBM are measured with cost-utility analysis in terms of cost per quality-adjusted life years gained (euros/QALY). Numerous cost-utility analyses of different ophthalmological interventions have been published. The fundamental instrument of VBM is cost-utility analysis. The results in cost per QALY allow estimation of cost effectiveness of an ophthalmological intervention. Using the time trade-off method for utility analysis allows the comparison of ophthalmological cost-utility analyses with those of other medical interventions. VBM is important for individual medical decision making and for general health care.
Analysis of a Rocket Based Combined Cycle Engine during Rocket Only Operation
NASA Technical Reports Server (NTRS)
Smith, T. D.; Steffen, C. J., Jr.; Yungster, S.; Keller, D. J.
1998-01-01
The all-rocket mode of operation is a critical factor in the overall performance of a rocket based combined cycle (RBCC) vehicle. However, short of performing experiments or a full three-dimensional analysis, there are no first-order parametric models to estimate performance. As a result, an axisymmetric RBCC engine was used to analytically determine specific impulse efficiency values based upon both full-flow and gas-generator configurations. Design-of-experiments methodology was used to construct a test matrix, and statistical regression analysis was used to build parametric models. The main parameters investigated in this study were: rocket chamber pressure, rocket exit area ratio, percent of injected secondary flow, mixer-ejector inlet area, mixer-ejector area ratio, and mixer-ejector length-to-inject diameter ratio. A perfect-gas computational fluid dynamics analysis was performed to obtain values of vacuum specific impulse. Statistical regression analysis was performed based on both full-flow and gas-generator engine cycles. Results were also found to be dependent upon the entire cycle assumptions. The statistical regression analysis determined that there were five significant linear effects, six interactions, and one second-order effect. Two parametric models were created to provide performance assessments of an RBCC engine in the all-rocket mode of operation.
Li, Daojin; Yin, Danyang; Chen, Yang; Liu, Zhen
2017-05-19
Protein phosphorylation is a major post-translational modification, which plays a vital role in cellular signaling of numerous biological processes. Mass spectrometry (MS) has been an essential tool for the analysis of protein phosphorylation, in which the selective enrichment of phosphopeptides from complex biological samples is a key step. In this study, a metal-organic framework (MOF)-based monolithic capillary has been successfully prepared as an effective sorbent for the selective enrichment of phosphopeptides and has been coupled off-line with matrix-assisted laser desorption ionization-time-of-flight mass spectrometry (MALDI-TOF MS) for efficient analysis of phosphopeptides. Using β-casein as a representative phosphoprotein, efficient phosphorylation analysis by this off-line platform was verified. Phosphorylation analysis of a nonfat milk sample was also demonstrated. By introducing the large surface areas and highly ordered pores of MOFs into a monolithic column, the MOF-based monolithic capillary exhibited several significant advantages, such as excellent selectivity toward phosphopeptides, superb tolerance to interference, and a simple operating procedure. Because of these highly desirable properties, the MOF-based monolithic capillary could be a useful tool for protein phosphorylation analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
The Statistical Value of Raw Fluorescence Signal in Luminex xMAP Based Multiplex Immunoassays
Breen, Edmond J.; Tan, Woei; Khan, Alamgir
2016-01-01
Tissue samples (plasma, saliva, serum or urine) from 169 patients classified as either normal or having one of seven possible diseases were analysed across three 96-well plates for the presence of 37 analytes using cytokine inflammation multiplexed immunoassay panels. Censoring of concentration data caused problems for analysis of the low-abundance analytes; using fluorescence analysis rather than concentration-based analysis allowed these analytes to be analysed. Mixed-effects analysis of the resulting fluorescence and concentration responses reveals that the combination of censoring and mapping fluorescence responses to concentration values through a 5PL curve changed the observed analyte concentrations. Simulation verifies this by showing that observed analyte concentration levels depend on the mean fluorescence response and its distribution. Departures from normality in the fluorescence responses can lead to differences in concentration estimates and unreliable probabilities for treatment effects. When fluorescence responses are normally distributed, fluorescence-based t-tests have greater statistical power than the corresponding concentration-based t-tests. We add evidence that the fluorescence response, unlike concentration values, does not require censoring, and we show, with respect to differential analysis of the fluorescence responses, that background correction is not required. PMID:27243383
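The 5PL standard-curve mapping discussed above can be sketched as follows (the parameter values are illustrative; near the asymptotes the inverse is undefined, which is exactly where censored concentrations arise):

```python
def five_pl(x, a, b, c, d, g):
    """5PL curve: predicted response at concentration x, where a is the
    zero-dose response, d the infinite-dose response, c a location (EC50-
    like) parameter, b the slope and g the asymmetry."""
    return d + (a - d) / (1.0 + (x / c) ** b) ** g

def inverse_five_pl(y, a, b, c, d, g):
    """Back-map an observed response to concentration; only defined for
    responses strictly between the asymptotes a and d."""
    return c * (((a - d) / (y - d)) ** (1.0 / g) - 1.0) ** (1.0 / b)

params = dict(a=50.0, b=1.2, c=200.0, d=30000.0, g=0.8)
y = five_pl(150.0, **params)
print(round(inverse_five_pl(y, **params), 6))  # 150.0
```

A fluorescence reading at or beyond an asymptote has no finite concentration under this inverse, which is the mechanism behind the censoring the study describes.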
Penning, Holger; Elsner, Martin
2007-11-01
Potentially, compound-specific isotope analysis may provide unique information on source and fate of pesticides in natural systems. Yet for isotope analysis, LC-based methods that are based on the use of organic solvents often cannot be used and GC-based analysis is frequently not possible due to thermolability of the analyte. A typical example of a compound with such properties is isoproturon (3-(4-isopropylphenyl)-1,1-dimethylurea), belonging to the worldwide extensively used phenylurea herbicides. To make isoproturon accessible to carbon and nitrogen isotope analysis, we developed a GC-based method during which isoproturon was quantitatively fragmented to dimethylamine and 4-isopropylphenylisocyanate. Fragmentation occurred only partially in the injector but was mainly achieved on a heated capillary column. The fragments were then chromatographically separated and individually measured by isotope ratio mass spectrometry. The reliability of the method was tested in hydrolysis experiments with three isotopically different batches of isoproturon. For all three products, the same isotope fractionation factors were observed during conversion and the difference in isotope composition between the batches was preserved. This study demonstrates that fragmentation of phenylurea herbicides does not only make them accessible to isotope analysis but even enables determination of intramolecular isotope fractionation.
Zhang, Jing; Zhang, Lu; Zhang, Yan; Yang, Jing; Guo, Mengbiao; Sun, Liangdan; Pan, Hai-Feng; Hirankarn, Nattiya; Ying, Dingge; Zeng, Shuai; Lee, Tsz Leung; Lau, Chak Sing; Chan, Tak Mao; Leung, Alexander Moon Ho; Mok, Chi Chiu; Wong, Sik Nin; Lee, Ka Wing; Ho, Marco Hok Kung; Lee, Pamela Pui Wah; Chung, Brian Hon-Yin; Chong, Chun Yin; Wong, Raymond Woon Sing; Mok, Mo Yin; Wong, Wilfred Hing Sang; Tong, Kwok Lung; Tse, Niko Kei Chiu; Li, Xiang-Pei; Avihingsanon, Yingyos; Rianthavorn, Pornpimol; Deekajorndej, Thavatchai; Suphapeetiporn, Kanya; Shotelersuk, Vorasuk; Ying, Shirley King Yee; Fung, Samuel Ka Shun; Lai, Wai Ming; Garcia-Barceló, Maria-Mercè; Cherny, Stacey S; Sham, Pak Chung; Cui, Yong; Yang, Sen; Ye, Dong Qing; Zhang, Xue-Jun; Lau, Yu Lung; Yang, Wanling
2015-11-01
Previous genome-wide association studies (GWAS), which were mainly based on single-variant analysis, have identified many systemic lupus erythematosus (SLE) susceptibility loci. However, the genetic architecture of this complex disease is far from being understood. The aim of this study was to investigate whether using a gene-based analysis may help to identify novel loci, by considering global evidence of association from a gene or a genomic region rather than focusing on evidence for individual variants. Based on the results of a meta-analysis of 2 GWAS of SLE conducted in 2 Asian cohorts, we performed an in-depth gene-based analysis followed by replication in a total of 4,626 patients and 7,466 control subjects of Asian ancestry. Differential allelic expression was measured by pyrosequencing. More than one-half of the reported SLE susceptibility loci showed evidence of independent effects, and this finding is important for understanding the mechanisms of association and explaining disease heritability. ANXA6 was detected as a novel SLE susceptibility gene, with several single-nucleotide polymorphisms (SNPs) contributing independently to the association with disease. The risk allele of rs11960458 correlated significantly with increased expression of ANXA6 in peripheral blood mononuclear cells from heterozygous healthy control subjects. Several other associated SNPs may also regulate ANXA6 expression, according to data obtained from public databases. Higher expression of ANXA6 in patients with SLE was also reported previously. Our study demonstrated the merit of using gene-based analysis to identify novel susceptibility loci, especially those with independent effects, and also demonstrated the widespread presence of loci with independent effects in SLE susceptibility genes. © 2015, American College of Rheumatology.
A value-based medicine cost-utility analysis of idiopathic epiretinal membrane surgery.
Gupta, Omesh P; Brown, Gary C; Brown, Melissa M
2008-05-01
To perform a reference case, cost-utility analysis of epiretinal membrane (ERM) surgery using current literature on outcomes and complications. Computer-based, value-based medicine analysis. Decision analyses were performed under two scenarios: ERM surgery in better-seeing eye and ERM surgery in worse-seeing eye. The models applied long-term published data primarily from the Blue Mountains Eye Study and the Beaver Dam Eye Study. Visual acuity and major complications were derived from 25-gauge pars plana vitrectomy studies. Patient-based, time trade-off utility values, Markov modeling, sensitivity analysis, and net present value adjustments were used in the design and calculation of results. Main outcome measures included the number of discounted quality-adjusted-life-years (QALYs) gained and dollars spent per QALY gained. ERM surgery in the better-seeing eye compared with observation resulted in a mean gain of 0.755 discounted QALYs (3% annual rate) per patient treated. This model resulted in $4,680 per QALY for this procedure. When sensitivity analysis was performed, utility values varied from $6,245 to $3,746/QALY gained, medical costs varied from $3,510 to $5,850/QALY gained, and ERM recurrence rate increased to $5,524/QALY. ERM surgery in the worse-seeing eye compared with observation resulted in a mean gain of 0.27 discounted QALYs per patient treated. The $/QALY was $16,146 with a range of $20,183 to $12,110 based on sensitivity analyses. Utility values ranged from $21,520 to $12,916/QALY and ERM recurrence rate increased to $16,846/QALY based on sensitivity analysis. ERM surgery is a very cost-effective procedure when compared with other interventions across medical subspecialties.
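The discounting of QALYs at a 3% annual rate used above can be sketched as follows; the utility gain, horizon, and cost figures below are illustrative placeholders, not values from the study:

```python
def discounted_qalys(annual_utility_gain, years, rate=0.03):
    """Net present value of a constant yearly utility gain, discounted annually."""
    return sum(annual_utility_gain / (1.0 + rate) ** t for t in range(1, years + 1))

def cost_per_qaly(total_cost, qalys_gained):
    """Cost-utility ratio in dollars per QALY gained."""
    return total_cost / qalys_gained

# e.g. a 0.1 utility gain sustained for 10 years, discounted at 3%/year
npv_qalys = discounted_qalys(0.1, 10)          # about 0.853 discounted QALYs
ratio = cost_per_qaly(1000.0, 0.5)             # $2,000 per QALY
```

As a sanity check against the abstract's own arithmetic, a gain of 0.755 discounted QALYs at $4,680/QALY implies a total discounted cost of roughly $3,500.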
A catalog of automated analysis methods for enterprise models.
Florez, Hector; Sánchez, Mario; Villalobos, Jorge
2016-01-01
Enterprise models are created for documenting and communicating the structure and state of the business and information technology elements of an enterprise. Once models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and because of the size and complexity of the models, the process can be complicated and omissions or miscalculations are very likely. This situation has fostered research into automated analysis methods to support analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels, so some analysis methods may not be applicable to all enterprise models. This paper presents a compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.
When is hub gene selection better than standard meta-analysis?
Langfelder, Peter; Mischel, Paul S; Horvath, Steve
2013-01-01
Since hub nodes have been found to play important roles in many networks, highly connected hub genes are expected to play an important role in biology as well. However, the empirical evidence remains ambiguous. An open question is whether (or when) hub gene selection leads to more meaningful gene lists than a standard statistical analysis based on significance testing when analyzing genomic data sets (e.g., gene expression or DNA methylation data). Here we address this question for the special case when multiple genomic data sets are available. This is of great practical importance, since for many research questions multiple data sets are publicly available. In this case, the data analyst can decide between a standard statistical approach (e.g., based on meta-analysis) and a co-expression network analysis approach that selects intramodular hubs in consensus modules. We assess the performance of these two types of approaches according to two criteria. The first criterion evaluates the biological insights gained and is relevant in basic research. The second criterion evaluates the validation success (reproducibility) in independent data sets and often applies in clinical diagnostic or prognostic applications. We compare meta-analysis with consensus network analysis based on weighted correlation network analysis (WGCNA) in three comprehensive and unbiased empirical studies: (1) finding genes predictive of lung cancer survival, (2) finding methylation markers related to age, and (3) finding mouse genes related to total cholesterol. The results demonstrate that intramodular hub gene status with respect to consensus modules is more useful than a meta-analysis p-value when identifying biologically meaningful gene lists (reflecting criterion 1). However, standard meta-analysis methods perform as well as (if not better than) a consensus network approach in terms of validation success (criterion 2).
The article also reports a comparison of meta-analysis techniques applied to gene expression data and presents novel R functions for carrying out consensus network analysis, network-based screening, and meta-analysis.
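One standard meta-analysis building block of the kind compared above is combining per-study p-values; a minimal sketch using Fisher's method (an illustrative choice — the paper does not prescribe this specific combiner):

```python
import math

def fisher_combined_p(pvals):
    """Fisher's method: X = -2*sum(ln p_i) ~ chi-square with 2k df under H0.
    The chi-square survival function has a closed form for even df,
    so no statistics library is needed."""
    k = len(pvals)
    half = -sum(math.log(p) for p in pvals)   # X / 2
    return math.exp(-half) * sum(half ** i / math.factorial(i) for i in range(k))

# two independent studies each reporting p = 0.05 combine to p ~ 0.017
combined = fisher_combined_p([0.05, 0.05])
```

A gene could then be ranked by its combined p-value across data sets, the quantity the abstract contrasts with intramodular hub status.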
Gait Analysis Using Wearable Sensors
Tao, Weijun; Liu, Tao; Zheng, Rencheng; Feng, Hutian
2012-01-01
Gait analysis using wearable sensors is an inexpensive, convenient, and efficient manner of providing useful information for multiple health-related applications. As a clinical tool applied in the rehabilitation and diagnosis of medical conditions and sport activities, gait analysis using wearable sensors shows great prospects. The current paper reviews available wearable sensors and ambulatory gait analysis methods based on these sensors. After an introduction to the gait phases, the principles and features of wearable sensors used in gait analysis are provided. Gait analysis methods based on wearable sensors are divided into gait kinematics, gait kinetics, and electromyography. Studies on the current methods are reviewed, and applications in sports, rehabilitation, and clinical diagnosis are summarized separately. With the development of sensor technology and analysis methods, gait analysis using wearable sensors is expected to play an increasingly important role in clinical applications. PMID:22438763
Arcaya, Mariana; Reardon, Timothy; Vogel, Joshua; Andrews, Bonnie K; Li, Wenjun; Land, Thomas
2014-02-13
Community-based approaches to preventing chronic diseases are attractive because of their broad reach and low costs, and as such, are integral components of health care reform efforts. Implementing community-based initiatives across Massachusetts' municipalities presents both programmatic and evaluation challenges. For effective delivery and evaluation of the interventions, establishing a community typology that groups similar municipalities provides a balanced and cost-effective approach. Through a series of key informant interviews and exploratory data analysis, we identified 55 municipal-level indicators of 6 domains for the typology analysis. The domains were health behaviors and health outcomes, housing and land use, transportation, retail environment, socioeconomics, and demographic composition. A latent class analysis was used to identify 10 groups of municipalities based on similar patterns of municipal-level indicators across the domains. Our model with 10 latent classes yielded excellent classification certainty (relative entropy = .995, minimum class probability for any class = .871), and differentiated distinct groups of municipalities based on health-relevant needs and resources. The classes differentiated healthy and racially and ethnically diverse urban areas from cities with similar population densities and diversity but worse health outcomes, affluent communities from lower-income rural communities, and mature suburban areas from rapidly suburbanizing communities with different healthy-living challenges. Latent class analysis is a tool that may aid in the planning, communication, and evaluation of community-based wellness initiatives such as Community Transformation Grants projects administrated by the Centers for Disease Control and Prevention.
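The relative entropy reported above (0.995) is a standard measure of classification certainty for latent class models; a minimal sketch of its usual definition, assuming a matrix of posterior class-membership probabilities (one row per municipality, one column per class):

```python
import numpy as np

def relative_entropy(post):
    """Entropy-based classification certainty of a latent class model.
    post: (N, K) matrix of posterior class-membership probabilities.
    Returns 1.0 for perfectly crisp assignment, 0.0 for total uncertainty."""
    post = np.asarray(post, float)
    n, k = post.shape
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(post > 0, post * np.log(post), 0.0)
    # entropy is -terms.sum(); normalize by the maximum possible entropy n*ln(k)
    return 1.0 + terms.sum() / (n * np.log(k))

crisp = relative_entropy(np.eye(3))            # -> 1.0
vague = relative_entropy(np.full((4, 2), 0.5)) # -> 0.0
```

Values near 1, like the 0.995 reported, indicate that municipalities are assigned to their latent classes with little ambiguity.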
Classification of cassava genotypes based on qualitative and quantitative data.
Oliveira, E J; Oliveira Filho, O S; Santos, V S
2015-02-02
We evaluated the genetic variation of cassava accessions based on qualitative (binomial and multicategorical) and quantitative (continuous) traits. We characterized 95 accessions obtained from the Cassava Germplasm Bank of Embrapa Mandioca e Fruticultura, evaluating them for 13 continuous, 10 binary, and 25 multicategorical traits. First, we analyzed the accessions based only on quantitative traits; next, we conducted a joint analysis (qualitative and quantitative traits) based on the Ward-MLM method, which performs clustering in two stages. According to the pseudo-F, pseudo-t2, and maximum likelihood criteria, we identified five and four groups based on the quantitative-trait and joint analyses, respectively. The smaller number of groups identified in the joint analysis may be related to the nature of the data. Quantitative data are more subject to environmental effects on phenotype expression, which can contribute to greater differentiation among accessions even in the absence of genetic differences. For most of the accessions, the maximum probability of classification was >0.90, independent of the trait analyzed, indicating a good fit of the clustering method. Differences in clustering according to the type of data imply that analyses of quantitative and qualitative traits in cassava germplasm might explore different genomic regions. On the other hand, when joint analysis was used, the means and ranges of the genetic distances were high, indicating that the Ward-MLM method is very useful for clustering genotypes when several phenotypic traits are available, as in the case of genetic resources and breeding programs.
Watson, Nathanial E; Parsons, Brendon A; Synovec, Robert E
2016-08-12
Performance of tile-based Fisher Ratio (F-ratio) data analysis, recently developed for discovery-based studies using comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC×GC-TOFMS), is evaluated with a metabolomics dataset that had been previously analyzed in great detail, but while taking a brute force approach. The previously analyzed data (referred to herein as the benchmark dataset) were intracellular extracts from Saccharomyces cerevisiae (yeast), either metabolizing glucose (repressed) or ethanol (derepressed), which define the two classes in the discovery-based analysis to find metabolites that are statistically different in concentration between the two classes. Beneficially, this previously analyzed dataset provides a concrete means to validate the tile-based F-ratio software. Herein, we demonstrate and validate the significant benefits of applying tile-based F-ratio analysis. The yeast metabolomics data are analyzed more rapidly in about one week versus one year for the prior studies with this dataset. Furthermore, a null distribution analysis is implemented to statistically determine an adequate F-ratio threshold, whereby the variables with F-ratio values below the threshold can be ignored as not class distinguishing, which provides the analyst with confidence when analyzing the hit table. Forty-six of the fifty-four benchmarked changing metabolites were discovered by the new methodology while consistently excluding all but one of the benchmarked nineteen false positive metabolites previously identified. Copyright © 2016 Elsevier B.V. All rights reserved.
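A minimal sketch of the two ingredients described above — a per-variable F-ratio between the two classes, and a permutation-based null distribution for choosing a threshold. This is deliberately simplified to a single variable; the actual software operates on tiles of GC×GC-TOFMS data:

```python
import numpy as np

def f_ratio(class_a, class_b):
    """Between-class to within-class variance ratio for one variable, two classes."""
    a, b = np.asarray(class_a, float), np.asarray(class_b, float)
    grand = np.concatenate([a, b]).mean()
    ss_between = len(a) * (a.mean() - grand) ** 2 + len(b) * (b.mean() - grand) ** 2
    ss_within = ((a - a.mean()) ** 2).sum() + ((b - b.mean()) ** 2).sum()
    return (ss_between / 1.0) / (ss_within / (len(a) + len(b) - 2))

def null_f_threshold(a, b, n_perm=1000, q=0.95, seed=0):
    """Permute class labels to build a null F distribution; return its q-quantile.
    Variables with F-ratios below this threshold are treated as not
    class-distinguishing."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([a, b])
    nulls = [f_ratio(*np.split(rng.permutation(pooled), [len(a)]))
             for _ in range(n_perm)]
    return float(np.quantile(nulls, q))
```

Well-separated classes give a large F-ratio; identical classes give zero, so a null-derived threshold cleanly filters the hit table.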
Design of microcontroller-based EMG and the analysis of EMG signals.
Güler, Nihal Fatma; Hardalaç, Firat
2002-04-01
In this work, a microcontroller-based EMG system was designed and tested on 40 patients. With the patients at rest, fast Fourier transform (FFT) analysis was applied to EMG signals recorded from the right-leg peroneal region. Histograms were constructed from the results of the FFT analysis. The analysis results show that for 30 patients the amplitude of the fibrillation potentials of the muscle fibers measured from the peroneal region is low and their duration is short, indicating degeneration of the motor nerves; the remaining 10 patients were found to be healthy.
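The FFT step described above can be sketched on a synthetic surrogate signal; the sampling rate and the 50 Hz component are illustrative assumptions, not the study's recordings:

```python
import numpy as np

fs = 1000.0                         # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)
# synthetic surrogate for an EMG trace: one 50 Hz component plus noise
emg = 0.8 * np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(emg))            # amplitude spectrum
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)    # frequency axis, 0..500 Hz
dominant = freqs[np.argmax(spectrum[1:]) + 1]  # peak frequency, skipping DC
```

Histograms of such spectra (amplitude per frequency bin) are what the study compares across patients to judge fibrillation-potential amplitude and duration.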
Asymmetric transmission and reflection spectra of FBG in single-multi-single mode fiber structure.
Chai, Quan; Liu, Yanlei; Zhang, Jianzhong; Yang, Jun; Chen, Yujin; Yuan, Libo; Peng, Gang-Ding
2015-05-04
We give a comprehensive theoretical analysis and simulation of an FBG in a single-multi-single mode fiber structure (FBG-in-SMS), based on coupled mode analysis and mode interference analysis. This enables us to explain the experimental observations: asymmetric transmission and reflection spectra with similar temperature responses near the spectral range of the Bragg wavelength. The transmission spectrum shift during the FBG writing process is observed and discussed. The analysis results are useful in the design of SMS-structure-based sensors and filters.
Identification and human condition analysis based on the human voice analysis
NASA Astrophysics Data System (ADS)
Mieshkov, Oleksandr Yu.; Novikov, Oleksandr O.; Novikov, Vsevolod O.; Fainzilberg, Leonid S.; Kotyra, Andrzej; Smailova, Saule; Kozbekova, Ainur; Imanbek, Baglan
2017-08-01
The paper presents a two-stage biotechnical system for human condition analysis based on analysis of the human voice signal. First, the voice signal is pre-processed and its time-domain characteristics are determined. At the first stage, the developed system identifies the person in the database on the basis of the extracted characteristics. At the second stage, a model of the human voice is built on the basis of real voice signals after clustering the whole database.
Vibrations Detection in Industrial Pumps Based on Spectral Analysis to Increase Their Efficiency
NASA Astrophysics Data System (ADS)
Rachid, Belhadef; Hafaifa, Ahmed; Boumehraz, Mohamed
2016-03-01
Spectral analysis is the key tool for the study of vibration signals in rotating machinery. In this work, vibration analysis applied to the conditional preventive maintenance of such machines is proposed, addressing problems related to the detection of vibrations in the components of these machines. The vibration signal of a centrifugal pump was processed to demonstrate the benefits of the proposed approach. The obtained results compare the estimation of the pump vibration signal using the Fourier transform technique with spectral analysis methods based on the Prony approach.
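A minimal sketch of a Prony-style frequency estimate via linear prediction, the alternative to plain Fourier analysis mentioned above; the model order and the single-tone test signal are illustrative assumptions:

```python
import numpy as np

def prony_freqs(x, p, fs):
    """Estimate p modal frequencies (Hz) by Prony linear prediction:
    fit x[n] = -(a1*x[n-1] + ... + ap*x[n-p]) in the least-squares sense,
    then read frequencies off the angles of the prediction-polynomial roots."""
    x = np.asarray(x, float)
    n = len(x)
    A = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
    a, *_ = np.linalg.lstsq(A, -x[p:n], rcond=None)
    roots = np.roots(np.concatenate(([1.0], a)))
    return np.sort(np.abs(np.angle(roots))) * fs / (2 * np.pi)

fs = 1000.0
t = np.arange(1000) / fs
tone = np.cos(2 * np.pi * 50 * t)   # illustrative single-mode vibration signal
```

Unlike the FFT, the Prony approach also recovers damping (from the root magnitudes), which is why it is attractive for vibration diagnostics on short records.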
Gautam, Arvind; Callejas, Miguel A; Acharyya, Amit; Acharyya, Swati Ghosh
2018-05-01
This study introduced a shape memory alloy (SMA)-based smart knee spacer for total knee arthroplasty (TKA). A 3D CAD model of a smart tibial component for TKA was designed in SolidWorks and verified using finite element analysis (FEA) in ANSYS Workbench. The two major properties of the SMA (NiTi), pseudoelasticity (PE) and the shape memory effect (SME), were exploited, modelled, and analysed for the TKA application. The effectiveness of the proposed model was verified in ANSYS Workbench through FEA of the maximum deformation and the equivalent (von Mises) stress distribution. The proposed model was also compared with a polymethylmethacrylate (PMMA)-based spacer for the upper portion of the tibial component for three subjects with body mass indices (BMI) of 23.88, 31.09, and 38.39. The proposed SMA-based smart knee spacer exhibited 96.66978% less deformation (standard deviation 0.01738) than its PMMA-based counterpart for the same load and flexion angle. Based on the maximum deformation analysis, the PMMA-based spacer had 30 times more permanent deformation than the proposed SMA-based spacer for the same load and flexion angle. The SME property of the lower portion of the tibial component, used for fixation of the spacer in position, was verified by FEA in ANSYS, in which a strain-life-based fatigue analysis was performed and tested for the PE- and SME-built spacers. The SMA-based smart knee spacer therefore eliminates the drawbacks of the PMMA-based spacer, including spacer fracture, loosening, dislocation, tilting or translation, and knee subluxation. Copyright © 2018. Published by Elsevier Ltd.
Ground Vibration Test Planning and Pre-Test Analysis for the X-33 Vehicle
NASA Technical Reports Server (NTRS)
Bedrossian, Herand; Tinker, Michael L.; Hidalgo, Homero
2000-01-01
This paper describes the results of the modal test planning and the pre-test analysis for the X-33 vehicle. The pre-test analysis included the selection of the target modes, selection of the sensor and shaker locations, and the development of an accurate Test Analysis Model (TAM). For target mode selection, four techniques were considered: one based on the Modal Cost technique, one based on the Balanced Singular Value technique, one known as the Root Sum Squared (RSS) method, and a Modal Kinetic Energy (MKE) approach. For selecting sensor locations, four techniques were also considered: one based on Weighted Average Kinetic Energy (WAKE), one based on Guyan Reduction (GR), one emphasizing engineering judgment, and one based on optimum sensor selection using a Genetic Algorithm (GA) search combined with a criterion based on Hankel Singular Values (HSVs). For selecting shaker locations, four techniques were likewise considered: one based on the Weighted Average Driving Point Residue (WADPR), one based on engineering judgment and accessibility considerations, a frequency response method, and optimum shaker location selection based on a GA search combined with an HSV-based criterion. To evaluate the effectiveness of the proposed sensor and shaker locations for exciting the target modes, extensive numerical simulations were performed. The Multivariate Mode Indicator Function (MMIF) was used to evaluate the effectiveness of each sensor and shaker set with respect to modal parameter identification. Several TAM reduction techniques were considered, including Guyan, IRS, Modal, and Hybrid. Based on pre-test cross-orthogonality checks using the various reduction techniques, a Hybrid TAM reduction technique was selected and used for all three vehicle fuel-level configurations.
NASA Technical Reports Server (NTRS)
1979-01-01
Information to identify viable coal gasification and utilization technologies is presented. Analysis capabilities required to support design and implementation of coal based synthetic fuels complexes are identified. The potential market in the Southeast United States for coal based synthetic fuels is investigated. A requirements analysis to identify the types of modeling and analysis capabilities required to conduct and monitor coal gasification project designs is discussed. Models and methodologies to satisfy these requirements are identified and evaluated, and recommendations are developed. Requirements for development of technology and data needed to improve gasification feasibility and economies are examined.
The impact of chimerism in DNA-based forensic sex determination analysis.
George, Renjith; Donald, Preethy Mary; Nagraj, Sumanth Kumbargere; Idiculla, Jose Joy; Hj Ismail, Rashid
2013-01-01
Sex determination is the most important step in personal identification in forensic investigations. DNA-based sex determination analysis is comparatively more reliable than other conventional methods of sex determination. Advanced technology such as real-time polymerase chain reaction (PCR) offers accurate and reproducible results and is at the level of legal acceptance. However, there are situations, such as chimerism, in which an individual possesses both male- and female-specific factors together in the body. Sex determination analysis in such cases can give erroneous results. This paper discusses the phenomenon of chimerism and its impact on sex determination analysis in forensic investigations.
Ruppin, Eytan; Papin, Jason A; de Figueiredo, Luis F; Schuster, Stefan
2010-08-01
With the advent of modern omics technologies, it has become feasible to reconstruct (quasi-) whole-cell metabolic networks and characterize them in more and more detail. Computer simulations of the dynamic behavior of such networks are difficult due to a lack of kinetic data and to computational limitations. In contrast, network analysis based on appropriate constraints such as the steady-state condition (constraint-based analysis) is feasible and allows one to derive conclusions about the system's metabolic capabilities. Here, we review methods for the reconstruction of metabolic networks, modeling techniques such as flux balance analysis and elementary flux modes and current progress in their development and applications. Game-theoretical methods for studying metabolic networks are discussed as well. Copyright © 2010 Elsevier Ltd. All rights reserved.
Barbesi, Donato; Vicente Vilas, Víctor; Millet, Sylvain; Sandow, Miguel; Colle, Jean-Yves; Aldave de Las Heras, Laura
2017-01-01
A LabVIEW®-based software package for the control of the fully automated multi-sequential flow injection analysis Lab-on-Valve (MSFIA-LOV) platform AutoRAD, which performs radiochemical analysis, is described. The analytical platform interfaces an Arduino®-based device triggering multiple detectors, providing a flexible, fit-for-purpose choice of detection systems. The different analytical devices are interfaced to the PC running the LabVIEW® VI software using USB and RS232 interfaces, both for sending commands and for receiving confirmation or error responses. The AutoRAD platform has been successfully applied to the chemical separation and determination of Sr, an important fission product pertinent to nuclear waste.
Open Reading Frame Phylogenetic Analysis on the Cloud
2013-01-01
Phylogenetic analysis has become essential in researching the evolutionary relationships between viruses. These relationships are depicted on phylogenetic trees, in which viruses are grouped based on sequence similarity. Viral evolutionary relationships are identified from open reading frames rather than from complete sequences. Recently, cloud computing has become popular for developing internet-based bioinformatics tools. Biocloud is an efficient, scalable, and robust bioinformatics computing service. In this paper, we propose a cloud-based open reading frame phylogenetic analysis service. The proposed service integrates the Hadoop framework, virtualization technology, and phylogenetic analysis methods to provide a high-availability, large-scale bioservice. In a case study, we analyze the phylogenetic relationships among Norovirus strains. Evolutionary relationships are elucidated by aligning different open reading frame sequences. The proposed platform correctly identifies the evolutionary relationships between members of Norovirus. PMID:23671843
[Study on the post-marketing safety reevaluation of Shenqi Fuzheng injection].
Ai, Qing-Hua; Li, Yuan-Yuan; Xie, Yan-Ming
2014-09-01
In order to promote the safe clinical use of Shenqi Fuzheng injection (SQFZ), this study systematically reevaluated its post-marketing safety, including a multicenter, large-sample, registry-type safety monitoring study, an analysis based on national spontaneous reporting system (SRS) data, an analysis based on hospital information system data from 20 national hospitals, and a literature review. Taken together, these analyses suggest that SQFZ has a good safety profile. The most common adverse drug reactions (ADRs) are allergic reactions, mainly involving damage to the skin and its appendages and systemic damage; in severe cases, anaphylactic shock can occur. ADRs and adverse events are more common in the elderly, which may be related to the treated (tumor) population. Early-warning analysis based on the SRS data and the literature review both indicate that phlebitis has a strong association with SQFZ use.
An efficient current-based logic cell model for crosstalk delay analysis
NASA Astrophysics Data System (ADS)
Nazarian, Shahin; Das, Debasish
2013-04-01
Logic cell modelling is an important component in the analysis and design of CMOS integrated circuits, mostly due to nonlinear behaviour of CMOS cells with respect to the voltage signal at their input and output pins. A current-based model for CMOS logic cells is presented, which can be used for effective crosstalk noise and delta delay analysis in CMOS VLSI circuits. Existing current source models are expensive and need a new set of Spice-based characterisation, which is not compatible with typical EDA tools. In this article we present Imodel, a simple nonlinear logic cell model that can be derived from the typical cell libraries such as NLDM, with accuracy much higher than NLDM-based cell delay models. In fact, our experiments show an average error of 3% compared to Spice. This level of accuracy comes with a maximum runtime penalty of 19% compared to NLDM-based cell delay models on medium-sized industrial designs.
Target Identification Using Harmonic Wavelet Based ISAR Imaging
NASA Astrophysics Data System (ADS)
Shreyamsha Kumar, B. K.; Prabhakar, B.; Suryanarayana, K.; Thilagavathi, V.; Rajagopal, R.
2006-12-01
A new approach has been proposed to reduce the computations involved in the ISAR imaging, which uses harmonic wavelet-(HW) based time-frequency representation (TFR). Since the HW-based TFR falls into a category of nonparametric time-frequency (T-F) analysis tool, it is computationally efficient compared to parametric T-F analysis tools such as adaptive joint time-frequency transform (AJTFT), adaptive wavelet transform (AWT), and evolutionary AWT (EAWT). Further, the performance of the proposed method of ISAR imaging is compared with the ISAR imaging by other nonparametric T-F analysis tools such as short-time Fourier transform (STFT) and Choi-Williams distribution (CWD). In the ISAR imaging, the use of HW-based TFR provides similar/better results with significant (92%) computational advantage compared to that obtained by CWD. The ISAR images thus obtained are identified using a neural network-based classification scheme with feature set invariant to translation, rotation, and scaling.
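Of the nonparametric time-frequency tools compared above, the STFT is the simplest; a minimal magnitude-STFT sketch, where the window and hop sizes and the two-tone test signal are illustrative assumptions:

```python
import numpy as np

def stft_mag(x, fs, win=256, hop=128):
    """Magnitude STFT with a Hann window.
    Returns (frames x frequency-bins) magnitudes and the frequency axis."""
    w = np.hanning(win)
    starts = range(0, len(x) - win + 1, hop)
    frames = np.array([x[i:i + win] * w for i in starts])
    return np.abs(np.fft.rfft(frames, axis=1)), np.fft.rfftfreq(win, 1.0 / fs)

fs = 1000.0
t = np.arange(500) / fs
# a tone that jumps from 50 Hz to 200 Hz halfway through
x = np.concatenate([np.sin(2 * np.pi * 50 * t), np.sin(2 * np.pi * 200 * t)])
mags, freqs = stft_mag(x, fs)
```

Each row of `mags` is a spectral snapshot, so the ridge of per-frame peaks traces how the signal's frequency content evolves over time, which is the information an ISAR TFR exploits.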
NASA Technical Reports Server (NTRS)
Padavala, Satyasrinivas; Palazzolo, Alan B.; Vallely, Pat; Ryan, Steve
1994-01-01
An improved dynamic analysis for liquid annular seals with arbitrary profile, based on a method first proposed by Nelson and Nguyen, is presented. An improved first-order solution that incorporates a continuous interpolation of perturbed quantities in the circumferential direction is presented. The original method uses an approximation scheme for circumferential gradients based on Fast Fourier Transforms (FFT). A simpler scheme based on cubic splines is found to be computationally more efficient, with better convergence at higher eccentricities. A new approach to computing dynamic coefficients based on an externally specified load is introduced. The improved analysis is extended to account for an arbitrarily varying seal profile in both the axial and circumferential directions. An example case of an elliptical seal with varying degrees of axial curvature is analyzed. A case study based on actual operating clearances of an interstage seal of the Space Shuttle Main Engine High Pressure Oxygen Turbopump is presented.
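The FFT-based circumferential gradient scheme of the original method can be sketched as spectral differentiation on a periodic grid; the profile h(θ) = cos θ is an illustrative test function, not a seal geometry from the study:

```python
import numpy as np

n = 64
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
h = np.cos(theta)                          # illustrative circumferential profile
k = np.fft.rfftfreq(n, d=1.0 / n)          # integer wavenumbers 0..n/2
dh = np.fft.irfft(1j * k * np.fft.rfft(h), n)  # spectral derivative dh/dtheta
# exact derivative of cos(theta) is -sin(theta)
```

For smooth periodic profiles this is spectrally accurate, though, as the abstract notes, a cubic-spline scheme proved cheaper and better behaved at high eccentricities.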
Commerce Laboratory: Mission analysis payload integration study
NASA Technical Reports Server (NTRS)
Bannister, T. C.
1984-01-01
A mission model which will accommodate commercial users and provide a basic data base for further mission planning is reported. The data bases to be developed are: (1) user requirements; (2) apparatus capabilities and availabilities; and (3) carrier capabilities. These data bases are synthesized in a trades and analysis phase along with the STS flight apparatus, and optimum missions will be identified. The completed work is reported. The user requirements data base was expanded to identify within the six scientific disciplines the areas of investigation, investigation categories and status, potential commercial application, interested parties, process, and experiment requirements. The scope of the apparatus data base was expanded to indicate apparatus status as to whether it is ground or flight equipment and, within both categories, whether the apparatus is: (1) existing, (2) under development, (3) planned, or (4) needed. Applications for the apparatus are listed. The methodology is revised in the areas of trades and analysis and mission planning. The carrier capabilities data base was updated and completed.
Friberg, Febe; Lyckhage, Elisabeth Dahlborg
2013-01-01
This article describes the development of literature-based models for bachelor degree essays in Swedish undergraduate nursing education. Students' experiences in a course with literature-based models for bachelor degree essays are discussed. The ever-growing body of nursing research and specialized and complex health care practices make great demands on nursing education in terms of preparing students to be both skilled practitioners and users of research. Teaching to help students understand evidence-based practice is a challenge for nursing education. Action research was used to generate knowledge of and practical solutions to problems in everyday locations. Six models were developed: concept analysis, contributing to evidence-based nursing by means of quantitative research, contributing to evidence-based nursing by means of qualitative research, discourse analysis, analysis of narratives, and literature review. Action research was found to be a relevant procedure for changing ways of working with literature-based, bachelor degree essays. The models that were developed increased students' confidence in writing essays and preparedness for the nursing role.
Nuclear Magnetic Resonance Spectroscopy-Based Identification of Yeast.
Himmelreich, Uwe; Sorrell, Tania C; Daniel, Heide-Marie
2017-01-01
Rapid and robust high-throughput identification of environmental, industrial, or clinical yeast isolates is important whenever relatively large numbers of samples need to be processed in a cost-efficient way. Nuclear magnetic resonance (NMR) spectroscopy generates complex data based on metabolite profiles, chemical composition, and possibly medium consumption, which can be used not only for the assessment of metabolic pathways but also for accurate identification of yeast down to the subspecies level. Initial results on NMR-based yeast identification were comparable with conventional and DNA-based identification. Potential advantages of NMR spectroscopy in mycological laboratories include not only accurate identification but also the potential for automated sample delivery, automated analysis using computer-based methods, rapid turnaround time, high throughput, and low running costs. We describe here the sample preparation, data acquisition, and analysis for NMR-based yeast identification. In addition, a roadmap for the development of classification strategies is given that will result in the acquisition of a database and analysis algorithms for yeast identification in different environments.
Pelios, L; Morren, J; Tesch, D; Axelrod, S
1999-01-01
Self-injurious behavior (SIB) and aggression have been the concern of researchers because of the serious impact these behaviors have on individuals' lives. Despite the plethora of research on the treatment of SIB and aggressive behavior, the reported findings have been inconsistent regarding the effectiveness of reinforcement-based versus punishment-based procedures. We conducted a literature review to determine whether a trend could be detected in researchers' selection of reinforcement-based procedures versus punishment-based procedures, particularly since the introduction of functional analysis to behavioral assessment. The data are consistent with predictions made in the past regarding the potential impact of functional analysis methodology. Specifically, the findings indicate that, once maintaining variables for problem behavior are identified, experimenters tend to choose reinforcement-based procedures rather than punishment-based procedures as treatment for both SIB and aggressive behavior. Results indicated an increased interest in studies on the treatment of SIB and aggressive behavior, particularly since 1988. PMID:10396771
Low-cost bioanalysis on paper-based and its hybrid microfluidic platforms.
Dou, Maowei; Sanjay, Sharma Timilsina; Benhabib, Merwan; Xu, Feng; Li, XiuJun
2015-12-01
Low-cost assays have broad applications ranging from human health diagnostics and food safety inspection to environmental analysis. Hence, low-cost assays are especially attractive for rural areas and developing countries, where financial resources are limited. Recently, paper-based microfluidic devices have emerged as a low-cost platform which greatly accelerates point-of-care (POC) analysis in low-resource settings. This paper reviews recent advances in low-cost bioanalysis on paper-based microfluidic platforms, including fully paper-based and paper hybrid microfluidic platforms. We first summarize the fabrication techniques of fully paper-based microfluidic platforms, followed by their applications in human health diagnostics and food safety analysis. We then highlight paper hybrid microfluidic platforms and their applications, because hybrid platforms can draw benefits from multiple device substrates. Finally, we discuss the current limitations of and future trends in paper-based microfluidic platforms for low-cost assays. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Razali, Nur Fhathyhah; Mohd Suradi, Nur Riza; Ahmad Shahabuddin, Faridatul Azna; Ismail, Wan Rosmanira; Abidin, Norkisme Zainal; Ahmad, Nor Amalina; Mustafa, Zainol
2013-04-01
This study aims to identify the determinants of the technological innovation capability (TIC) of Malaysian-owned companies in resource-based manufacturing, to characterize the relationship between TIC and technological innovation performance (TIP), and to identify the innovation capability factors that need more emphasis and improvement from the responsible authorities. The scope of the study covers four industries located in the state of Selangor: petrochemicals, pharmaceuticals, palm oil-based industries, and food processing. Descriptive analysis, correlation analysis, and performance capability analysis were used. It was found that the technological innovation capabilities of companies in resource-based manufacturing are moderate. Factors such as policy capability, human resource capability, and facilities capability have a positive relationship with technological innovation performance. These findings will help the government make decisions and better implement policies to strengthen company competitiveness, particularly in resource-based manufacturing.
NASA Technical Reports Server (NTRS)
Ruf, Joseph H.; Holt, James B.; Canabal, Francisco
2001-01-01
This paper presents the status of analyses on three Rocket Based Combined Cycle (RBCC) configurations underway in the Applied Fluid Dynamics Analysis Group (TD64). TD64 is performing computational fluid dynamics (CFD) analysis on a Penn State RBCC test rig, the proposed Draco axisymmetric RBCC engine and the Trailblazer engine. The intent of the analysis on the Penn State test rig is to benchmark the Finite Difference Navier Stokes (FDNS) code for ejector mode fluid dynamics. The Draco analysis was a trade study to determine the ejector mode performance as a function of three engine design variables. The Trailblazer analysis is to evaluate the nozzle performance in scramjet mode. Results to date of each analysis are presented.
Advanced Connectivity Analysis (ACA): a Large Scale Functional Connectivity Data Mining Environment.
Chen, Rong; Nixon, Erika; Herskovits, Edward
2016-04-01
Using resting-state functional magnetic resonance imaging (rs-fMRI) to study functional connectivity is of great importance to understand normal development and function as well as a host of neurological and psychiatric disorders. Seed-based analysis is one of the most widely used rs-fMRI analysis methods. Here we describe a freely available large scale functional connectivity data mining software package called Advanced Connectivity Analysis (ACA). ACA enables large-scale seed-based analysis and brain-behavior analysis. It can seamlessly examine a large number of seed regions with minimal user input. ACA has a brain-behavior analysis component to delineate associations among imaging biomarkers and one or more behavioral variables. We demonstrate applications of ACA to rs-fMRI data sets from a study of autism.
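At its core, seed-based analysis correlates a seed region's time series against every other region's time series. The sketch below is a minimal stand-in: the region names and synthetic data are invented, and ACA itself operates on preprocessed rs-fMRI volumes with many seeds at once.

```python
import math
import random

def pearson(x, y):
    """Pearson correlation between two equal-length time series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def seed_connectivity(seed_ts, region_ts):
    """Map each region to its correlation with the seed time series."""
    return {name: pearson(seed_ts, ts) for name, ts in region_ts.items()}

random.seed(0)
seed = [random.gauss(0, 1) for _ in range(200)]
regions = {
    "coupled":   [s + random.gauss(0, 0.3) for s in seed],  # tracks the seed
    "unrelated": [random.gauss(0, 1) for _ in range(200)],  # independent noise
}
conn = seed_connectivity(seed, regions)
```

The coupled region correlates strongly with the seed while the independent one does not; brain-behavior analysis then relates such connectivity values to behavioral variables.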
Graph-based normalization and whitening for non-linear data analysis.
Aaron, Catherine
2006-01-01
In this paper we construct a graph-based normalization algorithm for non-linear data analysis. The principle of this algorithm is to get a spherical average neighborhood with unit radius. First we present a class of global dispersion measures used for "global normalization"; we then adapt these measures using a weighted graph to build a local normalization called "graph-based" normalization. Then we give details of the graph-based normalization algorithm and illustrate some results. In the second part we present a graph-based whitening algorithm built by analogy between the "global" and the "local" problem.
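The target property, a spherical average neighborhood with unit radius, can be sketched as a single global rescaling by the mean k-nearest-neighbour radius. The paper's graph-based version localizes this with a weighted graph; the sketch below is only the simpler global step, on invented data.

```python
import math

def mean_knn_radius(points, k):
    """Average over all points of the mean distance to the k nearest neighbours."""
    radii = []
    for i, p in enumerate(points):
        dists = sorted(math.dist(p, q) for j, q in enumerate(points) if j != i)
        radii.append(sum(dists[:k]) / k)
    return sum(radii) / len(radii)

def normalize_unit_neighborhood(points, k=3):
    """Rescale the cloud so the average k-NN neighbourhood has unit radius."""
    r = mean_knn_radius(points, k)
    return [[c / r for c in p] for p in points]

pts = [[0, 0], [0, 4], [4, 0], [4, 4], [2, 2]]
normed = normalize_unit_neighborhood(pts, k=2)
```

Because distances scale linearly under a global rescaling, the normalized cloud has mean k-NN radius exactly 1.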
A logical model of cooperating rule-based systems
NASA Technical Reports Server (NTRS)
Bailin, Sidney C.; Moore, John M.; Hilberg, Robert H.; Murphy, Elizabeth D.; Bahder, Shari A.
1989-01-01
A model is developed to assist in the planning, specification, development, and verification of space information systems involving distributed rule-based systems. The model is based on an analysis of possible uses of rule-based systems in control centers. This analysis is summarized as a data-flow model for a hypothetical intelligent control center. From this data-flow model, the logical model of cooperating rule-based systems is extracted. This model consists of four layers of increasing capability: (1) communicating agents, (2) belief-sharing knowledge sources, (3) goal-sharing interest areas, and (4) task-sharing job roles.
Kok, Annette M; Nguyen, V Lai; Speelman, Lambert; Brands, Peter J; Schurink, Geert-Willem H; van de Vosse, Frans N; Lopata, Richard G P
2015-05-01
Abdominal aortic aneurysms (AAAs) are local dilations that can lead to a fatal hemorrhage when ruptured. Wall stress analysis of AAAs is a novel tool that has shown high potential to improve risk stratification. Currently, wall stress analysis of AAAs is based on computed tomography (CT) and magnetic resonance imaging; however, three-dimensional (3D) ultrasound (US) has great advantages over CT and magnetic resonance imaging in terms of cost, speed, and lack of radiation. In this study, the feasibility of 3D US as input for wall stress analysis is investigated, and 3D US-based wall stress analysis is compared with CT-based results. The 3D US and CT data were acquired in 12 patients (diameter, 35-90 mm). US data were segmented manually and compared with automatically acquired CT geometries by calculating the similarity index and Hausdorff distance. Wall stresses were simulated at P = 140 mm Hg and compared between both modalities. The similarity index of US vs CT was 0.75 to 0.91 (n = 12), with a median Hausdorff distance ranging from 4.8 to 13.9 mm, with the higher values found at the proximal and distal sides of the AAA. Wall stresses were in accordance with the literature, and a good agreement was found between US- and CT-based median and interquartile stresses, which was confirmed by Bland-Altman and regression analysis (n = 8). Wall stresses based on US were typically higher (+23%), caused by geometric irregularities due to the registration of several 3D volumes and manual segmentation. In future work, an automated US registration and segmentation approach is the essential point of improvement before pursuing large-scale patient studies. This study is a first step toward US-based wall stress analysis, which would be the modality of choice to monitor wall stress development over time because no ionizing radiation or contrast material is involved. Copyright © 2015 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
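The two comparison metrics reported here, the similarity index (Dice) and the Hausdorff distance, are standard and compact to state in code. Below is a toy sketch on 2D voxel masks; the study compares 3D US and CT segmentations, and the masks here are invented.

```python
import math

def similarity_index(a, b):
    """Dice similarity index between two voxel sets (1 = identical)."""
    return 2 * len(a & b) / (len(a) + len(b))

def hausdorff(a, b):
    """Symmetric Hausdorff distance between two point sets: the
    worst-case distance from a point in one set to the other set."""
    def directed(p, q):
        return max(min(math.dist(x, y) for y in q) for x in p)
    return max(directed(a, b), directed(b, a))

# toy "US" and "CT" masks: two 10x10 voxel blocks shifted by 2 voxels
us = {(x, y) for x in range(10) for y in range(10)}
ct = {(x, y) for x in range(2, 12) for y in range(10)}
si = similarity_index(us, ct)
hd = hausdorff(us, ct)
```

For the shifted blocks the Dice index is 0.8 and the Hausdorff distance is 2 voxels; in the study the larger Hausdorff values at the proximal and distal ends flag where the US segmentation deviates most from CT.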
Sievers, Aaron; Bosiek, Katharina; Bisch, Marc; Dreessen, Chris; Riedel, Jascha; Froß, Patrick; Hausmann, Michael; Hildenbrand, Georg
2017-01-01
In genome analysis, k-mer-based comparison methods have become standard tools. However, even though they deliver reliable results, other algorithms seem to work better in some cases. To improve k-mer-based DNA sequence analysis and comparison, we investigated whether adding positional resolution is beneficial for finding and/or comparing organizational structures of interest. A simple but efficient algorithm for extracting and saving local k-mer spectra (frequency distributions of k-mers) was developed and used. The results were analyzed by including positional information, both through visualizations as genomic maps and by applying basic vector correlation methods. The analysis concentrated on small word lengths (1 ≤ k ≤ 4) in the relatively small viral genomes of Papillomaviridae and Herpesviridae, while also checking its usability for larger sequences, namely human chromosome 2 and the homologous chromosomes (2A, 2B) of chimpanzee. Using this alignment-free analysis, several regions with specific characteristics in Papillomaviridae and Herpesviridae, formerly identified by independent, mostly alignment-based methods, were confirmed. Correlations between the k-mer content and several genes in these genomes were found, showing similarities between classified and unclassified viruses, which may be useful for further taxonomic research. Furthermore, previously unknown k-mer correlations in the genomes of Human Herpesviruses (HHVs), which probably have major biological functions, are described. Using the currently known chromosomes of chimpanzee and human, identities between the species were reproduced on every analyzed chromosome. This demonstrates the feasibility of our approach for large data sets of complex genomes.
Based on these results, we suggest k-mer analysis with positional resolution as a method for closing a gap between the effectiveness of alignment-based methods (like NCBI BLAST) and the high pace of standard k-mer analysis. PMID:28422050
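A minimal sketch of the core idea: compute a k-mer frequency distribution per window (positional resolution) and compare windows with a basic vector correlation. The toy sequence, window size, and cosine measure are illustrative choices, not the authors' exact parameters.

```python
from collections import Counter

def local_kmer_spectra(seq, k, window, step):
    """Local k-mer spectra: one (position, k-mer frequency distribution)
    pair per window along the sequence."""
    spectra = []
    for start in range(0, len(seq) - window + 1, step):
        win = seq[start:start + window]
        counts = Counter(win[i:i + k] for i in range(len(win) - k + 1))
        spectra.append((start, counts))
    return spectra

def spectrum_correlation(c1, c2):
    """Cosine similarity between two k-mer count vectors."""
    keys = set(c1) | set(c2)
    dot = sum(c1[w] * c2[w] for w in keys)
    n1 = sum(v * v for v in c1.values()) ** 0.5
    n2 = sum(v * v for v in c2.values()) ** 0.5
    return dot / (n1 * n2)

# toy genome: an AT-repeat region, a GC-rich region, then the AT-repeat again
genome = "ATATATATAT" + "GGGCCCGGGC" + "ATATATATAT"
spectra = local_kmer_spectra(genome, k=2, window=10, step=10)
```

The first and third windows have identical spectra (correlation 1), while the GC-rich middle window shares no 2-mers with them (correlation 0); the genomic maps in the paper visualize exactly this kind of positional signal.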
Using Computation Curriculum-Based Measurement Probes for Error Pattern Analysis
ERIC Educational Resources Information Center
Dennis, Minyi Shih; Calhoon, Mary Beth; Olson, Christopher L.; Williams, Cara
2014-01-01
This article describes how "curriculum-based measurement--computation" (CBM-C) mathematics probes can be used in combination with "error pattern analysis" (EPA) to pinpoint difficulties in basic computation skills for students who struggle with learning mathematics. Both assessment procedures provide ongoing assessment data…
GIS-based accident location and analysis system (GIS-ALAS) : project report : phase I
DOT National Transportation Integrated Search
1998-04-06
This report summarizes progress made in Phase I of the geographic information system (GIS) based Accident Location and Analysis System (GIS-ALAS). The GIS-ALAS project builds on PC-ALAS, a locationally-referenced highway crash database query system d...
The Analysis of Image Segmentation Hierarchies with a Graph-based Knowledge Discovery System
NASA Technical Reports Server (NTRS)
Tilton, James C.; Cooke, Diane J.; Ketkar, Nikhil; Aksoy, Selim
2008-01-01
Currently available pixel-based analysis techniques do not effectively extract the information content from the increasingly available high-spatial-resolution remotely sensed imagery data. A general consensus is that object-based image analysis (OBIA) is required to effectively analyze this type of data. OBIA is usually a two-stage process: image segmentation followed by an analysis of the segmented objects. We are exploring an approach to OBIA in which hierarchical image segmentations provided by the Recursive Hierarchical Segmentation (RHSEG) software developed at NASA GSFC are analyzed by the Subdue graph-based knowledge discovery system developed by a team at Washington State University. In this paper we discuss our initial approach to representing the RHSEG-produced hierarchical image segmentations in a graphical form understandable by Subdue, and provide results on real and simulated data. We also discuss planned improvements designed to more effectively and completely convey the hierarchical segmentation information to Subdue and to improve processing efficiency.
Hong, Na; Prodduturi, Naresh; Wang, Chen; Jiang, Guoqian
2017-01-01
In this study, we describe our efforts in building a clinical statistics and analysis application platform using an emerging clinical data standard, HL7 FHIR, and an open source web application framework, Shiny. We designed two primary workflows that integrate a series of R packages to enable both patient-centered and cohort-based interactive analyses. We leveraged Shiny with R to develop interactive interfaces on FHIR-based data and used ovarian cancer study datasets as a use case to implement a prototype. Specifically, we implemented a patient index, patient-centered data reporting and analysis, and cohort analysis. We evaluated our study by testing the adaptability of the framework on two public FHIR servers. We identify common research requirements and current outstanding issues, and discuss future enhancements. Overall, our study demonstrated that it is feasible to use Shiny to implement interactive analysis on FHIR-based standardized clinical data.
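A patient-centered roll-up over FHIR-shaped JSON can be prototyped with the standard library alone. The field paths below follow FHIR R4 (Bundle.entry[].resource, Observation.subject.reference, Observation.valueQuantity.value); this Python sketch only illustrates the data shape, not the study's R/Shiny implementation, and the bundle contents are invented.

```python
import json
from collections import defaultdict
from statistics import mean

def patient_observation_summary(bundle_json):
    """Group FHIR Observation resources by patient and average their values."""
    bundle = json.loads(bundle_json)
    values = defaultdict(list)
    for entry in bundle.get("entry", []):
        res = entry.get("resource", {})
        if res.get("resourceType") != "Observation":
            continue  # skip Patient and other resource types
        patient = res["subject"]["reference"]
        values[patient].append(res["valueQuantity"]["value"])
    return {p: mean(v) for p, v in values.items()}

# invented two-observation bundle for one patient
bundle = json.dumps({
    "resourceType": "Bundle",
    "entry": [
        {"resource": {"resourceType": "Observation",
                      "subject": {"reference": "Patient/1"},
                      "valueQuantity": {"value": 120, "unit": "mm[Hg]"}}},
        {"resource": {"resourceType": "Observation",
                      "subject": {"reference": "Patient/1"},
                      "valueQuantity": {"value": 130, "unit": "mm[Hg]"}}},
        {"resource": {"resourceType": "Patient", "id": "1"}},
    ],
})
summary = patient_observation_summary(bundle)
```

Standardizing on FHIR means this same traversal works against any conformant server, which is what makes the two-server adaptability test in the study possible.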
Laser-based methods for the analysis of low molecular weight compounds in biological matrices.
Kiss, András; Hopfgartner, Gérard
2016-07-15
Laser-based desorption and/or ionization methods play an important role in the analysis of low-molecular-weight compounds (LMWCs) because they allow direct analysis with high-throughput capabilities. In recent years there have been several improvements in ionization methods, with the emergence of novel atmospheric ion sources such as laser ablation electrospray ionization and laser diode thermal desorption combined with atmospheric pressure chemical ionization, and in sample preparation methods, with the development of new matrix compounds for matrix-assisted laser desorption/ionization (MALDI). The combination of ion mobility separation with laser-based ionization methods is also starting to gain popularity with access to commercial systems. These developments have been driven mainly by the emergence of new application fields such as MS imaging and non-chromatographic analytical approaches to quantification. This review presents these new developments in laser-based methods for the analysis of low-molecular-weight compounds by mass spectrometry (MS) and several potential applications. Copyright © 2016 Elsevier Inc. All rights reserved.
Ultrathin inorganic molecular nanowire based on polyoxometalates
Zhang, Zhenxin; Murayama, Toru; Sadakane, Masahiro; Ariga, Hiroko; Yasuda, Nobuhiro; Sakaguchi, Norihito; Asakura, Kiyotaka; Ueda, Wataru
2015-01-01
The development of metal oxide-based molecular wires is important for fundamental research and potential practical applications. However, examples of these materials are rare. Here we report an all-inorganic transition metal oxide molecular wire prepared by disassembly of larger crystals. The wires are comprised of molybdenum(VI) with either tellurium(IV) or selenium(IV): {(NH4)2[XMo6O21]}n (X=tellurium(IV) or selenium(IV)). The ultrathin molecular nanowires with widths of 1.2 nm grow to micrometre-scale crystals and are characterized by single-crystal X-ray analysis, Rietveld analysis, scanning electron microscopy, X-ray photoelectron spectroscopy, ultraviolet–visible spectroscopy, thermal analysis and elemental analysis. The crystals can be disassembled into individual molecular wires through cation exchange and subsequent ultrasound treatment, as visualized by atomic force microscopy and transmission electron microscopy. The ultrathin molecular wire-based material exhibits high activity as an acid catalyst, and the band gap of the molecular wire-based crystal is tunable by heat treatment. PMID:26139011
Chindelevitch, Leonid; Trigg, Jason; Regev, Aviv; Berger, Bonnie
2014-01-01
Constraint-based models are currently the only methodology that allows the study of metabolism at the whole-genome scale. Flux balance analysis is commonly used to analyse constraint-based models. Curiously, the results of this analysis vary with the software being run, a situation that we show can be remedied by using exact rather than floating-point arithmetic. Here we introduce MONGOOSE, a toolbox for analysing the structure of constraint-based metabolic models in exact arithmetic. We apply MONGOOSE to the analysis of 98 existing metabolic network models and find that, surprisingly, the biomass reaction is blocked (unable to sustain non-zero flux) in nearly half of them. We propose a principled approach for unblocking these reactions and extend it to the problems of identifying essential and synthetic lethal reactions and minimal media. Our structural insights enable a systematic study of constraint-based metabolic models, yielding a deeper understanding of their possibilities and limitations. PMID:25291352
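The exact-arithmetic point can be made concrete. Over the rationals, "reaction i is forced to zero flux by S v = 0" is a crisp rank test (the unit vector e_i lies in the row space of S) with no floating-point tolerance to tune. The sketch below ignores irreversibility constraints, which a full blocked-reaction analysis such as MONGOOSE's must also handle; the toy network is invented.

```python
from fractions import Fraction

def rank_exact(rows):
    """Row rank of a rational matrix via Gauss-Jordan elimination in
    exact (Fraction) arithmetic: no rounding, no tolerances."""
    m = [[Fraction(x) for x in row] for row in rows]
    ncols = len(m[0]) if m else 0
    rank, col = 0, 0
    while rank < len(m) and col < ncols:
        pivot = next((r for r in range(rank, len(m)) if m[r][col] != 0), None)
        if pivot is None:
            col += 1
            continue
        m[rank], m[pivot] = m[pivot], m[rank]
        for r in range(len(m)):
            if r != rank and m[r][col] != 0:
                f = m[r][col] / m[rank][col]
                m[r] = [a - f * b for a, b in zip(m[r], m[rank])]
        rank += 1
        col += 1
    return rank

def blocked_reactions(S):
    """Indices i with v_i = 0 for every v satisfying S v = 0, i.e. e_i is
    in the row space of S (appending e_i does not raise the rank)."""
    n = len(S[0])
    base = rank_exact(S)
    unit = lambda i: [1 if j == i else 0 for j in range(n)]
    return [i for i in range(n) if rank_exact(S + [unit(i)]) == base]

# toy network: ->A, A->B, B->, plus a dead-end reaction ->C (nothing consumes C)
S = [
    [1, -1,  0, 0],   # metabolite A
    [0,  1, -1, 0],   # metabolite B
    [0,  0,  0, 1],   # metabolite C
]
blocked = blocked_reactions(S)
```

Only the dead-end reaction producing C is blocked; a floating-point nullspace computation answers the same question only up to a tolerance, which is the source of the software-dependent results the abstract describes.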
NoSQL Based 3D City Model Management System
NASA Astrophysics Data System (ADS)
Mao, B.; Harrie, L.; Cao, J.; Wu, Z.; Shen, J.
2014-04-01
To manage increasingly complicated 3D city models, a framework based on a NoSQL database is proposed in this paper. The framework supports import and export of 3D city models according to international standards such as CityGML, KML/COLLADA and X3D. We also propose and implement 3D model analysis and visualization in the framework. For city model analysis, 3D geometry data and semantic information (such as name, height, area, price and so on) are stored and processed separately. We use a Map-Reduce method to deal with the 3D geometry data since it is more complex, while the semantic analysis is mainly based on database query operations. For visualization, a multiple-representation structure, CityTree, is implemented within the framework to support dynamic levels of detail (LODs) based on the user viewpoint. The proposed framework is also easily extensible and supports geo-indexes to speed up querying. Our experimental results show that the proposed 3D city management system can efficiently fulfil the analysis and visualization requirements.
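The split between Map-Reduce geometry processing and query-based semantic analysis can be sketched in a few lines. This is an in-memory toy stand-in for the NoSQL framework; all building attributes are invented.

```python
from functools import reduce
from collections import defaultdict

# toy 3D city model: footprint area (m^2), height (m), plus semantic attributes
buildings = [
    {"id": "b1", "district": "north", "area": 200.0, "height": 10.0, "price": 1.2e6},
    {"id": "b2", "district": "north", "area": 150.0, "height": 20.0, "price": 2.0e6},
    {"id": "b3", "district": "south", "area": 300.0, "height": 5.0,  "price": 0.9e6},
]

# map step: emit (district, volume) pairs from each building's geometry
mapped = [(b["district"], b["area"] * b["height"]) for b in buildings]

# reduce step: sum volumes per district key
def reducer(acc, pair):
    key, vol = pair
    acc[key] += vol
    return acc

volume_by_district = reduce(reducer, mapped, defaultdict(float))

# semantic analysis handled separately, as a plain attribute query
expensive = [b["id"] for b in buildings if b["price"] > 1.0e6]
```

Keeping the geometry pipeline (map/reduce) apart from the semantic queries mirrors the framework's separate storage and processing of the two kinds of data.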