Efficient Web Vulnerability Detection Tool for Sleeping Giant-Cross Site Request Forgery
NASA Astrophysics Data System (ADS)
Parimala, G.; Sangeetha, M.; AndalPriyadharsini, R.
2018-04-01
Web applications are now used at a very high rate because of their user-friendly environment and the ease of getting information via the Internet, but these applications are exposed to a lot of threats. The CSRF attack is one of the serious threats to web applications; it is based on vulnerabilities present in the normal web request and response cycle of the HTTP protocol. It is hard to detect, and hence it is still present in most existing web applications. In a CSRF attack, unwanted actions on a trusted website are forced to happen without the user's knowledge, which is why it is placed in OWASP's list of the top 10 web application attacks. The proposed work performs a real-time scan for the CSRF vulnerability in a given URL of a web application, as well as on a localhost address for any organization, using the Python language. Client-side detection of CSRF depends on the count of forms present in the given website.
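The abstract above describes client-side CSRF detection driven by the number of forms found on a page; the exact scanner is not given, so the following is a minimal Python sketch of that idea, assuming the third-party packages requests and beautifulsoup4 and a hypothetical list of anti-CSRF token field names.

```python
# Minimal sketch (not the authors' tool): fetch a page, count its forms,
# and flag forms that carry no recognizable anti-CSRF token field.
import requests
from bs4 import BeautifulSoup

# Hypothetical names commonly used for anti-CSRF token fields.
TOKEN_HINTS = ("csrf", "xsrf", "authenticity_token", "_token")

def scan_csrf(url: str) -> None:
    html = requests.get(url, timeout=10).text
    forms = BeautifulSoup(html, "html.parser").find_all("form")
    print(f"{url}: {len(forms)} form(s) found")
    for i, form in enumerate(forms, start=1):
        hidden = form.find_all("input", attrs={"type": "hidden"})
        has_token = any(
            any(hint in (inp.get("name") or "").lower() for hint in TOKEN_HINTS)
            for inp in hidden
        )
        status = "token present" if has_token else "POSSIBLY VULNERABLE (no token)"
        print(f"  form {i}: method={form.get('method', 'GET').upper()}, {status}")

if __name__ == "__main__":
    scan_csrf("http://localhost:8080/")  # local host address, as in the abstract
```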
Local File Disclosure Vulnerability: A Case Study of Public-Sector Web Applications
NASA Astrophysics Data System (ADS)
Ahmed, M. Imran; Maruf Hassan, Md; Bhuyian, Touhid
2018-01-01
Almost all public-sector organisations in Bangladesh now offer online services through web applications, along with the existing channels, in their endeavour to realise the dream of a ‘Digital Bangladesh’. Nations across the world have joined the online environment thanks to training and awareness initiatives by their governments. File sharing and downloading activities using web applications have now become very common, not only ensuring the easy distribution of different types of files and documents but also enormously reducing the time and effort of users. Although the online services that are frequently used have made users’ lives easier, they have increased the risk of exploitation of local file disclosure (LFD) vulnerability in the web applications of different public-sector organisations due to insecure design and careless coding. This paper analyses the root cause of LFD vulnerability, its exploitation techniques, and its impact on 129 public-sector websites in Bangladesh using a manual black-box testing approach.
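The study above relies on manual black-box testing of file-download features. A typical probe of that kind tampers with a file parameter and looks for the contents of a well-known file in the response. The sketch below is an illustration under assumed parameter names and payloads, not the authors' procedure, and should only be run against systems you are authorized to assess.

```python
# Illustrative black-box probe for local file disclosure (LFD), assuming a
# download endpoint like /download?file=report.pdf. For authorized testing only.
import requests

# Hypothetical traversal payloads and a signature expected in /etc/passwd.
PAYLOADS = ["../../../../etc/passwd", "....//....//etc/passwd", "/etc/passwd"]
SIGNATURE = "root:x:0:0:"

def probe_lfd(base_url: str, param: str = "file") -> None:
    for payload in PAYLOADS:
        resp = requests.get(base_url, params={param: payload}, timeout=10)
        if SIGNATURE in resp.text:
            print(f"Possible LFD: {param}={payload!r} returned /etc/passwd contents")
            return
    print("No disclosure observed with the tested payloads")

if __name__ == "__main__":
    probe_lfd("http://example.org/download")  # placeholder target
```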
Web vulnerability study of online pharmacy sites.
Kuzma, Joanne
2011-01-01
Consumers are increasingly using online pharmacies, but these sites may not provide an adequate level of security for consumers' personal data. There is a gap in the research addressing the problem of security vulnerabilities in this industry. The objective is to identify the level of web application security vulnerabilities in online pharmacies and the common types of flaws, thus expanding on prior studies. Technical, managerial and legal recommendations on how to mitigate security issues are presented. The proposed four-step method first consists of choosing an online testing tool. The next steps involve choosing a list of 60 online pharmacy sites to test, and then running the software analysis to compile a list of flaws. Finally, an in-depth analysis is performed on the types of web application vulnerabilities. The majority of sites had serious vulnerabilities, with the majority of flaws being cross-site scripting or old versions of software that have not been updated. A method is proposed for securing web pharmacy sites, using a multi-phased approach of technical and managerial techniques together with a thorough understanding of national legal requirements for securing systems.
Protecting Database Centric Web Services against SQL/XPath Injection Attacks
NASA Astrophysics Data System (ADS)
Laranjeiro, Nuno; Vieira, Marco; Madeira, Henrique
Web services represent a powerful interface for back-end database systems and are increasingly being used in business critical applications. However, field studies show that a large number of web services are deployed with security flaws (e.g., having SQL Injection vulnerabilities). Although several techniques for the identification of security vulnerabilities have been proposed, developing non-vulnerable web services is still a difficult task. In fact, security-related concerns are hard to apply as they involve adding complexity to already complex code. This paper proposes an approach to secure web services against SQL and XPath Injection attacks, by transparently detecting and aborting service invocations that try to take advantage of potential vulnerabilities. Our mechanism was applied to secure several web services specified by the TPC-App benchmark, proving to be 100% effective in stopping attacks, non-intrusive, and very easy to use.
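The abstract describes transparently detecting and aborting invocations whose SQL commands deviate from legitimate ones. A rough Python sketch of that idea follows: queries are reduced to a skeleton by stripping literals, valid skeletons are collected in a learning phase, and anything unseen is rejected. The normalization rules are simplified assumptions, not the authors' exact algorithm.

```python
# Sketch of structure-based SQL injection detection: learn skeletons of
# legitimate queries, then reject queries whose skeleton was never learned.
import re

def skeleton(sql: str) -> str:
    s = re.sub(r"'(?:[^'\\]|\\.)*'", "?", sql)   # collapse string literals
    s = re.sub(r"\b\d+\b", "?", s)               # collapse numeric literals
    return re.sub(r"\s+", " ", s).strip().lower()

class QueryGuard:
    def __init__(self):
        self.learned = set()

    def learn(self, sql: str) -> None:
        self.learned.add(skeleton(sql))

    def check(self, sql: str) -> bool:
        return skeleton(sql) in self.learned

guard = QueryGuard()
guard.learn("SELECT * FROM items WHERE id = 42")
print(guard.check("SELECT * FROM items WHERE id = 7"))         # True: same structure
print(guard.check("SELECT * FROM items WHERE id = 7 OR 1=1"))  # False: would be aborted
```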
Ultrabroadband photonic internet: safety aspects
NASA Astrophysics Data System (ADS)
Kalicki, Arkadiusz; Romaniuk, Ryszard
2008-11-01
Web applications have become the most popular medium on the Internet. Their popularity and the ease of use of web application frameworks, combined with careless development, result in a high number of vulnerabilities and attacks. Several types of attacks are possible because of improper input validation. SQL injection is the ability to execute arbitrary SQL queries in a database through an existing application. Cross-site scripting is a vulnerability which allows malicious web users to inject code into the web pages viewed by other users. Cross-Site Request Forgery (CSRF) is an attack that tricks the victim into loading a page that contains a malicious request. Web spam in blogs is another concern. There are several techniques to mitigate these attacks. The most important are strong web application design, correct input validation, defined data types for each field, and parameterized statements in SQL queries. Server hardening with a firewall, modern security policy systems, and safe web framework interpreter configuration are essential. It is also advisable to keep a proper security level on the client side, keep software updated, and install personal web firewalls or IDS/IPS systems. Good habits include logging out from services just after finishing work and even using a separate web browser for the most important sites, like e-banking.
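Of the mitigations listed above, parameterized SQL statements are the one most directly expressible in code. The sketch below contrasts unsafe string concatenation with a parameterized query, using Python's built-in sqlite3 module purely for illustration.

```python
# Parameterized statements vs. string concatenation, using sqlite3 for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cr3t')")

user_input = "alice' OR '1'='1"   # attacker-controlled value

# Unsafe: the input is interpolated into the SQL text and changes its structure.
unsafe = conn.execute(
    "SELECT secret FROM users WHERE name = '" + user_input + "'"
).fetchall()
print("concatenated query leaks:", unsafe)   # returns the row despite the wrong name

# Safe: the driver binds the value, so it can never alter the query structure.
safe = conn.execute(
    "SELECT secret FROM users WHERE name = ?", (user_input,)
).fetchall()
print("parameterized query returns:", safe)  # empty list
```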
Design of Provider-Provisioned Website Protection Scheme against Malware Distribution
NASA Astrophysics Data System (ADS)
Yagi, Takeshi; Tanimoto, Naoto; Hariu, Takeo; Itoh, Mitsutaka
Vulnerabilities in web applications expose computer networks to security threats, and many websites are used by attackers as hopping sites to attack other websites and user terminals. These incidents prevent service providers from constructing secure networking environments. To protect websites from attacks exploiting vulnerabilities in web applications, service providers use web application firewalls (WAFs). WAFs filter accesses from attackers by using signatures, which are generated based on the exploit codes of previous attacks. However, WAFs cannot filter unknown attacks because the signatures cannot reflect new types of attacks. In service provider environments, the number of exploit codes has recently increased rapidly because of the spread of vulnerable web applications that have been developed through cloud computing. Thus, generating signatures for all exploit codes is difficult. To solve these problems, our proposed scheme detects and filters malware downloads that are sent from websites which have already received exploit codes. In addition, to collect information for detecting malware downloads, web honeypots, which automatically extract the communication records of exploit codes, are used. According to the results of experiments using a prototype, our scheme can filter attacks automatically so that service providers can provide secure and cost-effective network environments.
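For context on the signature-based WAF model that the abstract argues cannot stop unknown attacks, here is a minimal sketch of signature matching over request lines; the patterns are illustrative assumptions, not real WAF rules.

```python
# Minimal illustration of signature-based request filtering (the WAF model the
# paper argues is insufficient for unknown attacks). Patterns are examples only.
import re

SIGNATURES = [
    re.compile(r"(?i)\bunion\b.+\bselect\b"),  # classic SQL injection pattern
    re.compile(r"(?i)<script[^>]*>"),          # reflected XSS attempt
    re.compile(r"\.\./"),                      # path traversal
]

def is_blocked(request_line: str) -> bool:
    return any(sig.search(request_line) for sig in SIGNATURES)

for req in [
    "GET /index.php?id=1 HTTP/1.1",
    "GET /index.php?id=1+UNION+SELECT+password+FROM+users HTTP/1.1",
]:
    print(req, "->", "blocked" if is_blocked(req) else "allowed")
```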
NASA Astrophysics Data System (ADS)
Barabanov, A. V.; Markov, A. S.; Tsirlov, V. L.
2018-05-01
This paper presents statistical results, and their consolidation, obtained in a study of the security of various web applications against cross-site request forgery attacks. Some of the results were obtained in a study carried out within the framework of certification for compliance with information security requirements. The paper provides the results of consolidating information about the attack and the protection measures currently used by the developers of web applications. It presents results that demonstrate several distributions: the distribution of identified vulnerabilities by developer type (Russian and foreign), the distribution of the security measures used in web applications, the distribution of identified vulnerabilities by programming language, and data on the number of security measures used in the studied web applications. The results of the study show that in most cases the developers of web applications do not pay due attention to protection against cross-site request forgery attacks. The authors give recommendations to developers that are planning to undergo a certification process for their software applications.
Breaking and Fixing Origin-Based Access Control in Hybrid Web/Mobile Application Frameworks.
Georgiev, Martin; Jana, Suman; Shmatikov, Vitaly
2014-02-01
Hybrid mobile applications (apps) combine the features of Web applications and "native" mobile apps. Like Web applications, they are implemented in portable, platform-independent languages such as HTML and JavaScript. Like native apps, they have direct access to local device resources-file system, location, camera, contacts, etc. Hybrid apps are typically developed using hybrid application frameworks such as PhoneGap. The purpose of the framework is twofold. First, it provides an embedded Web browser (for example, WebView on Android) that executes the app's Web code. Second, it supplies "bridges" that allow Web code to escape the browser and access local resources on the device. We analyze the software stack created by hybrid frameworks and demonstrate that it does not properly compose the access-control policies governing Web code and local code, respectively. Web code is governed by the same origin policy, whereas local code is governed by the access-control policy of the operating system (for example, user-granted permissions in Android). The bridges added by the framework to the browser have the same local access rights as the entire application, but are not correctly protected by the same origin policy. This opens the door to fracking attacks, which allow foreign-origin Web content included into a hybrid app (e.g., ads confined in iframes) to drill through the layers and directly access device resources. Fracking vulnerabilities are generic: they affect all hybrid frameworks, all embedded Web browsers, all bridge mechanisms, and all platforms on which these frameworks are deployed. We study the prevalence of fracking vulnerabilities in free Android apps based on the PhoneGap framework. Each vulnerability exposes sensitive local resources-the ability to read and write contacts list, local files, etc.-to dozens of potentially malicious Web domains. We also analyze the defenses deployed by hybrid frameworks to prevent resource access by foreign-origin Web content and explain why they are ineffectual. We then present NoFrak, a capability-based defense against fracking attacks. NoFrak is platform-independent, compatible with any framework and embedded browser, requires no changes to the code of the existing hybrid apps, and does not break their advertising-supported business model.
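NoFrak is described as a capability-based defense that grants bridge access per origin rather than per application. The fragment below models that idea abstractly in Python (the real system lives inside hybrid frameworks and embedded browsers, not in Python); the origins and the resource are hypothetical.

```python
# Conceptual model of capability-based bridge access: only content from
# whitelisted origins holds an unforgeable capability for a local resource.
import secrets

class Bridge:
    def __init__(self, trusted_origins):
        # Mint one unguessable capability token per trusted origin.
        self._caps = {origin: secrets.token_hex(16) for origin in trusted_origins}

    def capability_for(self, origin):
        return self._caps.get(origin)  # foreign-origin content gets no capability

    def read_contacts(self, capability):
        # The bridge honours the capability, not the identity of the whole app.
        if capability not in self._caps.values():
            raise PermissionError("origin holds no capability for contacts")
        return ["alice@example.org", "bob@example.org"]  # stand-in local resource

bridge = Bridge(trusted_origins={"https://app.example"})
cap = bridge.capability_for("https://app.example")
print(bridge.read_contacts(cap))                   # trusted origin: allowed
print(bridge.capability_for("https://ads.evil"))   # None: ad iframe gets no capability
```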
Identification and Illustration of Insecure Direct Object References and their Countermeasures
NASA Astrophysics Data System (ADS)
KumarShrestha, Ajay; Singh Maharjan, Pradip; Paudel, Santosh
2015-03-01
An insecure direct object reference represents a flaw in system design in which sensitive system resources or data lack a full protection mechanism. It basically occurs when the web application developer provides direct access to objects based on user input, so an attacker can exploit this web vulnerability and gain access to privileged information by bypassing authorization. The main aim of this paper is to demonstrate the real effect and the identification of insecure direct object references and then to provide feasible preventive solutions such that web applications do not allow direct object references to be manipulated by attackers. The experiment on insecure direct object referencing is carried out using the insecure J2EE web application WebGoat, and its security testing is performed using another Java-based tool called Burp Suite. The experimental results show that the access control check for gaining access to privileged information is a very simple problem, but at the same time its correct implementation is a tricky task. The paper finally presents some ways to overcome this web vulnerability.
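The countermeasure for insecure direct object references is an access-control check that ties every requested object to the authenticated user instead of trusting a client-supplied identifier. The sketch below contrasts a vulnerable lookup with an ownership-checked one; the data model and function names are hypothetical, not taken from the WebGoat exercise.

```python
# Insecure direct object reference vs. an ownership check, with a toy data model.
INVOICES = {
    101: {"owner": "alice", "amount": 120.00},
    102: {"owner": "bob",   "amount":  75.50},
}

def get_invoice_insecure(invoice_id: int) -> dict:
    # Vulnerable: any authenticated user can fetch any invoice by guessing its id.
    return INVOICES[invoice_id]

def get_invoice_secure(invoice_id: int, current_user: str) -> dict:
    # Fixed: the object is only returned if it belongs to the requesting user.
    invoice = INVOICES.get(invoice_id)
    if invoice is None or invoice["owner"] != current_user:
        raise PermissionError("not authorized for this object")
    return invoice

print(get_invoice_insecure(102))                      # alice can read bob's invoice
print(get_invoice_secure(101, current_user="alice"))  # alice reads her own invoice
# get_invoice_secure(102, current_user="alice")       # would raise PermissionError
```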
Breaking and Fixing Origin-Based Access Control in Hybrid Web/Mobile Application Frameworks
Georgiev, Martin; Jana, Suman; Shmatikov, Vitaly
2014-01-01
Hybrid mobile applications (apps) combine the features of Web applications and “native” mobile apps. Like Web applications, they are implemented in portable, platform-independent languages such as HTML and JavaScript. Like native apps, they have direct access to local device resources—file system, location, camera, contacts, etc. Hybrid apps are typically developed using hybrid application frameworks such as PhoneGap. The purpose of the framework is twofold. First, it provides an embedded Web browser (for example, WebView on Android) that executes the app's Web code. Second, it supplies “bridges” that allow Web code to escape the browser and access local resources on the device. We analyze the software stack created by hybrid frameworks and demonstrate that it does not properly compose the access-control policies governing Web code and local code, respectively. Web code is governed by the same origin policy, whereas local code is governed by the access-control policy of the operating system (for example, user-granted permissions in Android). The bridges added by the framework to the browser have the same local access rights as the entire application, but are not correctly protected by the same origin policy. This opens the door to fracking attacks, which allow foreign-origin Web content included into a hybrid app (e.g., ads confined in iframes) to drill through the layers and directly access device resources. Fracking vulnerabilities are generic: they affect all hybrid frameworks, all embedded Web browsers, all bridge mechanisms, and all platforms on which these frameworks are deployed. We study the prevalence of fracking vulnerabilities in free Android apps based on the PhoneGap framework. Each vulnerability exposes sensitive local resources—the ability to read and write contacts list, local files, etc.—to dozens of potentially malicious Web domains. We also analyze the defenses deployed by hybrid frameworks to prevent resource access by foreign-origin Web content and explain why they are ineffectual. We then present NoFrak, a capability-based defense against fracking attacks. NoFrak is platform-independent, compatible with any framework and embedded browser, requires no changes to the code of the existing hybrid apps, and does not break their advertising-supported business model. PMID:25485311
ESB-based Sensor Web integration for the prediction of electric power supply system vulnerability.
Stoimenov, Leonid; Bogdanovic, Milos; Bogdanovic-Dinic, Sanja
2013-08-15
Electric power supply companies increasingly rely on enterprise IT systems to provide them with a comprehensive view of the state of the distribution network. Within a utility-wide network, enterprise IT systems collect data from various metering devices. Such data can be effectively used for the prediction of power supply network vulnerability. The purpose of this paper is to present the Enterprise Service Bus (ESB)-based Sensor Web integration solution that we have developed with the purpose of enabling prediction of power supply network vulnerability, in terms of a prediction of defect probability for a particular network element. We will give an example of its usage and demonstrate our vulnerability prediction model on data collected from two different power supply companies. The proposed solution is an extension of the GinisSense Sensor Web-based architecture for collecting, processing, analyzing, decision making and alerting based on the data received from heterogeneous data sources. In this case, GinisSense has been upgraded to be capable of operating in an ESB environment and combine Sensor Web and GIS technologies to enable prediction of electric power supply system vulnerability. Aside from electrical values, the proposed solution gathers ambient values from additional sensors installed in the existing power supply network infrastructure. GinisSense aggregates gathered data according to an adapted Omnibus data fusion model and applies decision-making logic on the aggregated data. Detected vulnerabilities are visualized to end-users through means of a specialized Web GIS application.
ESB-Based Sensor Web Integration for the Prediction of Electric Power Supply System Vulnerability
Stoimenov, Leonid; Bogdanovic, Milos; Bogdanovic-Dinic, Sanja
2013-01-01
Electric power supply companies increasingly rely on enterprise IT systems to provide them with a comprehensive view of the state of the distribution network. Within a utility-wide network, enterprise IT systems collect data from various metering devices. Such data can be effectively used for the prediction of power supply network vulnerability. The purpose of this paper is to present the Enterprise Service Bus (ESB)-based Sensor Web integration solution that we have developed with the purpose of enabling prediction of power supply network vulnerability, in terms of a prediction of defect probability for a particular network element. We will give an example of its usage and demonstrate our vulnerability prediction model on data collected from two different power supply companies. The proposed solution is an extension of the GinisSense Sensor Web-based architecture for collecting, processing, analyzing, decision making and alerting based on the data received from heterogeneous data sources. In this case, GinisSense has been upgraded to be capable of operating in an ESB environment and combine Sensor Web and GIS technologies to enable prediction of electric power supply system vulnerability. Aside from electrical values, the proposed solution gathers ambient values from additional sensors installed in the existing power supply network infrastructure. GinisSense aggregates gathered data according to an adapted Omnibus data fusion model and applies decision-making logic on the aggregated data. Detected vulnerabilities are visualized to end-users through means of a specialized Web GIS application. PMID:23955435
Do You Ignore Information Security in Your Journal Website?
Dadkhah, Mehdi; Borchardt, Glenn; Lagzian, Mohammad
2017-08-01
Nowadays, web-based applications extend to all businesses due to their advantages and ease of use. The most important issue in web-based applications is security. Due to their advantages, most academic journals are now using these applications, with papers being submitted and published through their websites. As these websites are resources for knowledge, information security is primary for maintaining their integrity. In this opinion piece, we point out vulnerabilities in certain websites and introduce the potential for future threats. We intend to present how some journals are vulnerable and what will happen if a journal can be infected by attackers. This opinion piece is not a technical manual on information security; it is a short inspection that we performed to improve the security of academic journals.
Interactive Vulnerability Analysis Enhancement Results
2012-12-01
from Java EE web-based applications to other non-web-based Java programs. Technology developed in this effort should be generally applicable to other... Generating a rule is a 2-click process that requires no input from the user. • Task 3: Added support for non-Java EE applications. Aspect's... investigated a variety of Java-based technologies and how IAST can support them. We were successful in adding support for Scala, a popular new language, and
Classification of HTTP Attacks: A Study on the ECML/PKDD 2007 Discovery Challenge
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallagher, Brian; Eliassi-Rad, Tina
2009-07-08
As the world becomes more reliant on Web applications for commercial, financial, and medical transactions, cyber attacks on the World Wide Web are increasing in frequency and severity. Web applications provide an attractive alternative to traditional desktop applications due to their accessibility and ease of deployment. However, the accessibility of Web applications also makes them extremely vulnerable to attack. This inherent vulnerability is intensified by the distributed nature of Web applications and the complexity of configuring application servers. These factors have led to a proliferation of Web-based attacks, in which attackers surreptitiously inject code into HTTP requests, allowing them to execute arbitrary commands on remote systems and perform malicious activities such as reading, altering, or destroying sensitive data. One approach for dealing with HTTP-based attacks is to identify malicious code in incoming HTTP requests and eliminate bad requests before they are processed. Using machine learning techniques, we can build a classifier to automatically label requests as “Valid” or “Attack.” For this study, we develop a simple, but effective HTTP attack classifier, based on the vector space model used commonly for Information Retrieval. Our classifier not only separates attacks from valid requests, but can also identify specific attack types (e.g., “SQL Injection” or “Path Traversal”). We demonstrate the effectiveness of our approach through experiments on the ECML/PKDD 2007 Discovery Challenge data set. Specifically, we show that our approach achieves higher precision and recall than previous methods. In addition, our approach has a number of desirable characteristics, including robustness to missing contextual information, interpretability of models, and scalability.
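The classifier described above is based on the vector space model from information retrieval. A compact sketch of that setup is shown below using scikit-learn: requests are mapped to character n-gram TF-IDF vectors and labeled by a linear classifier. The tiny training set is fabricated for illustration and is not the ECML/PKDD 2007 data.

```python
# Vector-space classification of HTTP requests (illustrative, tiny toy data).
# Assumes scikit-learn is installed.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

requests_train = [
    "GET /index.php?id=12",
    "GET /search?q=flowers",
    "GET /index.php?id=12+UNION+SELECT+password+FROM+users",
    "GET /view?file=../../../../etc/passwd",
]
labels_train = ["Valid", "Valid", "SQL Injection", "Path Traversal"]

model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(3, 5)),  # character n-grams
    LogisticRegression(max_iter=1000),
)
model.fit(requests_train, labels_train)

for req in ["GET /search?q=roses", "GET /item?id=1+UNION+SELECT+*+FROM+admin"]:
    print(req, "->", model.predict([req])[0])
```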
Design and implementation of website information disclosure assessment system.
Cho, Ying-Chiang; Pan, Jen-Yi
2015-01-01
Internet application technologies, such as cloud computing and cloud storage, have increasingly changed people's lives. Websites contain vast amounts of personal privacy information. In order to protect this information, network security technologies, such as database protection and data encryption, attract many researchers. The most serious problems concerning web vulnerability are e-mail address and network database leakages. These leakages have many causes. For example, malicious users can steal database contents, taking advantage of mistakes made by programmers and administrators. In order to mitigate this type of abuse, a website information disclosure assessment system is proposed in this study. This system utilizes a series of technologies, such as web crawler algorithms, SQL injection attack detection, and web vulnerability mining, to assess a website's information disclosure. Thirty websites, randomly sampled from the top 50 world colleges, were used to collect leakage information. This testing showed the importance of increasing the security and privacy of website information for academic websites.
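The assessment system combines a web crawler with leakage checks such as exposed e-mail addresses. The following heavily simplified sketch shows that combination: a breadth-first crawl restricted to one site, with a regular-expression scan for e-mail addresses on each fetched page. It assumes the requests and beautifulsoup4 packages and is not the authors' system; crawl only sites you are permitted to assess.

```python
# Tiny single-site crawler that reports e-mail addresses disclosed on each page.
import re
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def crawl(start_url: str, max_pages: int = 20) -> None:
    host = urlparse(start_url).netloc
    seen, queue, fetched = {start_url}, deque([start_url]), 0
    while queue and fetched < max_pages:
        url = queue.popleft()
        fetched += 1
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        emails = set(EMAIL_RE.findall(html))
        if emails:
            print(f"{url}: disclosed e-mail addresses {sorted(emails)}")
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"])
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)

if __name__ == "__main__":
    crawl("https://example.org/")  # placeholder start URL
```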
Business logic for geoprocessing of distributed geodata
NASA Astrophysics Data System (ADS)
Kiehle, Christian
2006-12-01
This paper describes the development of a business-logic component for the geoprocessing of distributed geodata. The business logic acts as a mediator between the data and the user, therefore playing a central role in any spatial information system. The component is used in service-oriented architectures to foster the reuse of existing geodata inventories. Based on a geoscientific case study of groundwater vulnerability assessment and mapping, the demands for such architectures are identified with special regard to software engineering tasks. Methods are derived from the field of applied Geosciences (Hydrogeology), Geoinformatics, and Software Engineering. In addition to the development of a business logic component, a forthcoming Open Geospatial Consortium (OGC) specification is introduced: the OGC Web Processing Service (WPS) specification. A sample application is introduced to demonstrate the potential of WPS for future information systems. The sample application Geoservice Groundwater Vulnerability is described in detail to provide insight into the business logic component, and demonstrate how information can be generated out of distributed geodata. This has the potential to significantly accelerate the assessment and mapping of groundwater vulnerability. The presented concept is easily transferable to other geoscientific use cases dealing with distributed data inventories. Potential application fields include web-based geoinformation systems operating on distributed data (e.g. environmental planning systems, cadastral information systems, and others).
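The architecture above builds on the OGC Web Processing Service (WPS) interface. As a small orientation, the sketch below issues a standard WPS GetCapabilities request and lists the advertised process identifiers; the endpoint URL is a placeholder, and the XML handling deliberately ignores namespaces to stay version-agnostic.

```python
# List the processes advertised by a WPS endpoint via a GetCapabilities request.
# The endpoint URL is a placeholder; assumes the `requests` package.
import xml.etree.ElementTree as ET
import requests

WPS_ENDPOINT = "https://example.org/wps"  # hypothetical service address

params = {"service": "WPS", "version": "1.0.0", "request": "GetCapabilities"}
response = requests.get(WPS_ENDPOINT, params=params, timeout=30)
response.raise_for_status()

root = ET.fromstring(response.content)
# Walk the tree namespace-agnostically and collect the Identifier values
# that sit directly under Process elements.
for elem in root.iter():
    if elem.tag.endswith("Process"):
        for child in elem:
            if child.tag.endswith("Identifier"):
                print("available process:", child.text)
```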
Towards a framework for assessment and management of cumulative human impacts on marine food webs.
Giakoumi, Sylvaine; Halpern, Benjamin S; Michel, Loïc N; Gobert, Sylvie; Sini, Maria; Boudouresque, Charles-François; Gambi, Maria-Cristina; Katsanevakis, Stelios; Lejeune, Pierre; Montefalcone, Monica; Pergent, Gerard; Pergent-Martini, Christine; Sanchez-Jerez, Pablo; Velimirov, Branko; Vizzini, Salvatrice; Abadie, Arnaud; Coll, Marta; Guidetti, Paolo; Micheli, Fiorenza; Possingham, Hugh P
2015-08-01
Effective ecosystem-based management requires understanding ecosystem responses to multiple human threats, rather than focusing on single threats. To understand ecosystem responses to anthropogenic threats holistically, it is necessary to know how threats affect different components within ecosystems and ultimately alter ecosystem functioning. We used a case study of a Mediterranean seagrass (Posidonia oceanica) food web and expert knowledge elicitation in an application of the initial steps of a framework for assessment of cumulative human impacts on food webs. We produced a conceptual seagrass food web model, determined the main trophic relationships, identified the main threats to the food web components, and assessed the components' vulnerability to those threats. Some threats had high (e.g., coastal infrastructure) or low impacts (e.g., agricultural runoff) on all food web components, whereas others (e.g., introduced carnivores) had very different impacts on each component. Partitioning the ecosystem into its components enabled us to identify threats previously overlooked and to reevaluate the importance of threats commonly perceived as major. By incorporating this understanding of system vulnerability with data on changes in the state of each threat (e.g., decreasing domestic pollution and increasing fishing) into a food web model, managers may be better able to estimate and predict cumulative human impacts on ecosystems and to prioritize conservation actions. © 2015 Society for Conservation Biology.
Design and Implementation of Website Information Disclosure Assessment System
Cho, Ying-Chiang; Pan, Jen-Yi
2015-01-01
Internet application technologies, such as cloud computing and cloud storage, have increasingly changed people’s lives. Websites contain vast amounts of personal privacy information. In order to protect this information, network security technologies, such as database protection and data encryption, attract many researchers. The most serious problems concerning web vulnerability are e-mail address and network database leakages. These leakages have many causes. For example, malicious users can steal database contents, taking advantage of mistakes made by programmers and administrators. In order to mitigate this type of abuse, a website information disclosure assessment system is proposed in this study. This system utilizes a series of technologies, such as web crawler algorithms, SQL injection attack detection, and web vulnerability mining, to assess a website’s information disclosure. Thirty websites, randomly sampled from the top 50 world colleges, were used to collect leakage information. This testing showed the importance of increasing the security and privacy of website information for academic websites. PMID:25768434
Supporting secure programming in web applications through interactive static analysis.
Zhu, Jun; Xie, Jing; Lipford, Heather Richter; Chu, Bill
2014-07-01
Many security incidents are caused by software developers' failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities. However, their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach is interactive static analysis, which integrates static analysis into the Integrated Development Environment (IDE) and provides in-situ secure programming support to help developers prevent vulnerabilities during code construction. No additional training is required nor are there any assumptions on ways programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced due to failure to practice secure programming by knowledgeable developers. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. Our technical evaluation of our prototype detected multiple zero-day vulnerabilities in a large open source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases.
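The authors' tool is an Eclipse plug-in for Java, so the toy below only illustrates, in Python, the kind of in-editor check interactive static analysis performs: walking the syntax tree and warning when a SQL string is assembled by concatenation or an f-string at an execute() call. The example source and the rule are assumptions for illustration.

```python
# Toy static check: flag execute() calls whose SQL argument is built by
# concatenation or an f-string, a pattern that often indicates SQL injection.
import ast

SOURCE = '''
def handler(cursor, request):
    cursor.execute("SELECT * FROM users WHERE id = " + request.args["id"])
    cursor.execute("SELECT * FROM users WHERE id = ?", (request.args["id"],))
'''

class ExecuteChecker(ast.NodeVisitor):
    def visit_Call(self, node: ast.Call) -> None:
        is_execute = isinstance(node.func, ast.Attribute) and node.func.attr == "execute"
        if is_execute and node.args:
            sql_arg = node.args[0]
            if isinstance(sql_arg, (ast.BinOp, ast.JoinedStr)):
                print(f"line {node.lineno}: SQL built by concatenation or f-string; "
                      "use a parameterized query instead")
        self.generic_visit(node)

ExecuteChecker().visit(ast.parse(SOURCE))
```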
Supporting secure programming in web applications through interactive static analysis
Zhu, Jun; Xie, Jing; Lipford, Heather Richter; Chu, Bill
2013-01-01
Many security incidents are caused by software developers’ failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities. However, their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach is interactive static analysis, which integrates static analysis into the Integrated Development Environment (IDE) and provides in-situ secure programming support to help developers prevent vulnerabilities during code construction. No additional training is required nor are there any assumptions on ways programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced due to failure to practice secure programming by knowledgeable developers. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. Our technical evaluation of our prototype detected multiple zero-day vulnerabilities in a large open source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases. PMID:25685513
Food web structure and interaction strength pave the way for vulnerability to extinction.
Karlsson, Patrik; Jonsson, Tomas; Jonsson, Annie
2007-11-07
This paper focuses on how food web structure and interactions among species affect the vulnerability, due to environmental variability, to extinction of species at different positions in model food webs. Vulnerability is here not measured by a traditional extinction threshold but is instead inspired by the IUCN criteria for endangered species: an observed rapid decline in population abundance. Using model webs influenced by stochasticity with zero autocorrelation, we investigate the ecological determinants of species vulnerability, i.e. the trophic interactions between species and food web structure and how these interact with the risk of sudden drops in abundance of species. We find that (i) producers fulfil the criterion of vulnerable species more frequently than other species, (ii) food web structure is related to vulnerability, and (iii) the vulnerability of species is greater when involved in a strong trophic interaction than when not. We note that our result on the relationship between extinction risk and trophic position of species contradicts previous suggestions and argue that the main reason for the discrepancy is probably that we study the vulnerability to environmental stochasticity and not extinction risk due to overexploitation, habitat destruction or interactions with introduced species. Thus, we suggest that the vulnerability of species to environmental stochasticity may be differently related to trophic position than the vulnerability of species to other factors. Earlier research on species extinctions has looked for intrinsic traits of species that correlate with increased vulnerability to extinction. However, to fully understand the extinction process we must also consider that species interactions may affect vulnerability and that not all extinctions are the result of long, gradual reductions in species abundances. Under environmental stochasticity (whose importance is frequently assumed to increase as a result of climate change) and direct and indirect interactions with other species, some extinctions may occur rapidly and apparently unexpectedly. To identify the first declines of population abundances that may escalate and lead to extinctions as early as possible, we need to recognize which species are at greatest risk of entering such dangerous routes and under what circumstances. This new perspective may contribute to our understanding of the processes leading to extinction of populations and eventually species. This is especially urgent in the light of the current biodiversity crisis, where a large fraction of the world's biodiversity is threatened.
Systemic Vulnerabilities in Customer-Premises Equipment (CPE) Routers
2017-07-01
equipment (CPE), specifically small office/home office (SOHO) routers, has become ubiquitous. CPE routers are notorious for their web interface... and enabling remote management, although all settings controllable over the web-management interface can be manipulated. • 85% (11 of 13) of... specifically small office/home office (SOHO) routers, has become ubiquitous. CPE routers are notorious for their web interface vulnerabilities, old versions
National Vulnerability Database (NVD)
National Institute of Standards and Technology Data Gateway
National Vulnerability Database (NVD) (Web, free access) NVD is a comprehensive cyber security vulnerability database that integrates all publicly available U.S. Government vulnerability resources and provides references to industry resources. It is based on and synchronized with the CVE vulnerability naming standard.
Atlas, William I.; Palen, Wendy J.
2014-01-01
Resource subsidies increase the productivity of recipient food webs and can affect ecosystem dynamics. Subsidies of prey often support elevated predator biomass which may intensify top-down control and reduce the flow of reciprocal subsidies into adjacent ecosystems. However, top-down control in subsidized food webs may be limited if primary consumers possess morphological or behavioral traits that limit vulnerability to predation. In forested streams, terrestrial prey support high predator biomass creating the potential for strong top-down control; however, armored primary consumers often dominate the invertebrate assemblage. Using empirically based simulation models, we tested the response of stream food webs to variations in subsidy magnitude, prey vulnerability, and the presence of two top predators. While terrestrial prey inputs increased predator biomass (+12%), the presence of armored primary consumers inhibited top-down control, and diverted most aquatic energy (∼75%) into the riparian forest through aquatic insect emergence. Food webs without armored invertebrates experienced strong trophic cascades, resulting in higher algal (∼50%) and detrital (∼1600%) biomass, and reduced insect emergence (−90%). These results suggest prey vulnerability can mediate food web responses to subsidies, and that top-down control can be arrested even when predator-invulnerable consumers are uncommon (20%) regardless of the level of subsidy. PMID:24465732
NV: Nessus Vulnerability Visualization for the Web
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrison, Lane; Spahn, Riley B; Iannacone, Michael D
2012-01-01
Network vulnerability is a critical component of network security. Yet vulnerability analysis has received relatively little attention from the security visualization community. In this paper we describe nv, a web-based Nessus vulnerability visualization. Nv utilizes treemaps and linked histograms to allow system administrators to discover, analyze, and manage vulnerabilities on their networks. In addition to visualizing single Nessus scans, nv supports the analysis of sequential scans by showing which vulnerabilities have been fixed, remain open, or are newly discovered. Nv was also designed to operate completely in-browser, to avoid sending sensitive data to outside servers. We discuss the design of nv, as well as provide case studies demonstrating vulnerability analysis workflows which include a multiple-node testbed and data from the 2011 VAST Challenge.
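One nv feature highlighted above is comparing sequential Nessus scans to show which vulnerabilities were fixed, remain open, or are new. The set arithmetic behind such a comparison is simple; the sketch below reproduces it over two hypothetical scan results keyed by (host, plugin id).

```python
# Diff two vulnerability scans: fixed, still-open, and newly discovered findings.
# Scan results are modeled as (host, plugin_id) pairs; the data is hypothetical.
scan_old = {("10.0.0.5", 19506), ("10.0.0.5", 33850), ("10.0.0.7", 10863)}
scan_new = {("10.0.0.5", 19506), ("10.0.0.7", 10863), ("10.0.0.9", 51192)}

fixed = scan_old - scan_new        # present before, gone now
still_open = scan_old & scan_new   # present in both scans
new = scan_new - scan_old          # discovered in the latest scan

print("fixed:      ", sorted(fixed))
print("still open: ", sorted(still_open))
print("new:        ", sorted(new))
```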
Geographical Assesment of Results from Preventing the Parameter Tampering in a Web Application
NASA Astrophysics Data System (ADS)
Menemencioğlu, O.; Orak, İ. M.
2017-11-01
The growing use of the Internet and the intensity it has attained attract malicious actors around the world. Researchers have proposed many prevention systems with different infrastructures. A very effective prevention system was recently proposed by the researchers. The previously offered mechanism prevented multiple types of vulnerabilities after the prevention system was put into use, and the attack attempts were recorded. The researchers analysed the results geographically, discussed the obtained results and drew some inferences from them. Our assessment shows that the geographical findings can be used to derive implications and to build an infrastructure that prevents vulnerabilities by location.
GUIDED TOUR OF A WEB-BASED ENVIRONMENTAL DECISION TOOLKIT
Decision-making regarding the targeting of vulnerable resources and prioritization of actions requires synthesis of data on condition, vulnerability, and feasibility of risk management alternatives. EPA's Regional Vulnerability Assessment (ReVA) Program has evaluated existing a...
Baier, Rosa R; Cooper, Emily; Wysocki, Andrea; Gravenstein, Stefan; Clark, Melissa
2015-01-01
Despite the investment in public reporting for a number of healthcare settings, evidence indicates that consumers do not routinely use available data to select providers. This suggests that existing reports do not adequately incorporate recommendations for consumer-facing reports or web applications. Healthcentric Advisors and Brown University undertook a multi-phased approach to create a consumer-facing home health web application in Rhode Island. This included reviewing the evidence base to identify design recommendations and then creating a paper prototype and wireframe. We performed qualitative research to iteratively test our proposed user interface with two user groups, home health consumers and hospital case managers, refining our design to create the final web application. To test our prototype, we conducted two focus groups, with a total of 13 consumers, and 28 case manager interviews. Both user groups responded favorably to the prototype, with the majority commenting that they felt this type of tool would be useful. Case managers suggested revisions to ensure the application conformed to laws requiring Medicare patients to have the freedom to choose among providers and could be incorporated into hospital workflow. After incorporating changes and creating the wireframe, we conducted usability testing interviews with 14 home health consumers and six hospital case managers. We found that consumers needed prompting to navigate through the wireframe; they demonstrated confusion through both their words and body language. As a result, we modified the web application's sequence, navigation, and function to provide additional instructions and prompts. Although we designed our web application for low literacy and low health literacy, using recommendations from the evidence base, we overestimated the extent to which older adults were familiar with using computers. Some of our key learnings and recommendations run counter to general web design principles, leading us to believe that such guidelines need to be adapted for this user group. As web applications proliferate, it is important to ensure that those who are most vulnerable (those with the least knowledge and the lowest literacy, health literacy, and computer proficiency) can access, understand, and use them. In order for the investment in public reporting to produce value, consumer-facing web applications need to be designed to address end users' unique strengths and limitations. Our findings may help others to build consumer-facing tools or technology targeted to a predominantly older population. We encourage others designing consumer-facing web technologies to critically evaluate their assumptions about user interface design, particularly if they are designing tools for older adults, and to test products with their end users.
Ecosystem Vulnerability Review: Proposal of an Interdisciplinary Ecosystem Assessment Approach
NASA Astrophysics Data System (ADS)
Weißhuhn, Peter; Müller, Felix; Wiggering, Hubert
2018-06-01
To safeguard the sustainable use of ecosystems and their services, early detection of potentially damaging changes in functional capabilities is needed. To support proper ecosystem management, the analysis of an ecosystem's vulnerability provides information on its weaknesses as well as on its capacity to recover after suffering an impact. However, the application of the vulnerability concept to ecosystems is still an emerging topic. After providing background on the vulnerability concept, we summarize existing ecosystem vulnerability research on the basis of a systematic literature review with a special focus on ecosystem type, disciplinary background, and more detailed definition of the ecosystem vulnerability components. Using the Web of Science™ Core Collection, we overviewed the literature from 1991 onwards but used the 5 years from 2011 to 2015 for an in-depth analysis, including 129 articles. We found that ecosystem vulnerability analysis has been applied most notably in conservation biology, climate change research, and ecological risk assessments, pinpointing a limited spreading across the environmental sciences. It occurred primarily within marine and freshwater ecosystems. To avoid confusion, we recommend using the unambiguous term ecosystem vulnerability rather than ecological, environmental, population, or community vulnerability. Further, common ground has been identified on which to define the ecosystem vulnerability components exposure, sensitivity, and adaptive capacity. We propose a framework for ecosystem assessments that coherently connects the concepts of vulnerability, resilience, and adaptability as different ecosystem responses. A short outlook on the possible operationalization of the concept by ecosystem vulnerability indices and a conclusion section complete the review.
Ecosystem Vulnerability Review: Proposal of an Interdisciplinary Ecosystem Assessment Approach.
Weißhuhn, Peter; Müller, Felix; Wiggering, Hubert
2018-06-01
To safeguard the sustainable use of ecosystems and their services, early detection of potentially damaging changes in functional capabilities is needed. To support proper ecosystem management, the analysis of an ecosystem's vulnerability provides information on its weaknesses as well as on its capacity to recover after suffering an impact. However, the application of the vulnerability concept to ecosystems is still an emerging topic. After providing background on the vulnerability concept, we summarize existing ecosystem vulnerability research on the basis of a systematic literature review with a special focus on ecosystem type, disciplinary background, and more detailed definition of the ecosystem vulnerability components. Using the Web of Science™ Core Collection, we overviewed the literature from 1991 onwards but used the 5 years from 2011 to 2015 for an in-depth analysis, including 129 articles. We found that ecosystem vulnerability analysis has been applied most notably in conservation biology, climate change research, and ecological risk assessments, pinpointing a limited spreading across the environmental sciences. It occurred primarily within marine and freshwater ecosystems. To avoid confusion, we recommend using the unambiguous term ecosystem vulnerability rather than ecological, environmental, population, or community vulnerability. Further, common ground has been identified on which to define the ecosystem vulnerability components exposure, sensitivity, and adaptive capacity. We propose a framework for ecosystem assessments that coherently connects the concepts of vulnerability, resilience, and adaptability as different ecosystem responses. A short outlook on the possible operationalization of the concept by ecosystem vulnerability indices and a conclusion section complete the review.
Genesis: A Framework for Achieving Software Component Diversity
2007-01-01
correctly—the initial filters developed to fix the Hotmail vulnerability could be circumvented by using alternate character encodings. Hence, we focus on... Remotely Exploitable Cross-Site Scripting in Hotmail and Yahoo (March 2004); http://www.greymagic.com/security/advisories/gm005-mc/. 4... EyeonSecurity, Microsoft Passport Account Hijack Attack: Hacking Hotmail and More, Hacker's Digest. 5. Y.-W. Huang et al., Web Application Security Assessment by
Aphinyanaphongs, Yin; Fu, Lawrence D; Aliferis, Constantin F
2013-01-01
Building machine learning models that identify unproven cancer treatments on the Health Web is a promising approach for dealing with the dissemination of false and dangerous information to vulnerable health consumers. Aside from the obvious requirement of accuracy, two issues are of practical importance in deploying these models in real world applications. (a) Generalizability: The models must generalize to all treatments (not just the ones used in the training of the models). (b) Scalability: The models can be applied efficiently to billions of documents on the Health Web. First, we provide methods and related empirical data demonstrating strong accuracy and generalizability. Second, by combining the MapReduce distributed architecture and high dimensionality compression via Markov Boundary feature selection, we show how to scale the application of the models to WWW-scale corpora. The present work provides evidence that (a) a very small subset of unproven cancer treatments is sufficient to build a model to identify unproven treatments on the web; (b) unproven treatments use distinct language to market their claims and this language is learnable; (c) through distributed parallelization and state of the art feature selection, it is possible to prepare the corpora and build and apply models with large scalability.
Designing, Implementing, and Evaluating Secure Web Browsers
ERIC Educational Resources Information Center
Grier, Christopher L.
2009-01-01
Web browsers are plagued with vulnerabilities, providing hackers with easy access to computer systems using browser-based attacks. Efforts that retrofit existing browsers have had limited success since modern browsers are not designed to withstand attack. To enable more secure web browsing, we design and implement new web browsers from the ground…
Trophic redundancy reduces vulnerability to extinction cascades
Sanders, Dirk; Thébault, Elisa; Kehoe, Rachel; Frank van Veen, F. J.
2018-01-01
Current species extinction rates are at unprecedentedly high levels. While human activities can be the direct cause of some extinctions, it is becoming increasingly clear that species extinctions themselves can be the cause of further extinctions, since species affect each other through the network of ecological interactions among them. There is concern that the simplification of ecosystems, due to the loss of species and ecological interactions, increases their vulnerability to such secondary extinctions. It is predicted that more complex food webs will be less vulnerable to secondary extinctions due to greater trophic redundancy that can buffer against the effects of species loss. Here, we demonstrate in a field experiment with replicated plant-insect communities, that the probability of secondary extinctions is indeed smaller in food webs that include trophic redundancy. Harvesting one species of parasitoid wasp led to secondary extinctions of other, indirectly linked, species at the same trophic level. This effect was markedly stronger in simple communities than for the same species within a more complex food web. We show that this is due to functional redundancy in the more complex food webs and confirm this mechanism with a food web simulation model by highlighting the importance of the presence and strength of trophic links providing redundancy to those links that were lost. Our results demonstrate that biodiversity loss, leading to a reduction in redundant interactions, can increase the vulnerability of ecosystems to secondary extinctions, which, when they occur, can then lead to further simplification and run-away extinction cascades. PMID:29467292
Vulnerability Assessment of IPv6 Websites to SQL Injection and Other Application Level Attacks
Cho, Ying-Chiang; Pan, Jen-Yi
2013-01-01
Given the proliferation of internet connected devices, IPv6 has been proposed to replace IPv4. Aside from providing a larger address space which can be assigned to internet enabled devices, it has been suggested that the IPv6 protocol offers increased security due to the fact that with the large number of addresses available, standard IP scanning attacks will no longer become feasible. However, given the interest in attacking organizations rather than individual devices, most initial points of entry onto an organization's network and their attendant devices are visible and reachable through web crawling techniques, and, therefore, attacks on the visible application layer may offer ways to compromise the overall network. In this evaluation, we provide a straightforward implementation of a web crawler in conjunction with a benign black box penetration testing system and analyze the ease with which SQL injection attacks can be carried out. PMID:24574863
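The evaluation pairs a web crawler with benign black-box SQL injection testing. A common black-box heuristic appends a quote character to a parameter and watches for database error signatures in the response; a short sketch of that heuristic follows, with made-up error patterns and a placeholder target, and it should only be pointed at systems you are authorized to test.

```python
# Benign error-based SQL injection probe for a single URL parameter.
# For authorized testing only; assumes the `requests` package. Patterns are examples.
import requests

ERROR_SIGNATURES = [
    "you have an error in your sql syntax",  # MySQL
    "unclosed quotation mark",               # SQL Server
    "sqlite3.operationalerror",              # SQLite
    "syntax error at or near",               # PostgreSQL
]

def probe_sqli(url: str, param: str, benign_value: str = "1") -> bool:
    baseline = requests.get(url, params={param: benign_value}, timeout=10)
    probe = requests.get(url, params={param: benign_value + "'"}, timeout=10)
    body = probe.text.lower()
    error_seen = any(sig in body for sig in ERROR_SIGNATURES)
    server_broke = probe.status_code >= 500 and baseline.status_code < 500
    if error_seen or server_broke:
        print(f"{url} param {param!r}: possible SQL injection point")
        return True
    return False

if __name__ == "__main__":
    probe_sqli("http://testsite.example/item", "id")  # placeholder target
```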
Vulnerability assessment of IPv6 websites to SQL injection and other application level attacks.
Cho, Ying-Chiang; Pan, Jen-Yi
2013-01-01
Given the proliferation of internet connected devices, IPv6 has been proposed to replace IPv4. Aside from providing a larger address space which can be assigned to internet enabled devices, it has been suggested that the IPv6 protocol offers increased security due to the fact that with the large number of addresses available, standard IP scanning attacks will no longer become feasible. However, given the interest in attacking organizations rather than individual devices, most initial points of entry onto an organization's network and their attendant devices are visible and reachable through web crawling techniques, and, therefore, attacks on the visible application layer may offer ways to compromise the overall network. In this evaluation, we provide a straightforward implementation of a web crawler in conjunction with a benign black box penetration testing system and analyze the ease with which SQL injection attacks can be carried out.
SYNTHESIS OF SPATIAL DATA FOR DECISION-MAKING
EPA's Regional Vulnerability Assessment Program (ReVA) has developed a web-based statistical tool that synthesizes available spatial data into indices of condition, vulnerability (risk, considering cumulative effects), and feasibility of management options. The Environmental Deci...
Modeling Regular Replacement for String Constraint Solving
NASA Technical Reports Server (NTRS)
Fu, Xiang; Li, Chung-Chih
2010-01-01
Bugs in the user-input sanitization of software systems often lead to vulnerabilities. Among them, many are caused by improper use of regular replacement. This paper presents a precise modeling of various semantics of regular substitution, such as the declarative, finite, greedy, and reluctant, using finite state transducers (FST). By projecting an FST to its input/output tapes, we are able to solve atomic string constraints, which can be applied to both the forward and backward image computation in model checking and symbolic execution of text processing programs. We report several interesting discoveries, e.g., certain fragments of the general problem can be handled using less expressive deterministic FST. A compact representation of FST is implemented in SUSHI, a string constraint solver. It is applied to detecting vulnerabilities in web applications.
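The paper distinguishes declarative, finite, greedy, and reluctant semantics of regular replacement. The greedy versus reluctant distinction is easy to see with ordinary regular expressions; the snippet below contrasts the two on a tiny sanitization example. It only illustrates the replacement semantics, not the FST-based modeling implemented in SUSHI.

```python
# Greedy vs. reluctant replacement semantics, shown with Python's re module.
import re

html = "<b>hello</b> world <i>bye</i>"

greedy = re.sub(r"<.*>", "", html)      # ".*" grabs as much as possible
reluctant = re.sub(r"<.*?>", "", html)  # ".*?" grabs as little as possible

print(repr(greedy))     # '' : the single greedy match spans from the first '<' to the last '>'
print(repr(reluctant))  # 'hello world bye' : each tag is matched and removed separately
```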
Parasuicide online: Can suicide websites trigger suicidal behaviour in predisposed adolescents?
Becker, K; Mayer, M; Nagenborg, M; El-Faddagh, M; Schmidt, M H
2004-01-01
The present case report describes a 17-year-old female who explicitly visited suicide web forums, where she researched reliable suicide methods, contacted an anonymous user and purchased substances for the implementation of suicide. The risk of Internet use by vulnerable youth is discussed. Psychiatric exploration should include questions of manner and frequency of media use. The application of media guidelines for suicide prevention is demanded for websites, as are accessible self-help sites for suicidal persons targeted to youthful users.
Villarreal, Miguel; Norman, Laura M.; Labiosa, William B.
2012-01-01
In this paper we describe an application of a GIS-based multi-criteria decision support web tool that models and evaluates relative changes in ecosystem services to policy and land management decisions. The Santa Cruz Watershed Ecosystem Portfolio (SCWEPM) was designed to provide credible forecasts of responses to ecosystem drivers and stressors and to illustrate the role of land use decisions on spatial and temporal distributions of ecosystem services within a binational (U.S. and Mexico) watershed. We present two SCWEPM sub-models that when analyzed together address bidirectional relationships between social and ecological vulnerability and ecosystem services. The first model employs the Modified Socio-Environmental Vulnerability Index (M-SEVI), which assesses community vulnerability using information from U.S. and Mexico censuses on education, access to resources, migratory status, housing situation, and number of dependents. The second, relating land cover change to biodiversity (provisioning services), models changes in the distribution of terrestrial vertebrate habitat based on multitemporal vegetation and land cover maps, wildlife habitat relationships, and changes in land use/land cover patterns. When assessed concurrently, the models exposed some unexpected relationships between vulnerable communities and ecosystem services provisioning. For instance, the most species-rich habitat type in the watershed, Desert Riparian Forest, increased over time in areas occupied by the most vulnerable populations and declined in areas with less vulnerable populations. This type of information can be used to identify ecological conservation and restoration targets that enhance the livelihoods of people in vulnerable communities and promote biodiversity and ecosystem health.
Disseminating near-real-time hazards information and flood maps in the Philippines through Web-GIS.
A Lagmay, Alfredo Mahar Francisco; Racoma, Bernard Alan; Aracan, Ken Adrian; Alconis-Ayco, Jenalyn; Saddi, Ivan Lester
2017-09-01
The Philippines, being a locus of tropical cyclones, tsunamis, earthquakes and volcanic eruptions, is a hotbed of disasters. These natural hazards inflict loss of lives and costly damage to property. Situated in a region where climatic and geophysical tempests are common, the Philippines will inevitably suffer from calamities similar to those experienced recently. With continued development and population growth in hazard-prone areas, it is expected that damage to infrastructure and human losses would persist and even rise unless appropriate measures are immediately implemented by government. In 2012, the Philippines launched a responsive program for disaster prevention and mitigation called the Nationwide Operational Assessment of Hazards (Project NOAH), specifically for government warning agencies to be able to provide a 6-hour lead-time warning to vulnerable communities against impending floods and to use advanced technology to enhance current geo-hazard vulnerability maps. To disseminate such critical information to as wide an audience as possible, a Web-GIS using mashups of freely available source codes and application program interfaces (APIs) was developed and can be found at the URLs http://noah.dost.gov.ph and http://noah.up.edu.ph/. This Web-GIS tool is now heavily used by local government units in the Philippines in their disaster prevention and mitigation efforts and can be replicated in countries that have a proactive approach to address the impacts of natural hazards but lack sufficient funds. Copyright © 2017. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Congi, Maria Pia; Campo, Valentina; Cipolloni, Carlo; Delmonaco, Giuseppe; Guerrieri, Luca; Iadanza, Carla; Spizzichino, Daniele; Trigila, Alessandro
2014-05-01
The increasing damage caused by natural disasters in the last decades points out the need for interoperable added-value services to support environmental safety and human protection, by reducing the vulnerability of exposed elements as well as improving the resilience of the involved communities. For this reason, providing access to harmonized and customized data is only one of several steps towards delivering adequate support to risk assessment, reduction and management. The scope of the present work is to illustrate a methodology under development for analysis of potential impacts in areas prone to landslide hazard in the framework of the EC project LIFE+IMAGINE. The project aims to implement an infrastructure based on web services for environmental analysis that integrates into its own architecture specifications and results from INSPIRE, SEIS and GMES. Existing web services will be customized during the project to provide functionalities for supporting integrated environmental management. The implemented infrastructure will be applied to landslide risk scenarios, to be developed in selected pilot areas, aiming at: i) application of standard procedures to implement a landslide risk analysis; ii) definition of a procedure for assessment of potential environmental impacts, based on a set of indicators to estimate the different exposed elements with their specific vulnerability in the pilot area. In more detail, the landslide pilot will be aimed at providing a landslide risk scenario through the implementation and analysis of: 1) a landslide inventory from available historical databases and maps; 2) landslide susceptibility and hazard maps; 3) assessment of exposure and vulnerability for selected typologies of elements at risk; 4) implementation of a landslide risk scenario for different sets of exposed elements (e.g. population, road network, residential area, cultural heritage). The pilot will be implemented in Liguria, Italy, in two different catchment areas located in the Cinque Terre National Park, characterized by a high landslide susceptibility and low resilience, being highly vulnerable to landslides induced by heavy rainfall. The landslide risk impact analysis will be calibrated taking into account the socio-economic damage caused by landslides triggered by the October 2011 meteorological event. Most of the landslides affected the diffuse system of anthropogenic terraces and caused the direct disruption of the walls as well as the transport of a large amount of loose sediment along the slopes and channels as an induced consequence of the event. The final target of the landslide risk assessment scenario will be to improve knowledge and awareness of hazard, exposure, vulnerability and landslide risk in the Cinque Terre National Park to the benefit of local authorities and population. In addition, the results of the application can have practical and positive effects, e.g.: i) updating the land planning process to improve the resilience of local communities; ii) implementing preliminary cost-benefit analyses aimed at defining guidelines for sustainable landslide risk mitigation strategies; and iii) suggesting a general road map for the implementation of a local adaptation plan.
Vulnerability Assessment of Open Source Wireshark and Chrome Browser
2013-08-01
We spent much of the initial time learning about the logical model that modern HTML5 web browsers support, including how users interact with...are supposed to protect users of that site against cross-site scripting) and the new, powerful, and all-encompassing HTML5 standard. This vulnerability
Lee, Jae Eun; Sung, Jung Hye; Malouhi, Mohamad
2015-12-22
There is abundant evidence that neighborhood characteristics are significantly linked to the health of the inhabitants of a given space within a given time frame. This study aims to statistically validate a web-based GIS application designed to support cardiovascular-related research developed by the NIH funded Research Centers in Minority Institutions (RCMI) Translational Research Network (RTRN) Data Coordinating Center (DCC) and discuss its applicability to cardiovascular studies. Geo-referencing, geocoding and geospatial analyses were conducted for 500 randomly selected home addresses in a U.S. southeastern Metropolitan area. The correlation coefficient, factor analysis and Cronbach's alpha (α) were estimated to quantify measures of the internal consistency, reliability and construct/criterion/discriminant validity of the cardiovascular-related geospatial variables (walk score, number of hospitals, fast food restaurants, parks and sidewalks). Cronbach's α for CVD GEOSPATIAL variables was 95.5%, implying successful internal consistency. Walk scores were significantly correlated with number of hospitals (r = 0.715; p < 0.0001), fast food restaurants (r = 0.729; p < 0.0001), parks (r = 0.773; p < 0.0001) and sidewalks (r = 0.648; p < 0.0001) within a mile from homes. It was also significantly associated with diversity index (r = 0.138, p = 0.0023), median household incomes (r = -0.181; p < 0.0001), and owner occupied rates (r = -0.440; p < 0.0001). However, its correlation with median age, vulnerability, unemployment rate, labor force, and population growth rate was not significant. Our data demonstrate that the geospatial data generated by the web-based application were internally consistent and had satisfactory validity. Therefore, the GIS application may be useful in cardiovascular-related studies aimed at investigating the potential impact of geospatial factors on disease and/or the long-term effect of clinical trials.
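The two statistics reported above, Cronbach's alpha and Pearson correlations, follow standard formulas. The sketch below shows how they could be computed for a small table of geospatial variables; the column names and values are invented, not the study's data.

```python
# Minimal sketch of the two statistics reported above, assuming the geospatial
# variables are columns of a DataFrame; the column names and numbers are made up.
import pandas as pd
from scipy import stats

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for the internal consistency of a set of item columns."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

df = pd.DataFrame({                     # five hypothetical addresses
    "walk_score": [55, 70, 40, 85, 60],
    "n_hospitals": [2, 3, 1, 4, 2],
    "n_fast_food": [5, 8, 3, 10, 6],
    "n_parks": [1, 2, 1, 3, 2],
    "n_sidewalks": [10, 14, 7, 20, 12]})

print("alpha =", round(cronbach_alpha(df), 3))
r, p = stats.pearsonr(df["walk_score"], df["n_hospitals"])
print(f"walk score vs hospitals: r = {r:.3f}, p = {p:.4f}")
```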
NASA Astrophysics Data System (ADS)
Giannakopoulos, Christos; Karali, Anna; Roussos, Anargyros
2014-05-01
Greece, being part of the eastern Mediterranean basin, is an area particularly vulnerable to climate change and associated forest fire risk. The aim of this study is to assess the vulnerability of Greek forests to fire risk occurrence and identify potential adaptation options within the context of climate change through continuous interaction with local stakeholders. To address their needs, the following tools for the provision of climate information services were developed: 1. An application providing fire risk forecasts for the following 3 days (http://cirrus.meteo.noa.gr/forecast/bolam/index.htm) was developed by NOA to address the needs of short-term fire planners. 2. A web-based application providing changes in long-term fire risk and other fire-related indices due to climate change (time horizon up to 2050 and 2100) was developed in collaboration with the WWF Greece office to address the needs of long-term fire policy makers (http://www.oikoskopio.gr/map/). 3. An educational tool was built in order to complement the two web-based tools and to further expand knowledge in fire risk modeling to address the needs for in-depth training. In particular, the second product provided the necessary information to assess the exposure to forest fires. To this aim, maps depicting the days with elevated fire risk (FWI>30) both for the control (1961-1990) and the near future period (2021-2050) were created by the web application. FWI is a daily index that provides numerical ratings of relative fire potential based solely on weather observations. The meteorological inputs to the FWI System are daily noon values of temperature, air relative humidity, 10m wind speed and precipitation during the previous 24 hours. It was found that eastern lowlands are more exposed to fire risk followed by eastern high elevation areas, for both the control and near future period. The next step towards vulnerability assessment was to address sensitivity, i.e., the human-environmental conditions that can worsen or ameliorate the hazard. In our study, static information concerning fire-affecting factors, namely topography and vegetation, was used to create a fire hazard map in order to assess the sensitivity factor. Land cover types for the year 2007 were combined with topographic information derived from a digital elevation model in order to produce these maps. High elevation continental areas were found to be the most sensitive areas followed by the lowland continental areas. Exposure and sensitivity were combined to produce the overall impact of climate change on forest fire risk. The adaptive capacity is defined by the ability of forests to adapt to changing environmental conditions. To assess the adaptive capacity of Greek forests, a Multi-Criteria Analysis (MCA) tool was implemented and used by the stakeholders. The major proposed adaptation measures for Greek forests included fire prevention measures and the inclusion of private forest-covered areas in firefighting. Finally, the vulnerability of Greek forests to fire was estimated as the overall impact of climate change minus the forests' adaptive capacity and was found to be medium for most areas in the country. Acknowledgement: This work was supported by the EU project CLIM-RUN under contract FP7-ENV-2010-265192.
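A minimal sketch of the exposure metric described above (counting days per year with FWI > 30 from a daily FWI series), followed by the stated vulnerability arithmetic of overall impact minus adaptive capacity. The synthetic FWI series and the 0-1 scaling of impact and adaptive capacity are assumptions; the study itself derives FWI from noon weather observations.

```python
# Sketch of the exposure metric: number of days per year with FWI > 30, computed from
# a daily FWI series. The series here is synthetic, for illustration only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
days = pd.date_range("2021-01-01", "2050-12-31", freq="D")
fwi = pd.Series(rng.gamma(shape=2.0, scale=8.0, size=len(days)), index=days)

elevated = fwi > 30                                     # days with elevated fire risk
days_per_year = elevated.groupby(elevated.index.year).sum()
print("Mean days/year with FWI > 30:", round(days_per_year.mean(), 1))

# Vulnerability as described: overall climate-change impact minus adaptive capacity,
# both expressed here on an assumed common 0-1 scale.
impact, adaptive_capacity = 0.7, 0.4
print("Vulnerability score:", impact - adaptive_capacity)
```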
Hacking Your Ride: Is Web 2.0 Creating Vulnerabilities To Surface Transportation
2016-09-01
SMSN’s vulnerabilities, and uncovers terrorists’ malign use of social media for intelligence gathering. Academic researchers have already discovered threats in social navigation platforms such as Waze and...
Our Shared Future: Social Media, Leadership, Vulnerability, and Digital Identity
ERIC Educational Resources Information Center
Stoller, Eric
2013-01-01
Social media have challenged us in our journey to support our students. Administrators have entered into new web-based conversations with one another and with their students. Personal branding has created a sense of performativity that conflicts with a growing trend towards online vulnerability. Our leaders have increasingly been engaged in…
Knowledge-Base Semantic Gap Analysis for the Vulnerability Detection
NASA Astrophysics Data System (ADS)
Wu, Raymond; Seki, Keisuke; Sakamoto, Ryusuke; Hisada, Masayuki
Web security has become a pressing concern in Internet computing. To cope with ever-rising security complexity, semantic analysis is proposed to fill the gap that current approaches fail to address. Conventional methods limit their focus to physical source code rather than the abstraction of its semantics; they therefore miss new types of vulnerability and cause tremendous business loss.
The EPA and USGS have developed a framework to evaluate the relative vulnerability of near-coastal species to impacts of climate change. This framework is implemented in a web-based tool, the Coastal Biogeographic Risk Analysis Tool (CBRAT). We evaluated the vulnerability of the ...
Miles, Alistair; Zhao, Jun; Klyne, Graham; White-Cooper, Helen; Shotton, David
2010-10-01
Integrating heterogeneous data across distributed sources is a major requirement for in silico bioinformatics supporting translational research. For example, genome-scale data on patterns of gene expression in the fruit fly Drosophila melanogaster are widely used in functional genomic studies in many organisms to inform candidate gene selection and validate experimental results. However, current data integration solutions tend to be heavyweight, and require significant initial and ongoing investment of effort. Development of a common Web-based data integration infrastructure (a.k.a. data web), using Semantic Web standards, promises to alleviate these difficulties, but little is known about the feasibility, costs, risks or practical means of migrating to such an infrastructure. We describe the development of OpenFlyData, a proof-of-concept system integrating gene expression data on D. melanogaster, combining Semantic Web standards with lightweight approaches to Web programming based on Web 2.0 design patterns. To support researchers designing and validating functional genomic studies, OpenFlyData includes user-facing search applications providing intuitive access to and comparison of gene expression data from FlyAtlas, the BDGP in situ database, and FlyTED, using data from FlyBase to expand and disambiguate gene names. OpenFlyData's services are also openly accessible, and are available for reuse by other bioinformaticians and application developers. Semi-automated methods and tools were developed to support labour- and knowledge-intensive tasks involved in deploying SPARQL services. These include methods for generating ontologies and relational-to-RDF mappings for relational databases, which we illustrate using the FlyBase Chado database schema; and methods for mapping gene identifiers between databases. The advantages of using Semantic Web standards for biomedical data integration are discussed, as are open issues. In particular, although the performance of open source SPARQL implementations is sufficient to query gene expression data directly from user-facing applications such as Web-based data fusions (a.k.a. mashups), we found open SPARQL endpoints to be vulnerable to denial-of-service-type problems, which must be mitigated to ensure reliability of services based on this standard. These results are relevant to data integration activities in translational bioinformatics. The gene expression search applications and SPARQL endpoints developed for OpenFlyData are deployed at http://openflydata.org. FlyUI, a library of JavaScript widgets providing re-usable user-interface components for Drosophila gene expression data, is available at http://flyui.googlecode.com. Software and ontologies to support transformation of data from FlyBase, FlyAtlas, BDGP and FlyTED to RDF are available at http://openflydata.googlecode.com. SPARQLite, an implementation of the SPARQL protocol, is available at http://sparqlite.googlecode.com. All software is provided under the GPL version 3 open source license.
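A hedged sketch of how a client might query one of the SPARQL endpoints mentioned above. The endpoint path and the vocabulary used in the query are assumptions for illustration; the actual OpenFlyData graph structure should be taken from its own documentation.

```python
# Hedged sketch of querying a SPARQL endpoint. The endpoint URL and the predicate
# names in the query are placeholders, not the real OpenFlyData vocabulary.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("http://openflydata.org/sparql/flyatlas")  # hypothetical path
endpoint.setQuery("""
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?gene ?label WHERE {
        ?gene rdfs:label ?label .
        FILTER regex(?label, "^wing", "i")      # hypothetical gene-name prefix
    } LIMIT 10
""")
endpoint.setReturnFormat(JSON)
results = endpoint.query().convert()

for row in results["results"]["bindings"]:
    print(row["gene"]["value"], row["label"]["value"])
```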
Legal Bans on Pro-Suicide Web Sites: An Early Retrospective from Australia
ERIC Educational Resources Information Center
Pirkis, Jane; Neal, Luke; Dare, Andrew; Blood, R. Warwick; Studdert, David
2009-01-01
There are worldwide concerns that pro-suicide web sites may trigger suicidal behaviors among vulnerable individuals. In 2006, Australia became the first country to criminalize such sites, sparking heated debate. Concerns were expressed that the law casts the criminal net too widely; inappropriately interferes with the autonomy of those who wish to…
ERIC Educational Resources Information Center
Doumas, Diana M.; Esp, Susan; Turrisi, Rob; Schottelkorb, April
2015-01-01
Adolescent drinking represents a significant problem in the United States. Although high school juniors and seniors are particularly vulnerable to the negative consequences associated with alcohol use, evidence-based interventions for this age group are limited. The purpose of this article is to introduce a Web-based alcohol intervention with…
Towards Web-based representation and processing of health information
Gao, Sheng; Mioc, Darka; Yi, Xiaolun; Anton, Francois; Oldfield, Eddie; Coleman, David J
2009-01-01
Background There is great concern within health surveillance about how to grapple with environmental degradation, rapid urbanization, and population mobility and growth. The Internet has emerged as an efficient way to share health information, enabling users to access and understand data at their fingertips. Increasingly complex problems in the health field require increasingly sophisticated computer software, distributed computing power, and standardized data sharing. To address this need, Web-based mapping is now emerging as an important tool to enable health practitioners, policy makers, and the public to understand spatial health risks, population health trends and vulnerabilities. Today several web-based health applications generate dynamic maps; however, for people to fully interpret the maps they need a description of the data sources and of the method used in the data analysis or statistical modeling. For the representation of health information through Web-mapping applications, a standard format is still lacking that can accommodate all fixed (such as location) and variable (such as age, gender, health outcome, etc.) indicators. Furthermore, net-centric computing has not been adequately applied to support flexible health data processing and mapping online. Results The authors of this study designed a HEalth Representation XML (HERXML) schema that consists of the semantic (e.g., health activity description, the data source description, the statistical methodology used for analysis), geometric, and cartographical representations of health data. A case study was carried out on the development of web applications and services within the Canadian Geospatial Data Infrastructure (CGDI) framework for community health programs of the New Brunswick Lung Association. This study facilitated the online processing, mapping and sharing of health information, with the use of HERXML and Open Geospatial Consortium (OGC) services. It brought a new solution for better health data representation and an initial exploration of the Web-based processing of health information. Conclusion The designed HERXML proved to be an appropriate solution for supporting the Web representation of health information. It can be used by health practitioners, policy makers, and the public in disease etiology, health planning, health resource management, health promotion and health education. The utilization of Web-based processing services in this study provides a flexible way for users to select and use certain processing functions for health data processing and mapping via the Web. This research provides easy access to geospatial and health data for understanding disease trends, and promotes the growth and enrichment of the CGDI in the public health sector. PMID:19159445
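Because the abstract does not reproduce the HERXML schema, the following sketch only illustrates the general idea of a record carrying semantic, geometric and cartographic parts; every element and attribute name is a hypothetical stand-in.

```python
# Illustrative sketch only: the real HERXML schema is not given in the abstract, so
# all element and attribute names below are hypothetical stand-ins for the three
# described parts (semantic, geometric, cartographic) of a health-data record.
import xml.etree.ElementTree as ET

record = ET.Element("healthRecord")
semantic = ET.SubElement(record, "semantic")
ET.SubElement(semantic, "activity").text = "Asthma hospitalization surveillance"
ET.SubElement(semantic, "dataSource").text = "Community health program database"
ET.SubElement(semantic, "method").text = "Age-standardized rate"
geometric = ET.SubElement(record, "geometry", srsName="EPSG:4326")
ET.SubElement(geometric, "point").text = "-66.64 45.96"
carto = ET.SubElement(record, "cartography")
ET.SubElement(carto, "symbol", colour="#cc0000", size="8")

print(ET.tostring(record, encoding="unicode"))
```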
An Analysis of Botnet Vulnerabilities
2007-06-01
Currently, the primary defense against botnets is prompt patching of vulnerable systems and antivirus software. Network monitoring can identify...IRCd software, none were identified during this effort. ...are software agents designed to automatically perform tasks. Examples include web-spiders that catalog the Internet and bots found in popular online
NASA Astrophysics Data System (ADS)
Keen, Arthur A.
2006-04-01
This paper describes technology being developed at 21st Century Technologies to automate Computer Network Operations (CNO). CNO refers to DoD activities related to Attacking and Defending Computer Networks (CNA & CND). Next generation cyber threats are emerging in the form of powerful Internet services and tools that automate intelligence gathering, planning, testing, and surveillance. We will focus on "Search-Engine Hacks", queries that can retrieve lists of router/switch/server passwords, control panels, accessible cameras, software keys, VPN connection files, and vulnerable web applications. Examples include "Titan Rain" attacks against DoD facilities and the Santy worm, which identifies vulnerable sites by searching Google for URLs containing application-specific strings. This trend will result in increasingly sophisticated and automated intelligence-driven cyber attacks coordinated across multiple domains that are difficult to defeat or even understand with current technology. One traditional method of CNO relies on surveillance detection as an attack predictor. Unfortunately, surveillance detection is difficult because attackers can perform search engine-driven surveillance such as with Google Hacks, and avoid touching the target site. Therefore, attack observables represent only about 5% of the attacker's total attack time, and are inadequate to provide warning. In order to predict attacks and defend against them, CNO must also employ more sophisticated techniques and work to understand the attacker's Motives, Means and Opportunities (MMO). CNO must use automated reconnaissance tools, such as Google, to identify information vulnerabilities, and then utilize Internet tools to observe the intelligence gathering, planning, testing, and collaboration activities that represent 95% of the attacker's effort.
NASA Technical Reports Server (NTRS)
Rushley, Stephanie; Carter, Matthew; Chiou, Charles; Farmer, Richard; Haywood, Kevin; Pototzky, Anthony, Jr.; White, Adam; Winker, Daniel
2014-01-01
Colombia is a country with highly variable terrain, from the Andes Mountains to plains and coastal areas, and many of these areas are prone to flooding disasters. To identify these risk areas, NASA's Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) was used to construct a digital elevation model (DEM) for the study region. The preliminary risk assessment was applied to a pilot study area, the La Mosca River basin. Precipitation data from the National Aeronautics and Space Administration (NASA) Tropical Rainfall Measuring Mission (TRMM) near-real-time rainfall products, as well as precipitation data from the Instituto de Hidrologia, Meteorologia y Estudios Ambientales (the Institute of Hydrology, Meteorology and Environmental Studies, IDEAM) and stations in the La Mosca River Basin, were used to create rainfall distribution maps for the region. Using the precipitation data and the ASTER DEM, the web application Mi Pronóstico, run by IDEAM, was updated to include an interactive map which currently allows users to search for a location and view the vulnerability and current weather and flooding conditions. The geospatial information was linked to an early warning system in Mi Pronóstico that can alert the public of flood warnings and identify locations of nearby shelters.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Casella, R.
RESTful (REpresentational State Transfer) web services are an alternative implementation to SOAP/RPC web services in a client/server model. BNL's IT Division has started deploying RESTful Web Services for enterprise data retrieval and manipulation. The data are currently used by system administrators for tracking configuration information and, as the service is expanded, will be used by Cyber Security for vulnerability management and as an aid to cyber investigations. This talk will describe the implementation and outstanding issues, as well as some of the reasons for choosing RESTful over SOAP/RPC, and future directions.
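A minimal sketch of the client side of such a RESTful service. The base URL, resource paths and JSON fields are hypothetical; only the REST conventions themselves (GET to retrieve a resource, PUT to update it) are the point.

```python
# Minimal client-side sketch, assuming a hypothetical endpoint and JSON payload; the
# real BNL service paths and fields are not described in the abstract.
import requests

BASE = "https://inventory.example.bnl.gov/api/v1"    # hypothetical base URL

def get_host_config(hostname: str) -> dict:
    """Retrieve configuration attributes for one host as a JSON document."""
    resp = requests.get(f"{BASE}/hosts/{hostname}", timeout=10)
    resp.raise_for_status()
    return resp.json()

def update_host_owner(hostname: str, owner: str) -> None:
    """Manipulate a resource with an idempotent PUT, as REST conventions suggest."""
    resp = requests.put(f"{BASE}/hosts/{hostname}/owner",
                        json={"owner": owner}, timeout=10)
    resp.raise_for_status()

if __name__ == "__main__":
    print(get_host_config("webserver01"))            # hypothetical host name
```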
Develop, Build, and Test a Virtual Lab to Support a Vulnerability Training System
2004-09-01
docs.us.dell.com/support/edocs/systems/pe1650/en/it/index.htm (20 August 2004) “HOWTO: Installing Web Services with Linux/Tomcat/Apache/Struts”...1650, dual processor, blade servers were configured as host machines with VMware and VNC running on a Linux RedHat 9 Kernel. An Apache-Tomcat web server was configured as the external interface to...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walkup, Elizabeth
Passwords are a ubiquitous, established part of the Internet today, but they are also a huge security headache. Single sign-on, OAuth, and password managers are some of the solutions to this problem. OAuth is a new, popular method that allows people to use large, common authentication providers for many web applications. However, it comes at the expense of some privacy: OAuth makes users easy to track across websites, applications, and devices. Password managers put the power in the hands of the users, but this vulnerability survey reveals that you have to be extremely careful which program you choose. All in all, password managers are the solution of choice for home users and small organizations, but large companies will probably want to invest in their own SSO solutions.
Protecting clinical data on Web client computers: the PCASSO approach.
Masys, D. R.; Baker, D. B.
1998-01-01
The ubiquity and ease of use of the Web have made it an increasingly popular medium for communication of health-related information. Web interfaces to commercially available clinical information systems are now available or under development by most major vendors. To the extent that such interfaces involve the use of unprotected operating systems, they are vulnerable to security limitations of Web client software environments. The Patient Centered Access to Secure Systems Online (PCASSO) project extends the protections for person-identifiable health data on Web client computers. PCASSO uses several approaches, including physical protection of authentication information, execution containment, graphical displays, and monitoring the client system for intrusions and co-existing programs that may compromise security. PMID:9929243
ERIC Educational Resources Information Center
Gu, Xiaoqing; Ding, Rui; Fu, Shirong
2011-01-01
Senior citizens are comparatively vulnerable in accessing learning opportunities offered on the Internet due to usability problems in current web design. In an effort to build a senior-friendly learning web as a part of the Life-long Learning Network in Shanghai, usability studies of two websites currently available to Shanghai senior citizens…
NASA Astrophysics Data System (ADS)
Nagle, Tadhg; Golden, William
Managing strategic contradiction and paradoxical situations has been gaining importance in the technological, innovation and management domains. As a result, more and more paradoxical instances and types have been documented in the literature. The innovator's dilemma is one such instance, giving a detailed description of how disruptive innovations affect firms. However, the innovator's dilemma has only been applied to large organisations, and more specifically industry incumbents. Through a multiple case study of six eLearning SMEs, this paper investigates the applicability of the innovator's dilemma as well as the disruptive effects of Web 2.0 on these organisations. Analysing the data collected over 18 months, it was found that the innovator's dilemma did indeed apply to SMEs. However, in line with the original thesis, the dilemma only applied to the SMEs established before the development of Web 2.0 technologies began (pre-2002). Furthermore, the study highlights that the post-2002 firms were also partly vulnerable to the dilemma but were able to avoid any negative effects through technological visionary leadership. In contrast, the pre-2002 firms lacked this visionary ability and were also constrained by low risk profiles.
Defining ecospace of Arctic marine food webs using a novel quantitative approach
NASA Astrophysics Data System (ADS)
Gale, M.; Loseto, L. L.
2011-12-01
The Arctic is currently facing unprecedented developmental, physical and climatological change. Food webs within the marine Arctic environment are highly susceptible to anthropogenic stressors and have thus far been understudied. Stable isotopes, in conjunction with a novel set of metrics, may provide a framework that allows us to understand which areas of the Arctic are most vulnerable to change. The objective of this study was to use linear distance metrics applied to stable isotopes to a) define and quantify four Arctic marine food webs in ecospace; b) enable quantifiable comparisons among the four food webs and with other ecosystems; and c) evaluate the vulnerability of the four food webs to anthropogenic stressors such as climate change. The areas studied were Hudson Bay, Beaufort Sea, Lancaster Sound and North Water Polynya. Each region was selected based on the abundance of previous research and published and available stable isotope data in the peer-reviewed literature. We selected species to cover trophic levels ranging from particulate matter to polar bears with consideration of pelagic, benthic and ice-associated energy pathways. We interpret higher diversity in baseline carbon energy as signifying higher stability in food web structure. Based on this, the Beaufort Sea food web had the highest stability; it occupied the largest isotopic niche space and was supported by multiple carbon sources. Areas with top-down control, such as Lancaster Sound and the North Water Polynya, would be the first to experience an increase in trophic redundancy and possible hardship from external stressors, as they have fewer basal carbon sources and greater numbers of mid- to high-level consumers. We conclude that ecosystems with diverse basal carbon sources, such as the Beaufort Sea and Hudson Bay regions, are more resilient to change than top-down control systems.
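The abstract does not list the exact linear distance metrics used, but one widely used isotopic-niche measure is the total convex hull area occupied by a food web's members in δ13C-δ15N space. The sketch below computes it for made-up isotope values.

```python
# Sketch of one common isotopic-niche metric: the convex hull area of a food web's
# members in d13C-d15N space. The isotope values below are synthetic.
import numpy as np
from scipy.spatial import ConvexHull

# columns: delta13C, delta15N for each species in one regional food web (invented)
beaufort = np.array([[-26.1, 8.2], [-24.3, 11.5], [-21.8, 14.0],
                     [-19.5, 16.8], [-23.0, 17.9], [-18.9, 12.7]])

hull = ConvexHull(beaufort)
# In 2-D, scipy's .volume attribute is the enclosed area (.area is the perimeter).
print(f"Convex hull (niche) area: {hull.volume:.2f} per-mil^2")
```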
Bernier, Eveline; Gosselin, Pierre; Badard, Thierry; Bédard, Yvan
2009-04-03
Climate change has a significant impact on population health. Population vulnerabilities depend on several determinants of different types, including biological, psychological, environmental, social and economic ones. Surveillance of climate-related health vulnerabilities must take into account these different factors, their interdependence, as well as their inherent spatial and temporal aspects on several scales, for informed analyses. Currently used technology includes commercial off-the-shelf Geographic Information Systems (GIS) and Database Management Systems with spatial extensions. It has been widely recognized that such OLTP (On-Line Transaction Processing) systems were not designed to support complex, multi-temporal and multi-scale analysis as required above. On-Line Analytical Processing (OLAP) is central to the field known as BI (Business Intelligence), a key field for such decision-support systems. In the last few years, we have seen a few projects that combine OLAP and GIS to improve spatio-temporal analysis and geographic knowledge discovery. This has given rise to SOLAP (Spatial OLAP) and a new research area. This paper presents how SOLAP and climate-related health vulnerability data were investigated and combined to facilitate surveillance. Based on recent spatial decision-support technologies, this paper presents a spatio-temporal web-based application that goes beyond GIS applications with regard to speed, ease of use, and interactive analysis capabilities. It supports the multi-scale exploration and analysis of integrated socio-economic, health and environmental geospatial data over several periods. This project was meant to validate the potential of recent technologies to contribute to a better understanding of the interactions between public health and climate change, and to facilitate future decision-making by public health agencies and municipalities in Canada and elsewhere. The project also aimed at integrating an initial collection of geo-referenced multi-scale indicators that were identified by Canadian specialists and end-users as relevant for the surveillance of the public health impacts of climate change. This system was developed in a multidisciplinary context involving researchers, policy makers and practitioners, using BI and web-mapping concepts (more particularly SOLAP technologies), while exploring new solutions for frequent automatic updating of data and for providing contextual warnings for users (to minimize the risk of data misinterpretation). According to the project participants, the final system succeeds in facilitating surveillance activities in a way not achievable with today's GIS. Regarding the experiments on frequent automatic updating and contextual user warnings, the results obtained indicate that these are meaningful and achievable goals but they still require research and development for their successful implementation in the context of surveillance and multiple organizations. Surveillance of climate-related health vulnerabilities may be more efficiently supported using a combination of BI and GIS concepts, and more specifically SOLAP technologies, in that they facilitate and accelerate multi-scale spatial and temporal analysis to the point where a user can maintain an uninterrupted train of thought by focusing on "what" she/he wants (not on "how" to get it) and always obtain instant answers, including to the most complex queries (e.g., aggregated, temporal, comparative) that take minutes or hours with OLTP systems.
The developed system respects Newell's cognitive band of 10 seconds when performing knowledge discovery (exploring data, looking for hypotheses, validating models). The developed system provides new operators for easily and rapidly exploring multidimensional data at different levels of granularity, for different regions and epochs, and for visualizing the results in synchronized maps, tables and charts. It is naturally adapted to deal with multiscale indicators such as those used in the surveillance community, as confirmed by this project's end-users.
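To give a rough flavour of the OLAP-style roll-up described above, the sketch below uses pandas as a stand-in for a SOLAP server, aggregating a made-up indicator across region and year dimensions; the real system operates on multidimensional cubes served through a SOLAP engine, and the indicator, regions and values here are invented.

```python
# Rough flavour of a SOLAP "roll up / drill down" using pandas as a stand-in.
# The indicator, regions, years and values are all invented for illustration.
import pandas as pd

facts = pd.DataFrame({
    "region":   ["Montreal", "Montreal", "Quebec City", "Quebec City"] * 2,
    "year":     [2001, 2006] * 4,
    "age_band": ["0-14", "65+"] * 4,
    "heat_vulnerability_index": [0.31, 0.35, 0.22, 0.27, 0.48, 0.55, 0.40, 0.44],
})

# Roll the indicator up by region and year; drill further by adding "age_band"
cube = facts.pivot_table(values="heat_vulnerability_index",
                         index="region", columns="year", aggfunc="mean")
print(cube)
```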
The SAMCO Web-platform for resilience assessment in mountainous valleys impacted by landslide risks.
NASA Astrophysics Data System (ADS)
Grandjean, Gilles; Thomas, Loic; Bernardie, Severine
2016-04-01
The ANR-SAMCO project aims to develop a proactive resilience framework enhancing the overall resilience of societies to the impacts of mountain risks. The project aims to elaborate methodological tools to characterize and measure ecosystem and societal resilience from an operative perspective on three representative mountain case studies. To achieve this objective, the methodology is split into several points: (1) the definition of the potential impacts of global environmental changes (climate system, ecosystem e.g. land use, socio-economic system) on landslide hazards, (2) the analysis of these consequences in terms of vulnerability (e.g. changes in the location and characteristics of the impacted areas and the level of their perturbation) and (3) the implementation of a methodology for quantitatively investigating and mapping indicators of mountain slope vulnerability exposed to several hazard types, and the development of a GIS-based demonstration platform available on the web. The strength and originality of the SAMCO project lie in the combination of different techniques, methodologies and models (multi-hazard assessment, risk evolution in time, vulnerability functional analysis, and governance strategies) that are implemented in a user-oriented web platform, currently in development. We present the first results of this development task, the architecture and functions of the web tools, and the case study database showing the multi-hazard maps and the assets at risk. Risk assessment over several areas of interest in Alpine and Pyrenean valleys is still in progress, but the first analyses are presented for current and future periods for which climate change and land-use (economic, geographical and social aspects) scenarios are taken into account. This tool, dedicated to stakeholders, should ultimately be used to evaluate the resilience of mountainous regions, since multiple scenarios can be tested and compared.
BingEO: Enable Distributed Earth Observation Data for Environmental Research
NASA Astrophysics Data System (ADS)
Wu, H.; Yang, C.; Xu, Y.
2010-12-01
Our planet is facing great environmental challenges including global climate change, environmental vulnerability, extreme poverty, and a shortage of clean cheap energy. To address these problems, scientists are developing various models to analyze, forecast, and simulate geospatial phenomena to support critical decision making. These models not only challenge our computing technology, but also challenge us to meet huge demands for earth observation data. Through various policies and programs, the open and free sharing of earth observation data is advocated in earth science. Currently, thousands of data sources are freely available online through open standards such as Web Map Service (WMS), Web Feature Service (WFS) and Web Coverage Service (WCS). Seamless sharing of and access to these resources calls for a spatial Cyberinfrastructure (CI) to enable the use of spatial data for the advancement of related applied sciences including environmental research. Based on the Microsoft Bing Search Engine and Bing Maps, a seamlessly integrated, visual tool is under development to bridge the gap between researchers/educators and earth observation data providers. With this tool, earth science researchers/educators can easily and visually find the best data sets for their research and education. The tool includes a registry and its related supporting module on the server side and an integrated portal as its client. The proposed portal, Bing Earth Observation (BingEO), is based on Bing Search and Bing Maps to: 1) use Bing Search to discover Web Map Service (WMS) resources available over the internet; 2) develop and maintain a registry to manage all the available WMS resources and constantly monitor their service quality; 3) allow users to manually register data services; 4) provide a Bing Maps-based Web application to visualize the data on a high-quality and easy-to-manipulate map platform and enable users to select the best data layers online. Given the amount of observation data already accumulated and still growing, BingEO will allow these resources to be utilized more widely, intensively, efficiently and economically in earth science applications.
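One way a registry could "constantly monitor service quality" (point 2 above) is to poll each registered WMS endpoint and record whether its capabilities document can still be retrieved. The sketch below uses OWSLib for this; the endpoint URL is a placeholder, not a BingEO service.

```python
# Sketch of how a registry might poll a registered WMS endpoint to monitor its
# service quality, using OWSLib. The URL below is a placeholder.
from owslib.wms import WebMapService

def check_wms(url: str) -> dict:
    """Return basic health information for one WMS endpoint."""
    try:
        wms = WebMapService(url, version="1.1.1", timeout=15)
        return {"url": url, "ok": True,
                "title": wms.identification.title,
                "layer_count": len(list(wms.contents))}
    except Exception as exc:                      # network or service failure
        return {"url": url, "ok": False, "error": str(exc)}

print(check_wms("https://example.org/geoserver/ows?service=WMS"))  # placeholder URL
```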
Legal bans on pro-suicide web sites: an early retrospective from Australia.
Pirkis, Jane; Neal, Luke; Dare, Andrew; Blood, R Warwick; Studdert, David
2009-04-01
There are worldwide concerns that pro-suicide web sites may trigger suicidal behaviors among vulnerable individuals. In 2006, Australia became the first country to criminalize such sites, sparking heated debate. Concerns were expressed that the law casts the criminal net too widely; inappropriately interferes with the autonomy of those who wish to die; and has jurisdictional limitations, with off-shore web sites remaining largely immune. Conversely, proponents point out that the law may limit access to domestic pro-suicide web sites, raise awareness of Internet-related suicide, mobilize community efforts to combat it, and serve as a powerful expression of societal norms about the promotion of suicidal behavior.
Web Server Security on Open Source Environments
NASA Astrophysics Data System (ADS)
Gkoutzelis, Dimitrios X.; Sardis, Manolis S.
Administering critical resources has never been more difficult than it is today. In a changing world of software innovation where major changes occur on a daily basis, it is crucial for webmasters and server administrators to shield their data against an unknown arsenal of attacks in the hands of their attackers. Up until now this kind of defense was a privilege of the few; out-budgeted, low-cost solutions left the defender vulnerable to the rise of innovative attack methods. Luckily, the digital revolution of the past decade left its mark, changing the way we face security forever: open source infrastructure today covers all the prerequisites for a secure web environment in a way we could never have imagined fifteen years ago. Online security of large corporations, military and government bodies is more and more handled by open source applications, thus driving the technological trend of the 21st century in adopting open solutions to e-commerce and privacy issues. This paper describes substantial security precautions for facing privacy and authentication issues in a totally open source web environment. Our goal is to state and face the best-known problems in data handling and consequently propose the most appealing techniques to face these challenges through an open solution.
A survey of kidney disease and risk-factor information on the World Wide Web.
Calderón, José Luis; Zadshir, Ashraf; Norris, Keith
2004-11-11
Chronic kidney disease (CKD) is epidemic, and informing those at risk is a national health priority. However, the discrepancy between the readability of health information and the literacy skills of those it targets is a recognized barrier to communicating health information that may promote good health outcomes. Because the World Wide Web has become one of the most important sources of health information, we sought to assess the readability of commonly available CKD information. Twelve highly cited English-language kidney disease Web sites were identified with 4 popular search engines. Each Web site was reviewed for the availability of 6 domains of information germane to CKD and risk-factor information. We estimated readability scores with the Flesch-Kincaid and Flesch Reading Ease Index methods. The deviation of readability scores for CKD information from readability appropriate to average literacy skills and to the limited literacy skills of vulnerable populations (low socioeconomic status, health disparities, and elderly) was calculated. Eleven Web sites met the inclusion criteria. Six of 11 sites provided information on all 6 domains of CKD and risk-factor information. Mean readability scores for all 6 domains of CKD information exceeded national average literacy skills and far exceeded the fifth-grade-level readability desired for informing vulnerable populations. Information about CKD and diabetes consistently had higher readability scores. Information on the World Wide Web about CKD and its risk factors may not be readable for comprehension by the general public, especially by underserved minority populations with limited literacy skills. Barriers to health communication may be important contributors to the rising CKD epidemic and disparities in CKD health status experienced by minority populations.
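The Flesch Reading Ease and Flesch-Kincaid grade level used above have standard published formulas; the sketch below implements them with a crude syllable heuristic, so the scores it produces are approximate.

```python
# Standard Flesch Reading Ease and Flesch-Kincaid grade formulas, with a rough
# vowel-group syllable heuristic; scores are therefore approximate.
import re

def count_syllables(word: str) -> int:
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def readability(text: str) -> tuple[float, float]:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps, spw = len(words) / sentences, syllables / len(words)
    flesch_ease = 206.835 - 1.015 * wps - 84.6 * spw
    fk_grade = 0.39 * wps + 11.8 * spw - 15.59
    return flesch_ease, fk_grade

sample = ("Chronic kidney disease means your kidneys are damaged and cannot filter "
          "blood the way they should. You can take steps to protect your kidneys.")
ease, grade = readability(sample)
print(f"Flesch Reading Ease: {ease:.1f}, Flesch-Kincaid grade: {grade:.1f}")
```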
Vulnerabilities to misinformation in online pharmaceutical marketing.
De Freitas, Julian; Falls, Brian A; Haque, Omar S; Bursztajn, Harold J
2013-05-01
Given the large percentage of Internet users who search for health information online, pharmaceutical companies have invested significantly in online marketing of their products. Although online pharmaceutical marketing can potentially benefit both physicians and patients, it can also harm these groups by misleading them. Indeed, some pharmaceutical companies have been guilty of undue influence, which has threatened public health and trust. We conducted a review of the available literature on online pharmaceutical marketing, undue influence and the psychology of decision-making, in order to identify factors that contribute to Internet users' vulnerability to online pharmaceutical misinformation. We find five converging factors: Internet dependence, excessive trust in the veracity of online information, unawareness of pharmaceutical company influence, social isolation and detail fixation. As the Internet continues to change, it is important that regulators keep in mind not only misinformation that surrounds new web technologies and their contents, but also the factors that make Internet users vulnerable to misinformation in the first place. Psychological components are a critical, although often neglected, risk factor for Internet users becoming misinformed upon exposure to online pharmaceutical marketing. Awareness of these psychological factors may help Internet users attentively and safely navigate an evolving web terrain.
Exceptional body sizes but typical trophic structure in a Pleistocene food web.
Segura, Angel M; Fariña, Richard A; Arim, Matías
2016-05-01
In this study, we focused on the exceptionally large mammals inhabiting the Americas during the Quaternary period and the paramount role of body size in species ecology. We evaluated two main features of Pleistocene food webs: the relationship between body size and (i) trophic position and (ii) vulnerability to predation. Despite the large range of species sizes, we found a hump-shaped relationship between trophic position and body size. We also found a negative trend in species vulnerability similar to that observed in modern faunas. The largest species lived near the boundary of energetic constraints, such that any shift in resource availability could drive these species to extinction. Our results reinforce several features of megafauna ecology: (i) the negative relationship between trophic position and body size implies that large-sized species were particularly vulnerable to changes in energetic support; (ii) living close to energetic imbalance could favour the incorporation of additional energy sources, for example, a transition from a herbivorous to a scavenging diet in the largest species (e.g. Megatherium) and (iii) the interactions and structure of Quaternary megafauna communities were shaped by similar forces to those shaping modern fauna communities. © 2016 The Author(s).
Tarzia, Laura; May, Carl; Hegarty, Kelsey
2016-11-24
Domestic violence shares many features with chronic disease, including ongoing physical and mental health problems and eroded self-efficacy. Given the challenges around help-seeking for women experiencing domestic violence, it is essential that they be given support to 'self-manage' their condition. The growing popularity of web-based applications for chronic disease self-management suggests that there may be opportunities to use them as an intervention strategy for women experiencing domestic violence, however, as yet, little is known about whether this might work in practice. It is critical that interventions for domestic violence-whether web-based or otherwise-promote agency and capacity for action rather than adding to the 'workload' of already stressed and vulnerable women. Although randomised controlled trials are vital to determine the effectiveness of interventions, robust theoretical frameworks can complement them as a way of examining the feasibility of implementing an intervention in practice. To date, no such frameworks have been developed for the domestic violence context. Consequently, in this paper we propose that it may be useful to appraise interventions for domestic violence using frameworks developed to help understand the barriers and facilitators around self-management of chronic conditions. Using a case study of an online healthy relationship tool and safety decision aid developed in Australia (I-DECIDE), this paper adapts and applies two theories: Burden of Treatment Theory and Normalisation Process Theory, to assess whether the intervention might increase women's agency and capacity for action. In doing this, it proposes a new theoretical model with which the practical application of domestic violence interventions could be appraised in conjunction with other evaluation frameworks. This paper argues that theoretical frameworks for chronic disease are appropriate to assess the feasibility of implementing interventions for domestic violence in practice. The use of the modified Burden of Treatment/Normalisation Process Theory framework developed in this paper strengthens the case for I-DECIDE and other web-based applications as a way of supporting women experiencing domestic violence.
Jaiswal, Kishor
2013-01-01
This memo lays out a procedure for the GEM software to offer an available vulnerability function for any acceptable set of attributes that the user specifies for a particular building category. The memo also provides general guidelines on how to submit the vulnerability or fragility functions to the GEM vulnerability repository, stipulating which attributes modelers must provide so that their vulnerability or fragility functions can be queried appropriately by the vulnerability database. An important objective is to provide users guidance on limitations and applicability by providing the associated modeling assumptions and applicability of each vulnerability or fragility function.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macumber, Daniel L; Horowitz, Scott G; Schott, Marjorie
Across most industries, desktop applications are being rapidly migrated to web applications for a variety of reasons. Web applications are inherently cross platform, mobile, and easier to distribute than desktop applications. Fueling this trend are a wide range of free, open source libraries and frameworks that make it incredibly easy to develop powerful web applications. The building energy modeling community is just beginning to pick up on these larger trends, with a small but growing number of building energy modeling applications starting on or moving to the web. This paper presents a new, open source, web-based geometry editor for Building Energy Modeling (BEM). The editor is written completely in JavaScript and runs in a modern web browser. The editor works on a custom JSON file format and is designed to be integrated into a variety of web and desktop applications. The web-based editor is available to use as a standalone web application at: https://nrel.github.io/openstudio-geometry-editor/. An example integration is demonstrated with the OpenStudio desktop application. Finally, the editor can be easily integrated with a wide range of possible building energy modeling web applications.
A Study and Taxonomy of Vulnerabilities in Web Based Animation and Interactivity Software
2010-12-01
Flash Player is available as a plugin for most common Web browsers (Firefox, Mozilla, Netscape, Opera) and as an ActiveX control for Internet...script or HTML via (1) a swf file that uses the asfunction: protocol or (2) the navigateToURL function when used with the Flash Player ActiveX...malicious page or open a malicious file...The specific flaw exists in the Flash Player ActiveX Control's handling of the
MedlinePlus Connect: How it Works
...it looks depends on how it is implemented. Web Application: the Web application returns a formatted response... Web Service: the MedlinePlus Connect REST-based Web service...
Langseth, Brian J.; Jones, Michael L.; Riley, Stephen C.
2014-01-01
Ecopath with Ecosim (EwE) is a widely used modeling tool in fishery research and management. Ecopath requires a mass-balanced snapshot of a food web at a particular point in time, which Ecosim then uses to simulate changes in biomass over time. Initial inputs to Ecopath, including estimates for biomasses, production to biomass ratios, consumption to biomass ratios, and diets, rarely produce mass balance, and thus ad hoc changes to inputs are required to balance the model. There has been little previous research of whether ad hoc changes to achieve mass balance affect Ecosim simulations. We constructed an EwE model for the offshore community of Lake Huron, and balanced the model using four contrasting but realistic methods. The four balancing methods were based on two contrasting approaches; in the first approach, production of unbalanced groups was increased by increasing either biomass or the production to biomass ratio, while in the second approach, consumption of predators on unbalanced groups was decreased by decreasing either biomass or the consumption to biomass ratio. We compared six simulation scenarios based on three alternative assumptions about the extent to which mortality rates of prey can change in response to changes in predator biomass (i.e., vulnerabilities) under perturbations to either fishing mortality or environmental production. Changes in simulated biomass values over time were used in a principal components analysis to assess the comparative effect of balancing method, vulnerabilities, and perturbation types. Vulnerabilities explained the most variation in biomass, followed by the type of perturbation. Choice of balancing method explained little of the overall variation in biomass. Under scenarios where changes in predator biomass caused large changes in mortality rates of prey (i.e., high vulnerabilities), variation in biomass was greater than when changes in predator biomass caused only small changes in mortality rates of prey (i.e., low vulnerabilities), and was amplified when environmental production was increased. When standardized to mean changes in biomass within each scenario, scenarios when vulnerabilities were low and when fishing mortality was increased explained the most variation in biomass. Our findings suggested that approaches to balancing Ecopath models have relatively little effect on changes in biomass over time, especially when compared to assumptions about how mortality rates of prey change in response to changes in predator biomass. We concluded that when constructing food-web models using EwE, determining the effect of changes in predator biomass on mortality rates of prey should be prioritized over determining the best way to balance the model.
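For readers unfamiliar with the mass-balance requirement mentioned above: in Ecopath a group is generally considered unbalanced when its ecotrophic efficiency (EE) exceeds 1, i.e. catches plus predation remove more than the group produces. The sketch below checks this for a toy two-group web with invented parameters, ignoring exports and biomass accumulation; it is an illustration of the balancing logic, not the Lake Huron model.

```python
# Toy mass-balance check: a group is "unbalanced" when EE = (catch + predation) /
# production exceeds 1. All numbers are invented; exports and biomass accumulation
# are ignored for simplicity.
import numpy as np

groups = ["prey_fish", "pred_fish"]
B  = np.array([10.0, 2.0])    # biomass (t/km^2)
PB = np.array([1.2, 0.4])     # production/biomass (1/yr)
QB = np.array([4.0, 2.5])     # consumption/biomass (1/yr)
Y  = np.array([0.5, 0.3])     # fishery catch (t/km^2/yr)
# DC[j, i] = fraction of predator j's diet made up of group i
DC = np.array([[0.0, 0.0],
               [0.8, 0.0]])

production = B * PB
predation  = (B * QB) @ DC    # total consumption of each group by all predators
EE = (Y + predation) / production

for name, ee in zip(groups, EE):
    status = "balanced" if ee <= 1 else "UNBALANCED (raise B or P/B, or cut predation)"
    print(f"{name}: EE = {ee:.2f} -> {status}")
```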
Cryptography for a High-Assurance Web-Based Enterprise
2013-10-01
Other cryptographic services: Java provides many cryptographic services through the Java Cryptography Architecture (JCA) framework. The...
McCary, Matthew A; Mores, Robin; Farfan, Monica A; Wise, David H
2016-03-01
Although invasive plants are a major source of terrestrial ecosystem degradation worldwide, it remains unclear which trophic levels above the base of the food web are most vulnerable to plant invasions. We performed a meta-analysis of 38 independent studies from 32 papers to examine how invasive plants alter major groupings of primary and secondary consumers in three globally distributed ecosystems: wetlands, woodlands and grasslands. Within each ecosystem we examined if green (grazing) food webs are more sensitive to plant invasions compared to brown (detrital) food webs. Invasive plants have strong negative effects on primary consumers (detritivores, bacterivores, fungivores, and/or herbivores) in woodlands and wetlands, which become less abundant in both green and brown food webs in woodlands and green webs in wetlands. Plant invasions increased abundances of secondary consumers (predators and/or parasitoids) only in woodland brown food webs and green webs in wetlands. Effects of invasive plants on grazing and detrital food webs clearly differed between ecosystems. Overall, invasive plants had the most pronounced effects on the trophic structure of wetlands and woodlands, but caused no detectable changes to grassland trophic structure. © 2016 John Wiley & Sons Ltd/CNRS.
Hearn, Paul; Strong, David; Swain, Eric; Decker, Jeremy
2013-01-01
South Florida's Greater Everglades area is particularly vulnerable to sea level rise, due to its rich endowment of animal and plant species and its heavily populated urban areas along the coast. Rising sea levels are expected to have substantial impacts on inland flooding, the depth and extent of surge from coastal storms, the degradation of water supplies by saltwater intrusion, and the integrity of plant and animal habitats. Planners and managers responsible for mitigating these impacts require advanced tools to help them more effectively identify areas at risk. The U.S. Geological Survey's (USGS) Internet-based Modeling, Mapping, and Analysis for the Greater Everglades (IMMAGE) Web site has been developed to address these needs by providing more convenient access to projections from models that forecast the effects of sea level rise on surface water and groundwater, the extent of surge and resulting economic losses from coastal storms, and the distribution of habitats. IMMAGE not only provides an advanced geographic information system (GIS) interface to support decision making, but also includes topic-based modules that explain and illustrate key concepts for nontechnical users. The purpose of this report is to familiarize both technical and nontechnical users with the IMMAGE Web site and its various applications.
NASA Astrophysics Data System (ADS)
Abad-Mota, S.; Guenni, L.; Salcedo, A.; Cardinale, Y.
2006-05-01
Climate variability, environmental degradation and poor livelihood conditions of an important proportion of the population are all key factors determining the high vulnerability of the population to natural disasters and vector-borne diseases such as malaria and dengue in most tropical Latin American countries. It is not uncommon that the basic bio-geophysical and hydro-meteorological data required for understanding the vulnerability and risk of the population to these environmental hazards, at present and retrospectively, are dispersed, of limited quality and not easily accessible. In Venezuela, for example, hydrometeorological data from ground-based networks are collected by different agencies for specific purposes and applications ranging from aviation and agriculture to hydropower generation and general public needs. In order to improve accessibility, visibility and output products, two public universities in Venezuela, Universidad Simón Bolívar (USB) and Universidad Central de Venezuela (UCV), have designed a data management project to integrate all these historical point data holdings, together with the metadata relating to their origin, in a single data repository with facilities for storage, manipulation, extraction and dissemination. Several statistical analyses of the data will be presented as client-tailored products for specific applications oriented to environmental and epidemiological risk assessments. The project has two main phases: modeling of the hydroclimatic data and its metadata, and development of the web site through which services will be provided. We have collected historical data from different sources in the country. These sources use different formats and hold their data at different levels of granularity. Our data model should be general enough to accommodate all these differences, annotated with the appropriate metadata. The quality of these data will be evaluated both statistically and semantically. The modeled data will be stored in a database so that they can be queried. A web site specially designed for this project will provide an interface for querying the data, analyzing the data statistically and visualizing it in maps and images. A special module will be built to allow the execution of different applications and decision-making procedures. In this module we plan to implement a scientific workflow facility which should simplify the construction of new applications over the existing data. In a final stage we will explore running some of these applications on a grid and interacting with the Grid Venezuela Project being developed in our country by other groups of researchers. The development of this data project includes facilities to incorporate real-time data from a newer generation of measurement devices to assure an ongoing data integration activity in the near future.
A prototype web-GIS application for risk analysis of natural hazards in Switzerland
NASA Astrophysics Data System (ADS)
Aye, Zar Chi; Nicolet, Pierrick; Jaboyedoff, Michel; Derron, Marc-Henri; Gerber, Christian; Lévy, Sebastien
2016-04-01
Following changes in the Swiss subsidy system in January 2008, the Swiss cantons and the Federal Office for the Environment (FOEN) were forced to prioritize different natural hazard protection projects based on their cost-effectiveness, as a response to limited financial resources (Bründl et al., 2009). For this purpose, applications such as EconoMe (OFEV, 2016) and Valdorisk (DGE, 2016) were developed for risk evaluation and prioritization of mitigation projects. These tools serve as a useful decision-making instrument for the community of practitioners and responsible authorities for natural hazard risk management in Switzerland. However, there are several aspects which could be improved, in particular the integration and visualization of spatial information interactively through a web-GIS interface for better risk planning and evaluation. Therefore, in this study, we aim to develop an interactive web-GIS application based on the risk concepts applied in Switzerland. The purpose of this tool is to provide a rapid evaluation of risk before and after protection measures, and to test the efficiency of measures by using a simplified cost-benefit analysis within the context of different protection projects. The application allows users to integrate the different layers which are necessary to calculate risk, in particular hazard intensity (vector) maps for different scenarios (such as the 30-, 100- and 300-year return periods based on Swiss guidelines), exposed objects (such as buildings) and vulnerability information for these objects. Based on the provided information and additional parameters, risk is calculated automatically and the results are visualized within the web-GIS interface of the application. Users can modify this input information and these parameters to create different risk scenarios. Based on the resulting risk scenarios, users can propose and visualize (preliminary) risk reduction measures before realizing the actual design and dimensions of such protective measures in the area. After designing measures, users can re-calculate risk by updating the hazard intensity and object layers. This is achieved by manual editing of shape (vector) layers in the web-GIS interface interactively. Within the application, a cost-benefit analysis tool is also integrated to support the decision-making process for the selection of different protection measures. Finally, the resulting risk information (vector layers and data) can be exported in the form of shapefiles and Excel sheets. A prototype application has been realized using open-source geospatial software and technologies. The Boundless framework with its client-side SDK environment is applied for rapid prototyping. Free and open source components such as the PostGIS spatial database, GeoServer and GeoWebCache, GeoExt and OpenLayers are used for the development of the platform. The developed prototype is demonstrated with a case study area located in Les Diablerets, Switzerland. This research work is carried out within a project funded by the Canton of Vaud, Switzerland. References: Bründl, M., Romang, H. E., Bischof, N., and Rheinberger, C. M.: The risk concept and its application in natural hazard risk management in Switzerland, Nat. Hazards Earth Syst. Sci., 9, 801-813, 2009. DGE: Valdorisk - Direction Générale de l'Environnement, www.vd.ch, accessed 9 January 2016, 2016. OFEV: EconoMe - Office fédéral de l'environnement, www.econome.admin.ch, accessed 9 January 2016, 2016.
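The abstract does not spell out the risk formula such a tool automates; a minimal sketch of a scenario-based risk calculation and a simplified cost-benefit comparison, in the spirit of the Swiss risk concept, is given below. All building values, vulnerability factors, scenario frequencies and costs are invented for illustration, and scenario frequency is crudely approximated as the inverse of the return period.

```python
# Illustrative sketch only: building values, the vulnerability curve, scenario
# frequencies (approximated as 1/return period) and costs are assumptions,
# not data or formulas from the application described above.

def vulnerability(intensity):
    # toy mapping from a hazard intensity class to a damage fraction
    return {"low": 0.05, "medium": 0.3, "high": 0.7}[intensity]

def scenario_damage(buildings, intensity_at):
    # expected monetary damage for one hazard scenario
    return sum(value * vulnerability(intensity_at[b]) for b, value in buildings.items())

def annual_risk(buildings, scenarios):
    # scenarios: {return_period_years: {building: intensity class}}
    return sum(scenario_damage(buildings, imap) / T for T, imap in scenarios.items())

buildings = {"house_A": 500_000, "house_B": 800_000}
before = {30:  {"house_A": "high", "house_B": "medium"},
          100: {"house_A": "high", "house_B": "high"},
          300: {"house_A": "high", "house_B": "high"}}
after  = {30:  {"house_A": "low", "house_B": "low"},
          100: {"house_A": "medium", "house_B": "low"},
          300: {"house_A": "high", "house_B": "medium"}}

risk_reduction = annual_risk(buildings, before) - annual_risk(buildings, after)
annualised_measure_cost = 15_000  # assumed annualised cost of the protection work
print("benefit/cost ratio:", round(risk_reduction / annualised_measure_cost, 2))
```

A ratio above 1 would indicate that the (assumed) annualised risk reduction outweighs the annualised cost of the measure, which is the kind of comparison the integrated cost-benefit tool supports.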
Polymer-Polymer Bilayer Actuator
NASA Technical Reports Server (NTRS)
Su, Ji (Inventor); Harrison, Joycelyn S. (Inventor); St.Clair, Terry L. (Inventor)
2003-01-01
A device for providing an electromechanical response includes two polymeric webs bonded to each other along their lengths. At least one polymeric web is activated upon application thereto of an electric field and exhibits electrostriction by rotation of polar graft moieties within the polymeric web. In one embodiment, one of the two polymeric webs is an active web upon application thereto of the electric field, and the other polymeric web is a non-active web upon application thereto of the electric field. In another embodiment, both of the two polymeric webs are capable of being active webs upon application thereto of the electric field. However, these two polymeric webs are alternately activated and non-activated by the electric field.
The Role of the Web Server in a Capstone Web Application Course
ERIC Educational Resources Information Center
Umapathy, Karthikeyan; Wallace, F. Layne
2010-01-01
Web applications have become commonplace in the Information Systems curriculum. Much of the discussion about Web development for capstone courses has centered on the scripting tools. Very little has been discussed about different ways to incorporate the Web server into Web application development courses. In this paper, three different ways of…
Using Integrated Earth and Social Science Data for Disaster Risk Assessment
NASA Astrophysics Data System (ADS)
Downs, R. R.; Chen, R. S.; Yetman, G.
2016-12-01
Society faces many different risks from both natural and technological hazards. In some cases, disaster risk managers focus on only a few risks, e.g., in regions where a single hazard such as earthquakes dominate. More often, however, disaster risk managers deal with multiple hazards that pose diverse threats to life, infrastructure, and livelihoods. From the viewpoint of scientists, hazards are often studied based on traditional disciplines such as seismology, hydrology, climatology, and epidemiology. But from the viewpoint of disaster risk managers, data are needed on all hazards in a specific region and on the exposure and vulnerability of population, infrastructure, and economic resources and activity. Such managers also need to understand how hazards, exposures, and vulnerabilities may interact, and human and environmental systems respond, to hazard events, as in the case of the Fukushima nuclear disaster that followed from the Sendai earthquake and tsunami. In this regard, geospatial tools that enable visualization and analysis of both Earth and social science data can support the use case of disaster risk managers who need to quickly assess where specific hazard events occur relative to population and critical infrastructure. Such information can help them assess the potential severity of actual or predicted hazard events, identify population centers or key infrastructure at risk, and visualize hazard dynamics, e.g., earthquakes and their aftershocks or the paths of severe storms. This can then inform efforts to mitigate risks across multiple hazards, including reducing exposure and vulnerability, strengthening system resiliency, improving disaster response mechanisms, and targeting mitigation resources to the highest or most critical risks. We report here on initial efforts to develop hazard mapping tools that draw on open web services and support simple spatial queries about population exposure. The NASA Socioeconomic Data and Applications Center (SEDAC) Hazards Mapper, a web-based mapping tool, enables users to estimate population living in areas subject to flood or tornado warnings, near recent earthquakes, or around critical infrastructure. The HazPop mobile app, implemented for iOS devices, utilizes location services to support disaster risk managers working in field conditions.
XMM-Newton Mobile Web Application
NASA Astrophysics Data System (ADS)
Ibarra, A.; Kennedy, M.; Rodríguez, P.; Hernández, C.; Saxton, R.; Gabriel, C.
2013-10-01
We present the first XMM-Newton mobile web application, coded using new web technologies such as HTML5, the jQuery Mobile framework, and the D3 data-driven JavaScript library. This new mobile web application focuses on re-formatted content extracted directly from the XMM-Newton web pages, optimizing the content for mobile devices. The main goals of this development were to reach all kinds of handheld devices and operating systems, while minimizing software maintenance. The application has therefore been developed as a mobile web implementation rather than a more costly native application. New functionality will be added regularly.
Ajax Architecture Implementation Techniques
NASA Astrophysics Data System (ADS)
Hussaini, Syed Asadullah; Tabassum, S. Nasira; Baig, Tabassum, M. Khader
2012-03-01
Today's rich Web applications use a mix of JavaScript and asynchronous communication with the application server. This mechanism is also known as Ajax: Asynchronous JavaScript and XML. The intent of Ajax is to exchange small pieces of data between the browser and the application server, and in doing so, use partial page refresh instead of reloading the entire Web page. AJAX (Asynchronous JavaScript and XML) is a powerful Web development model for browser-based Web applications. Technologies that form the AJAX model, such as XML, JavaScript, HTTP, and XHTML, are individually widely used and well known. However, AJAX combines these technologies to let Web pages retrieve small amounts of data from the server without having to reload the entire page. This capability makes Web pages more interactive and lets them behave like local applications. Web 2.0, enabled by the Ajax architecture, has given rise to a new level of user interactivity through web browsers. Many new and extremely popular Web applications have been introduced, such as Google Maps, Google Docs, Flickr, and so on. Ajax toolkits such as Dojo allow web developers to build Web 2.0 applications quickly and with little effort.
NASA Astrophysics Data System (ADS)
Friberg, P. A.; Luis, R. S.; Quintiliani, M.; Lisowski, S.; Hunter, S.
2014-12-01
Recently, a novel set of modules has been included in the Open Source Earthworm seismic data processing system, supporting the use of web applications. These include the Mole sub-system, for storing relevant event data in a MySQL database (see M. Quintiliani and S. Pintore, SRL, 2013), and an embedded web server, Moleserv, for serving such data to web clients in QuakeML format. These modules have enabled, for the first time using Earthworm, the use of web applications for seismic data processing. These can greatly simplify the operation and maintenance of seismic data processing centers by having one or more servers provide the relevant data, as well as the data processing applications themselves, to client machines running arbitrary operating systems. Web applications with secure online web access allow operators to work anywhere, without the often cumbersome and bandwidth-hungry use of secure shell or virtual private networks. Furthermore, web applications can seamlessly access third-party data repositories to acquire additional information, such as maps. Finally, the use of HTML email brought the possibility of specialized web applications to be used in email clients. This is the case of EWHTMLEmail, which produces event notification emails that are in fact simple web applications for plotting relevant seismic data. Providing web services as part of Earthworm has enabled a number of other tools as well. One is ISTI's EZ Earthworm, a web-based command and control system for an otherwise command line driven system; another is a waveform web service. The waveform web service serves Earthworm data to additional web clients for plotting, picking, and other web-based processing tools. The current Earthworm waveform web service hosts an advanced plotting capability for providing views of event-based waveforms from a Mole database served by Moleserv. The current trend towards the usage of cloud services supported by web applications is driving improvements in JavaScript, CSS and HTML, as well as faster and more efficient web browsers, including mobile ones. It is foreseeable that in the near future web applications will be as powerful and efficient as native applications. Hence, the work described here is a first step towards bringing the Open Source Earthworm seismic data processing system to this new paradigm.
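As a concrete illustration of what a web client of such services might do, the sketch below pulls QuakeML from an HTTP endpoint and lists events; the URL and query string are placeholders, not a documented Moleserv interface.

```python
# Minimal sketch of a web client consuming QuakeML over HTTP; the endpoint URL
# and query parameters are hypothetical placeholders, not Moleserv's actual API.
import urllib.request
import xml.etree.ElementTree as ET

url = "http://earthworm.example.org:8080/quakeml?starttime=2014-01-01"  # placeholder

with urllib.request.urlopen(url, timeout=10) as resp:
    root = ET.fromstring(resp.read())

def local(tag):
    # QuakeML elements are namespaced; compare on the local tag name only
    return tag.rsplit("}", 1)[-1]

for event in (e for e in root.iter() if local(e.tag) == "event"):
    mags = [v.text for m in event.iter() if local(m.tag) == "mag"
            for v in m.iter() if local(v.tag) == "value"]
    print(event.attrib.get("publicID", "unknown event"), "magnitude(s):", mags)
```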
Using web-based observations to identify thresholds of a person's stability in a flow
NASA Astrophysics Data System (ADS)
Milanesi, L.; Pilotti, M.; Bacchi, B.
2016-10-01
Flood risk assessment and mitigation are important tasks that should take advantage of rational vulnerability models to increase their effectiveness. These models are usually identified through a relevant set of laboratory experiments. However, there is growing evidence that these tests are not fully representative of the variety of conditions that characterize real flood hazard situations. This paper suggests a citizen science-based and innovative approach to obtain information from web resources for the calibration of people's vulnerability models. A comprehensive study employing commonly used web engines allowed the collection of a wide set of documents showing real risk situations for people impacted by floods, classified according to the stability of the involved subjects. A procedure to extrapolate the flow depth and velocity from the video frames is developed and its reliability is verified by comparing the results with observation. The procedure is based on the statistical distribution of the population height employing a direct uncertainty propagation method. The results complement the experimental literature data and conceptual models. The growing availability of online information will progressively increase the sample size on which the procedure is based and will eventually lead to the identification of a probability surface describing the transition between stability and instability conditions of individuals in a flow.
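The uncertainty-propagation step can be illustrated with a simple Monte Carlo version of the idea: a water level read from a video frame as a fraction of a person's height is converted into a flow-depth distribution by sampling from a stature distribution. The height statistics and the Monte Carlo formulation below are illustrative assumptions, not the authors' exact procedure.

```python
# Illustrative Monte Carlo propagation: a water level observed at a known
# fraction of a person's height is converted into a flow-depth distribution.
# The height statistics are assumed round numbers, not values from the study.
import random
import statistics

HEIGHT_MEAN_M = 1.70   # assumed mean adult stature
HEIGHT_SD_M = 0.09     # assumed standard deviation of stature

def depth_samples(submergence_fraction, n=100_000, seed=1):
    """Water level read from a frame as a fraction of the subject's height
    (e.g. 0.5 ~ roughly hip level) converted into flow-depth samples."""
    rng = random.Random(seed)
    return [submergence_fraction * rng.gauss(HEIGHT_MEAN_M, HEIGHT_SD_M)
            for _ in range(n)]

samples = depth_samples(0.5)
q = statistics.quantiles(samples, n=20)   # 5%, 10%, ..., 95% cut points
print(f"depth ~ {statistics.mean(samples):.2f} m "
      f"(90% interval {q[0]:.2f}-{q[-1]:.2f} m)")
```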
NASA Astrophysics Data System (ADS)
Bachelet, D. M.
2014-12-01
Climate change is projected to jeopardize ecosystems in the Pacific Northwest. Managing ecosystems for future resilience requires collaboration, innovation and communication. The abundance of data and documents describing the uncertainty around both climate change projections and impacts has become challenging to managers who have little funding and limited time to digest and incorporate these materials into planning and implementation documents. We worked with US Forest Service and BLM managers to help them develop vulnerability assessments and identify on-the-ground strategies to address climate change challenges on the federal lands in northwest Oregon (Siuslaw, Willamette and Mt. Hood National Forests; Eugene and Salem BLM Districts). We held workshops to promote dialogue about climate change, which were particularly effective in fostering discussions between the managers who often do not have the time to share their knowledge and compare experiences across administrative boundaries. We used the Adaptation for Conservation Targets (ACT) framework to identify measurable management objectives and rapidly assess local vulnerabilities. We used databasin.org to centralize usable information, including state-of-the-art CMIP5 climate projections, for the mandated assessments of vulnerability and resilience. We introduced participants to a decision support framework providing opportunities to develop more effective adaptation strategies. We built a special web page to hold the information gathered at the workshops and provide easy access to climate change information. We are now working with several Landscape Conservation Cooperatives (LCCs) to design gateways - conservation atlases - to their relevant data repositories on databasin.org and working with them to develop web tools that can provide usable information for their own vulnerability assessments.
NASA Astrophysics Data System (ADS)
Wibonele, Kasanda J.; Zhang, Yanqing
2002-03-01
A web data mining system using granular computing and ASP programming is proposed. It is a web-based application which allows web users to submit survey data for many different companies. The survey is a collection of questions that will help these companies develop and improve their business and customer service with their clients through analysis of the survey data. The web application allows users to submit data from anywhere. All the survey data are collected into a database for further analysis. An administrator of the web application can log in to the system and view all the data submitted. The web application resides on a web server, and the database resides on an MS SQL server.
A web-based tool for ranking landslide mitigation measures
NASA Astrophysics Data System (ADS)
Lacasse, S.; Vaciago, G.; Choi, Y. J.; Kalsnes, B.
2012-04-01
As part of the research done in the European project SafeLand "Living with landslide risk in Europe: Assessment, effects of global change, and risk management strategies", a compendium of structural and non-structural mitigation measures for different landslide types in Europe was prepared, and the measures were assembled into a web-based "toolbox". Emphasis was placed on providing a rational and flexible framework applicable to existing and future mitigation measures. The purpose of the web-based toolbox is to assist decision-making and to guide the user in the choice of the most appropriate mitigation measures. The mitigation measures were classified into three categories, describing whether the mitigation measures addressed the landslide hazard, the vulnerability or the elements at risk themselves. The measures considered include structural measures reducing the hazard and non-structural mitigation measures reducing either the hazard or the consequences (or the vulnerability and exposure of elements at risk). The structural measures include surface protection and control of surface erosion; measures modifying the slope geometry and/or mass distribution; measures modifying the surface water regime - surface drainage; measures modifying the groundwater regime - deep drainage; measures modifying the mechanical characteristics of the unstable mass; transfer of loads to more competent strata; retaining structures (to modify slope geometry and/or to transfer stress to a competent layer); deviating the path of landslide debris; dissipating the energy of debris flows; and arresting and containing landslide debris or rock fall. The non-structural mitigation measures, reducing either the hazard or the consequences, include: early warning systems; restricting or discouraging construction activities; increasing the resistance or coping capacity of elements at risk; relocation of elements at risk; and sharing of risk through insurance. The measures are described in the toolbox with fact sheets providing a brief description, guidance on design, schematic details, practical examples and references for each mitigation measure. Each of the measures was given a score on its ability and applicability for different types of landslides and boundary conditions, and a decision support matrix was established. The web-based toolbox organizes the information in the compendium and provides an algorithm to rank the measures on the basis of the decision support matrix and of the risk level estimated at the site. The toolbox includes a description of the case under study and offers a simplified option for estimating the hazard and risk levels of the slide at hand. The user selects the mitigation measures to be included in the assessment. The toolbox then ranks, with built-in assessment factors and weights and/or with user-defined ranking values and criteria, the mitigation measures included in the analysis. The toolbox includes data management, e.g. saving data half-way through an analysis, returning to an earlier case, looking up prepared examples or looking up information on mitigation measures. The toolbox also generates a report and has user-forum and help features. The presentation will give an overview of the mitigation measures considered and examples of the use of the toolbox, and will take the attendees through the application of the toolbox.
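The ranking step itself is not detailed above; a weighted-score ranking over a decision support matrix of the kind described can be sketched as follows, with measure names, scores and weights invented for illustration rather than taken from the SafeLand toolbox.

```python
# Toy decision-support matrix: ability/applicability scores (0-3) of each
# mitigation measure against a few criteria, combined with user-defined weights.
# All entries are illustrative, not values from the SafeLand toolbox.

criteria_weights = {"hazard_reduction": 0.5, "cost": 0.2,
                    "speed_of_implementation": 0.15, "maintenance": 0.15}

scores = {
    "deep drainage":        {"hazard_reduction": 3, "cost": 1,
                             "speed_of_implementation": 2, "maintenance": 1},
    "retaining structure":  {"hazard_reduction": 3, "cost": 0,
                             "speed_of_implementation": 1, "maintenance": 2},
    "early warning system": {"hazard_reduction": 1, "cost": 2,
                             "speed_of_implementation": 3, "maintenance": 2},
}

def rank(scores, weights):
    # total weighted score per measure, highest first
    totals = {m: sum(weights[c] * s for c, s in crit.items())
              for m, crit in scores.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

for measure, total in rank(scores, criteria_weights):
    print(f"{measure:22s} {total:.2f}")
```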
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-13
... (Internet Student CPR Web Registration Application); Comment Request AGENCY: Veterans Health Administration... web registration application. DATES: Written comments and recommendations on the proposed collection.... Title: Internet Student CPR Web Registration Application, VA Form 10-0468. OMB Control Number: 2900-0746...
Changes in host-parasitoid food web structure with elevation.
Maunsell, Sarah C; Kitching, Roger L; Burwell, Chris J; Morris, Rebecca J
2015-03-01
Gradients in elevation are increasingly used to investigate how species respond to changes in local climatic conditions. Whilst many studies have shown elevational patterns in species richness and turnover, little is known about how food web structure is affected by elevation. Contrasting responses of predator and prey species to elevation may lead to changes in food web structure. We investigated how the quantitative structure of a herbivore-parasitoid food web changes with elevation in an Australian subtropical rain forest. On four occasions, spread over 1 year, we hand-collected leaf miners at twelve sites, along three elevational gradients (between 493 m and 1159 m a.s.l). A total of 5030 insects, including 603 parasitoids, were reared, and summary food webs were created for each site. We also carried out a replicated manipulative experiment by translocating an abundant leaf-mining weevil Platynotocis sp., which largely escaped parasitism at high elevations (≥ 900 m a.s.l.), to lower, warmer elevations, to test if it would experience higher parasitism pressure. We found strong evidence that the environmental change that occurs with increasing elevation affects food web structure. Quantitative measures of generality, vulnerability and interaction evenness decreased significantly with increasing elevation (and decreasing temperature), whilst elevation did not have a significant effect on connectance. Mined plant composition also had a significant effect on generality and vulnerability, but not on interaction evenness. Several relatively abundant species of leaf miner appeared to escape parasitism at higher elevations, but contrary to our prediction, Platynotocis sp. did not experience greater levels of parasitism when translocated to lower elevations. Our study indicates that leaf-mining herbivores and their parasitoids respond differently to environmental conditions imposed by elevation, thus producing structural changes in their food webs. Increasing temperatures and changes in vegetation communities that are likely to result from climate change may have a restructuring effect on host-parasitoid food webs. Our translocation experiment, however, indicated that leaf miners currently escaping parasitism at high elevations may not automatically experience higher parasitism under warmer conditions and future changes in food web structure may depend on the ability of parasitoids to adapt to novel hosts. © 2014 The Authors. Journal of Animal Ecology © 2014 British Ecological Society.
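For readers unfamiliar with these food-web metrics, the unweighted versions of connectance, generality and vulnerability can be computed directly from a binary host-parasitoid matrix, as sketched below; the study itself used quantitative (abundance-weighted) versions of the indices, and the matrix here is invented.

```python
# Unweighted food-web metrics from a binary host (rows) x parasitoid (columns)
# interaction matrix; the matrix is a made-up example, not data from the study.
web = [          # rows: leaf-miner hosts, columns: parasitoid species
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
]

links = sum(sum(row) for row in web)
hosts, parasitoids = len(web), len(web[0])

connectance = links / (hosts * parasitoids)   # realized fraction of possible links
generality = links / parasitoids              # mean number of hosts per parasitoid
vulnerability = links / hosts                 # mean number of parasitoids per host

print(f"L={links}  connectance={connectance:.2f}  "
      f"generality={generality:.2f}  vulnerability={vulnerability:.2f}")
```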
SSE-GIS v1.03 Web Mapping Application Now Available
Atmospheric Science Data Center
2018-03-16
SSE-GIS v1.03 Web Mapping Application Now Available Wednesday, July 6, 2016 ... you haven’t already noticed the link to the new SSE-GIS web application on the SSE homepage entitled “GIS Web Mapping Applications and Services”, we invite you to visit the site. ...
WebViz: A web browser based application for collaborative analysis of 3D data
NASA Astrophysics Data System (ADS)
Ruegg, C. S.
2011-12-01
In the age of high-speed Internet, where people can interact instantly, scientific tools have lacked technology which can incorporate this concept of communication using the web. To solve this issue a web application for geological studies has been created, tentatively titled WebViz. This web application utilizes tools provided by Google Web Toolkit to create an AJAX web application capable of features found in non-web-based software. Using these tools, a web application can be created that acts as a piece of software usable from anywhere in the globe with a reasonably fast Internet connection. An application of this technology can be seen with data regarding the recent tsunami from the major Japan earthquakes. After constructing the appropriate data to fit a computer rendering software package called HVR, WebViz can request images of the tsunami data and display them to anyone who has access to the application. This convenience alone makes WebViz a viable solution, but the option to interact with these data together with others around the world makes WebViz a serious computational tool. WebViz can also be used on any JavaScript-enabled browser, such as those found on modern tablets and smart phones, over a fast wireless connection. Because WebViz is currently built using Google Web Toolkit, the portability of the application is in its most efficient form. Though many developers have been involved with the project, each person has contributed to increasing the usability and speed of the application. In the project's most recent form a dramatic speed increase has been achieved as well as a more efficient user interface. The speed increase has been informally noticed in recent uses of the application in China and Australia, with the hosting server being located at the University of Minnesota. The user interface has been improved not only in appearance but also in functionality. A major function of the application is rotating the 3D object using buttons. These buttons have been given a new layout that makes their function easier to understand and is also easier to use with mobile devices. With these changes, WebViz is easier to control and use.
Estimation of vulnerability functions based on a global earthquake damage database
NASA Astrophysics Data System (ADS)
Spence, R. J. S.; Coburn, A. W.; Ruffle, S. J.
2009-04-01
Developing a better approach to the estimation of future earthquake losses, and in particular to the understanding of the inherent uncertainties in loss models, is vital to confidence in modelling potential losses in insurance or for mitigation. For most areas of the world there is currently insufficient knowledge of the current building stock for vulnerability estimates to be based on calculations of structural performance. In such areas, the most reliable basis for estimating vulnerability is performance of the building stock in past earthquakes, using damage databases, and comparison with consistent estimates of ground motion. This paper will present a new approach to the estimation of vulnerabilities using the recently launched Cambridge University Damage Database (CUEDD). CUEDD is based on data assembled by the Martin Centre at Cambridge University since 1980, complemented by other more-recently published and some unpublished data. The database assembles in a single, organised, expandable and web-accessible database, summary information on worldwide post-earthquake building damage surveys which have been carried out since the 1960's. Currently it contains data on the performance of more than 750,000 individual buildings, in 200 surveys following 40 separate earthquakes. The database includes building typologies, damage levels, location of each survey. It is mounted on a GIS mapping system and links to the USGS Shakemaps of each earthquake which enables the macroseismic intensity and other ground motion parameters to be defined for each survey and location. Fields of data for each building damage survey include: · Basic earthquake data and its sources · Details of the survey location and intensity and other ground motion observations or assignments at that location · Building and damage level classification, and tabulated damage survey results · Photos showing typical examples of damage. In future planned extensions of the database information on human casualties will also be assembled. The database also contains analytical tools enabling data from similar locations, building classes or ground motion levels to be assembled and thus vulnerability relationships derived for any chosen ground motion parameter, for a given class of building, and for particular countries or regions. The paper presents examples of vulnerability relationships for particular classes of buildings and regions of the world, together with the estimated uncertainty ranges. It will discuss the applicability of such vulnerability functions in earthquake loss assessment for insurance purposes or for earthquake risk mitigation.
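Deriving an empirical vulnerability relationship of the kind described reduces to grouping survey records by ground motion level and computing damage-grade frequencies for one building class; the sketch below uses invented records and field names, not the CUEDD schema.

```python
# Sketch of deriving empirical vulnerability points from building-damage survey
# records; the records and field names are invented, not CUEDD data or schema.
from collections import defaultdict

records = [  # (building_class, macroseismic_intensity, damage_grade 0-5)
    ("unreinforced_masonry", 7, 2), ("unreinforced_masonry", 7, 3),
    ("unreinforced_masonry", 8, 4), ("unreinforced_masonry", 8, 3),
    ("unreinforced_masonry", 8, 5), ("rc_frame", 8, 1),
]

def vulnerability_points(records, building_class, damage_threshold=3):
    """P(damage grade >= threshold) per intensity, for one building class."""
    counts = defaultdict(lambda: [0, 0])          # intensity -> [damaged, surveyed]
    for cls, intensity, grade in records:
        if cls != building_class:
            continue
        counts[intensity][1] += 1
        if grade >= damage_threshold:
            counts[intensity][0] += 1
    return {i: d / n for i, (d, n) in sorted(counts.items())}

print(vulnerability_points(records, "unreinforced_masonry"))
# {7: 0.5, 8: 1.0} -> points to which a fragility curve could then be fitted
```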
ERIC Educational Resources Information Center
Keng, Tan Chin; Ching, Yeoh Kah
2015-01-01
The use of web applications has become a trend in many disciplines including education. In view of the influence of web application in education, this study examines web application technologies that could enhance undergraduates' learning experiences, with focus on Quantity Surveying (QS) and Information Technology (IT) undergraduates. The…
Just-in-time Database-Driven Web Applications
2003-01-01
"Just-in-time" database-driven Web applications are inexpensive, quickly-developed software that can be put to many uses within a health care organization. Database-driven Web applications garnered 73873 hits on our system-wide intranet in 2002. They enabled collaboration and communication via user-friendly Web browser-based interfaces for both mission-critical and patient-care-critical functions. Nineteen database-driven Web applications were developed. The application categories that comprised 80% of the hits were results reporting (27%), graduate medical education (26%), research (20%), and bed availability (8%). The mean number of hits per application was 3888 (SD = 5598; range, 14-19879). A model is described for just-in-time database-driven Web application development and an example given with a popular HTML editor and database program. PMID:14517109
The problem of assessing risk from mercury across the nation is extremely complex involving integration of 1) our understanding of the methylation process in ecosystems, 2) the identification and spatial distribution of sensitive populations, and 3) the spatial pattern of mercury...
As part of an EPA/USGS project to predict the relative vulnerability of near-coastal species to climate change, we have synthesized in a web-based tool, the Coastal Biogeographic Risk Analysis Tool (CBRAT), the biogeographic distributions and abundances of bivalves, found in dept...
Biotool2Web: creating simple Web interfaces for bioinformatics applications.
Shahid, Mohammad; Alam, Intikhab; Fuellen, Georg
2006-01-01
Currently there are many bioinformatics applications being developed, but there is no easy way to publish them on the World Wide Web. We have developed a Perl script, called Biotool2Web, which makes the task of creating web interfaces for simple ('home-made') bioinformatics applications quick and easy. Biotool2Web uses an XML document containing the parameters to run the tool on the Web, and generates the corresponding HTML and common gateway interface (CGI) files ready to be published on a web server. This tool is available for download at URL http://www.uni-muenster.de/Bioinformatics/services/biotool2web/ Georg Fuellen (fuellen@alum.mit.edu).
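The concept of generating a web front end from a parameter description can be shown in a few lines; the sketch below mimics the idea in Python rather than Perl, and the XML layout is an assumption, not Biotool2Web's actual schema.

```python
# Concept sketch: turn a small XML description of a tool's parameters into an
# HTML form. The XML layout below is hypothetical, not Biotool2Web's schema.
import xml.etree.ElementTree as ET

tool_xml = """
<tool name="revcomp" script="revcomp.cgi">
  <param name="sequence" label="DNA sequence" type="textarea"/>
  <param name="format" label="Output format" type="text"/>
</tool>
"""

def xml_to_form(xml_text):
    tool = ET.fromstring(xml_text)
    rows = []
    for p in tool.findall("param"):
        widget = ("<textarea name='{n}'></textarea>" if p.get("type") == "textarea"
                  else "<input type='text' name='{n}'/>").format(n=p.get("name"))
        rows.append(f"<p>{p.get('label')}: {widget}</p>")
    return (f"<form action='{tool.get('script')}' method='post'>\n"
            + "\n".join(rows) + "\n<input type='submit'/>\n</form>")

print(xml_to_form(tool_xml))
```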
Specification Patent Management for Web Application Platform Ecosystem
NASA Astrophysics Data System (ADS)
Fukami, Yoshiaki; Isshiki, Masao; Takeda, Hideaki; Ohmukai, Ikki; Kokuryo, Jiro
The diversified usage of web applications has encouraged the disintegration of the web platform into the management of identification and of applications. Users make use of various kinds of data linked to their identity with multiple applications on social web platforms such as Facebook or MySpace. Competition has emerged among web application platforms. Platformers can design their relationships with developers by controlling the patents on their own specifications and by adopting open technologies developed by external organizations. Platformers choose how far to open up according to the features of the specification and their own position. Patent management of specifications has come to be a key success factor in building competitive web application platforms. However, the different ways of attracting external developers, such as standardization and open source, have not yet been discussed and analyzed together.
Malkin, Mathew R.; Lenart, John; Stier, Gary R.; Gatling, Jason W.; Applegate II, Richard L.
2016-01-01
Objectives This study compared admission rates to a United States anesthesiology residency program for applicants completing face-to-face versus web-based interviews during the admissions process. We also explored factors driving applicants to select each interview type. Methods The 211 applicants invited to interview for admission to our anesthesiology residency program during the 2014-2015 application cycle were participants in this pilot observational study. Of these, 141 applicants selected face-to-face interviews, 53 applicants selected web-based interviews, and 17 applicants declined to interview. Data regarding applicants' reasons for selecting a particular interview type were gathered using an anonymous online survey after interview completion. Residency program admission rates and survey answers were compared between applicants completing face-to-face versus web-based interviews. Results One hundred twenty-seven (75.1%) applicants completed face-to-face and 42 (24.9%) completed web-based interviews. The admission rate to our residency program was not significantly different between applicants completing face-to-face versus web-based interviews. One hundred eleven applicants completed post-interview surveys. The most common reasons for selecting web-based interviews were conflict of interview dates between programs, travel concerns, or financial limitations. Applicants selected face-to-face interviews due to a desire to interact with current residents, or geographic proximity to the residency program. Conclusions These results suggest that completion of web-based interviews is a viable alternative to completion of face-to-face interviews, and that choice of interview type does not affect the rate of applicant admission to the residency program. Web-based interviews may be of particular interest to applicants applying to a large number of programs, or with financial limitations. PMID:27039029
NASA Astrophysics Data System (ADS)
Rocchi, Marta; Scotti, Marco; Micheli, Fiorenza; Bodini, Antonio
2017-01-01
Ecosystem-Based Management (EBM) aims to support the protection of natural ecosystems and to improve economic activities. It requires considering all of the actors interacting in social-ecological systems (e.g., fish and fishers) in the understanding that their interplay determines the dynamic behavior of the single actors as well as that of the system as a whole. Connections are thus central to EBM. Within the ecological dimension of socio-ecological systems, interactions between species define such connections. Understanding how connections affect ecosystem and species dynamics is often impaired by a lack of data. We propose food web network analysis as a tool to help bridge the gap between EBM theory and practice in data-poor contexts, and illustrate this approach through its application to a coastal marine ecosystem in Baja California Sur, Mexico. First, we calculated centrality indices to identify which key (i.e., most central) species must be considered when designing strategies for sustainable resource management. Second, we analyzed the resilience of the system by measuring changes in food web structure due to the local extinction of vulnerable species (i.e., by mimicking the possible effect of excessive fishing pressure). The consequences of species removals were quantified in terms of impacts on global structural indices and species' centrality indices. Overall, we found that this coastal ecosystem shows high resilience to species loss. We identified species (e.g., Octopus sp. and the kelp bass, Paralabrax clathratus) whose protection could further decrease the risk of potential negative impacts of fishing activities on the Baja California Sur food web. This work introduces an approach that can be applied to other ecosystems to aid the implementation of EBM in data-poor contexts.
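The two analysis steps described, centrality ranking and simulated loss of a vulnerable species, map onto a few network-analysis calls; the sketch below uses a small invented web rather than the Baja California Sur data.

```python
# Sketch of the two steps described above on a toy food web (edges point from
# prey to predator); species and links are invented, not the study's data.
import networkx as nx

G = nx.DiGraph([
    ("kelp", "urchin"), ("kelp", "kelp_bass"), ("urchin", "octopus"),
    ("octopus", "kelp_bass"), ("small_fish", "kelp_bass"),
    ("small_fish", "octopus"), ("plankton", "small_fish"),
])

# 1) centrality indices to flag structurally important (key) species
ranked = sorted(nx.betweenness_centrality(G).items(), key=lambda kv: -kv[1])
print("most central:", ranked[:3])

# 2) simulated local extinction of a vulnerable, heavily fished species
before = nx.number_of_edges(G) / (G.number_of_nodes() * (G.number_of_nodes() - 1))
H = G.copy()
H.remove_node("octopus")
after = nx.number_of_edges(H) / (H.number_of_nodes() * (H.number_of_nodes() - 1))
print(f"directed connectance before={before:.2f}, after removal={after:.2f}")
```

Comparing such structural indices before and after removals is one simple way to quantify the resilience-to-species-loss argument made above.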
Displaying R spatial statistics on Google dynamic maps with web applications created by Rwui
2012-01-01
Background The R project includes a large variety of packages designed for spatial statistics. Google dynamic maps provide web based access to global maps and satellite imagery. We describe a method for displaying directly the spatial output from an R script on to a Google dynamic map. Methods This is achieved by creating a Java based web application which runs the R script and then displays the results on the dynamic map. In order to make this method easy to implement by those unfamiliar with programming Java based web applications, we have added the method to the options available in the R Web User Interface (Rwui) application. Rwui is an established web application for creating web applications for running R scripts. A feature of Rwui is that all the code for the web application being created is generated automatically so that someone with no knowledge of web programming can make a fully functional web application for running an R script in a matter of minutes. Results Rwui can now be used to create web applications that will display the results from an R script on a Google dynamic map. Results may be displayed as discrete markers and/or as continuous overlays. In addition, users of the web application may select regions of interest on the dynamic map with mouse clicks and the coordinates of the region of interest will automatically be made available for use by the R script. Conclusions This method of displaying R output on dynamic maps is designed to be of use in a number of areas. Firstly it allows statisticians, working in R and developing methods in spatial statistics, to easily visualise the results of applying their methods to real world data. Secondly, it allows researchers who are using R to study health geographics data, to display their results directly onto dynamic maps. Thirdly, by creating a web application for running an R script, a statistician can enable users entirely unfamiliar with R to run R coded statistical analyses of health geographics data. Fourthly, we envisage an educational role for such applications. PMID:22998945
Displaying R spatial statistics on Google dynamic maps with web applications created by Rwui.
Newton, Richard; Deonarine, Andrew; Wernisch, Lorenz
2012-09-24
The R project includes a large variety of packages designed for spatial statistics. Google dynamic maps provide web based access to global maps and satellite imagery. We describe a method for displaying directly the spatial output from an R script on to a Google dynamic map. This is achieved by creating a Java based web application which runs the R script and then displays the results on the dynamic map. In order to make this method easy to implement by those unfamiliar with programming Java based web applications, we have added the method to the options available in the R Web User Interface (Rwui) application. Rwui is an established web application for creating web applications for running R scripts. A feature of Rwui is that all the code for the web application being created is generated automatically so that someone with no knowledge of web programming can make a fully functional web application for running an R script in a matter of minutes. Rwui can now be used to create web applications that will display the results from an R script on a Google dynamic map. Results may be displayed as discrete markers and/or as continuous overlays. In addition, users of the web application may select regions of interest on the dynamic map with mouse clicks and the coordinates of the region of interest will automatically be made available for use by the R script. This method of displaying R output on dynamic maps is designed to be of use in a number of areas. Firstly it allows statisticians, working in R and developing methods in spatial statistics, to easily visualise the results of applying their methods to real world data. Secondly, it allows researchers who are using R to study health geographics data, to display their results directly onto dynamic maps. Thirdly, by creating a web application for running an R script, a statistician can enable users entirely unfamiliar with R to run R coded statistical analyses of health geographics data. Fourthly, we envisage an educational role for such applications.
Hydrological Scenario Using Tools and Applications Available in enviroGRIDS Portal
NASA Astrophysics Data System (ADS)
Bacu, V.; Mihon, D.; Stefanut, T.; Rodila, D.; Cau, P.; Manca, S.; Soru, C.; Gorgan, D.
2012-04-01
Nowadays, decision makers as well as citizens are concerned with the sustainability and vulnerability of land management practices in various respects, and in particular with water quality and quantity in complex watersheds. The Black Sea Catchment is an important watershed in Central and Eastern Europe. In the FP7 project enviroGRIDS [1], a Web Portal was developed that incorporates different tools and applications focused on geospatial data management, hydrologic model calibration, execution and visualization, and training activities. This presentation highlights, from the end-user point of view, the scenario related to hydrological models using the tools and applications available in the enviroGRIDS Web Portal [2]. The development of SWAT (Soil and Water Assessment Tool) hydrological models is a well known procedure for hydrological specialists [3]. Starting from the primary data (information related to weather, soil properties, topography, vegetation, and land management practices of the particular watershed) that are used to develop SWAT hydrological models, through to specific reports about the water quality in the studied watershed, the hydrological specialist will use different applications available in the enviroGRIDS portal. The tools and applications available through the enviroGRIDS portal do not deal with the building up of the SWAT hydrological models themselves. They are mainly focused on: the calibration procedure (gSWAT [4]), which uses the GRID computational infrastructure to speed up the calibration process; development of specific scenarios (BASHYT [5]), which starts from an already calibrated SWAT hydrological model and defines new scenarios; execution of scenarios (gSWATSim [6]), which executes the scenarios exported from BASHYT; and visualization (BASHYT), which displays charts, tables and maps. Each application is built up as a stack of functional layers. We combine different layers of applications by vertical interoperability in order to build the desired complex functionality. On the other hand, the applications can collaborate at the same architectural levels, which represents the horizontal interoperability. Both the horizontal and vertical interoperability are accomplished by services and by exchanging data. The calibration procedure requires huge computational resources, which are provided by the Grid infrastructure. On the other hand, the scenario development through BASHYT requires a flexible way of interacting with the SWAT model in order to easily change the input model. The large user community of SWAT, from the enviroGRIDS consortium or outside, may greatly benefit from the tools and applications related to the calibration process, scenario development and execution in the enviroGRIDS portal. [1]. enviroGRIDS project, http://envirogrids.net/ [2]. Gorgan D., Abbaspour K., Cau P., Bacu V., Mihon D., Giuliani G., Ray N., Lehmann A., Grid Based Data Processing Tools and Applications for Black Sea Catchment Basin. IDAACS 2011 - The 6th IEEE International Conference on Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications, 15-17 September 2011, Prague. IEEE Computer Press, pp. 223-228 (2011). [3]. Soil and Water Assessment Tool, http://www.brc.tamus.edu/swat/index.html [4]. Bacu V., Mihon D., Rodila D., Stefanut T., Gorgan D., Grid Based Architectural Components for SWAT Model Calibration. HPCS 2011 - International Conference on High Performance Computing and Simulation, 4-8 July, Istanbul, Turkey, ISBN 978-1-61284-381-0, doi: 10.1109/HPCSim.2011.5999824, pp.
193-198 (2011). [5]. Manca S., Soru C., Cau P., Meloni G., Fiori M., A multi model and multiscale, GIS oriented Web framework based on the SWAT model to face issues of water and soil resource vulnerability. Presentation at the 5th International SWAT Conference, August 3-7, 2009, http://www.brc.tamus.edu/swat/4thswatconf/docs/rooma/session5/Cau-Bashyt.pdf [6]. Bacu V., Mihon D., Stefanut T., Rodila D., Gorgan D., Cau P., Manca S., Grid Based Services and Tools for Hydrological Model Processing and Visualization. SYNASC 2011 - 13 International Symposium on Symbolic and Numeric Algorithms for Scientific Computing (in press).
Web Services--A Buzz Word with Potentials
János T. Füstös
2006-01-01
The simplest definition of a web service is an application that provides a web API. The web API exposes the functionality of the solution to other applications. The web API relies on other Internet-based technologies to manage communications. The resulting web services are pervasive, vendor-independent, language-neutral, and very low-cost. The main purpose of a web API...
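As a concrete illustration of "an application that provides a web API", any HTTP client can consume such a service in a few lines; the endpoint below is a placeholder, not a real service.

```python
# Minimal web-API client: request a resource over HTTP and decode the JSON
# payload. The URL is a placeholder for whatever service exposes the API.
import json
import urllib.request

url = "https://api.example.org/v1/status"  # hypothetical endpoint
with urllib.request.urlopen(url, timeout=10) as resp:
    payload = json.loads(resp.read().decode("utf-8"))
print(payload)
```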
Creating Web-Based Scientific Applications Using Java Servlets
NASA Technical Reports Server (NTRS)
Palmer, Grant; Arnold, James O. (Technical Monitor)
2001-01-01
There are many advantages to developing web-based scientific applications. Any number of people can access the application concurrently. The application can be accessed from a remote location. The application becomes essentially platform-independent because it can be run from any machine that has internet access and can run a web browser. Maintenance and upgrades to the application are simplified since only one copy of the application exists in a centralized location. This paper details the creation of web-based applications using Java servlets. Java is a powerful, versatile programming language that is well suited to developing web-based programs. A Java servlet provides the interface between the central server and the remote client machines. The servlet accepts input data from the client, runs the application on the server, and sends the output back to the client machine. The type of servlet that supports the HTTP protocol will be discussed in depth. Among the topics the paper will discuss are how to write an http servlet, how the servlet can run applications written in Java and other languages, and how to set up a Java web server. The entire process will be demonstrated by building a web-based application to compute stagnation point heat transfer.
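The request-compute-respond pattern described for an HTTP servlet (accept input from the client, run the application on the server, return the output) can be sketched with Python's standard library in place of the Java servlet API; run_model below is a placeholder, not the paper's stagnation point heat transfer code.

```python
# Same accept-input / run-on-server / return-output pattern as an HTTP servlet,
# sketched with Python's standard library; run_model() is a placeholder, not
# the stagnation point heat transfer application described in the paper.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

def run_model(params):
    # placeholder computation standing in for the server-side application
    velocity = float(params.get("velocity", ["0"])[0])
    return {"velocity": velocity, "result": velocity ** 2}

class ModelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        params = parse_qs(urlparse(self.path).query)   # input from the client
        body = json.dumps(run_model(params)).encode()  # run the application
        self.send_response(200)                        # send output back
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), ModelHandler).serve_forever()
```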
Testing Web Applications with Mutation Analysis
ERIC Educational Resources Information Center
Praphamontripong, Upsorn
2017-01-01
Web application software uses new technologies that have novel methods for integration and state maintenance, amounting to new control flow mechanisms and new variable scoping. While modern web development technologies enhance the capabilities of web applications, they introduce challenges that current testing techniques do not adequately test…
Build, Buy, Open Source, or Web 2.0?: Making an Informed Decision for Your Library
ERIC Educational Resources Information Center
Fagan, Jody Condit; Keach, Jennifer A.
2010-01-01
When improving a web presence, today's libraries have a choice: using a free Web 2.0 application, opting for open source, buying a product, or building a web application. This article discusses how to make an informed decision for one's library. The authors stress that deciding whether to use a free Web 2.0 application, to choose open source, to…
Landslide hazard assessment : LIFE+IMAGINE project methodology and Liguria region use case
NASA Astrophysics Data System (ADS)
Spizzichino, Daniele; Campo, Valentina; Congi, Maria Pia; Cipolloni, Carlo; Delmonaco, Giuseppe; Guerrieri, Luca; Iadanza, Carla; Leoni, Gabriele; Trigila, Alessandro
2015-04-01
The scope of this work is to present a methodology developed for the analysis of potential impacts in areas prone to landslide hazard in the framework of the EC project LIFE+IMAGINE. The project aims to implement a web services-based infrastructure addressed to environmental analysis that integrates, in its own architecture, specifications and results from INSPIRE, SEIS and GMES. Existing web services have been customized to provide functionalities for supporting integrated environmental management. The implemented infrastructure has been applied to landslide risk scenarios, developed in selected pilot areas, aiming at: i) the application of standard procedures to implement a landslide risk analysis; ii) the definition of a procedure for the assessment of potential environmental impacts, based on a set of indicators to estimate the different exposed elements with their specific vulnerability in the pilot area. The landslide pilot and related scenario are focused on providing a simplified Landslide Risk Assessment (LRA) through: 1) a landslide inventory derived from available historical and recent databases and maps; 2) landslide susceptibility and hazard maps; 3) assessment of exposure and vulnerability for selected typologies of elements at risk; 4) implementation of a landslide risk scenario for different sets of exposed elements; 5) development of a use case; 6) definition of guidelines and best practices and production of thematic maps. The LRA has been implemented in the Liguria region, Italy, in two different catchment areas located in the Cinque Terre National Park, characterized by high landslide susceptibility and low resilience. The landslide risk impact analysis has been calibrated taking into account the socio-economic damage caused by landslides triggered by the October 2011 meteorological event. During this event, over 600 landslides were triggered in the selected pilot area. Most of the landslides affected the diffuse system of anthropogenic terraces and caused the direct disruption of the walls as well as the transportation of a large amount of loose sediment along the slopes and channels as an induced consequence of the event. Application of a spatial analysis detected ca. 400 critical points along the road network, with an average length of about 200 m. Over 1,000 buildings were affected and damaged by the event. The exposed population in the area involved in the event has been estimated at ca. 2,600 inhabitants. In the pilot area, 19 different typologies of Cultural Heritage were affected by landslide phenomena or located in zones classified as high landslide hazard. The final scope of the landslide scenario is to improve the awareness of hazard, exposure, vulnerability and landslide risk in the Cinque Terre National Park to the benefit of local authorities and the population. In addition, the results of the application will be used for: i) updating the land planning process in order to improve the resilience of local communities; ii) implementing cost-benefit analyses aimed at the definition of guidelines for sustainable landslide risk mitigation strategies; and iii) suggesting a general road map for the implementation of a local adaptation plan.
Development of grid-like applications for public health using Web 2.0 mashup techniques.
Scotch, Matthew; Yip, Kevin Y; Cheung, Kei-Hoi
2008-01-01
Development of public health informatics applications often requires the integration of multiple data sources. This process can be challenging due to issues such as different file formats, schemas, naming systems, and having to scrape the content of web pages. A potential solution to these system development challenges is the use of Web 2.0 technologies. In general, Web 2.0 technologies are new Internet services that encourage and value information sharing and collaboration among individuals. In this case report, we describe the development and use of Web 2.0 technologies, including Yahoo! Pipes, within a public health application that integrates animal, human, and temperature data to assess the risk of West Nile Virus (WNV) outbreaks. The results of development and testing suggest that while Web 2.0 applications are reasonable environments for rapid prototyping, they are not mature enough for large-scale public health data applications. The application, in fact a "system of systems," often failed due to varied timeouts for application response across web sites and services, internal caching errors, and software added to web sites by administrators to manage the load on their servers. In spite of these concerns, the results of this study demonstrate the potential value of grid computing and Web 2.0 approaches in public health informatics.
PaaS for web applications with OpenShift Origin
NASA Astrophysics Data System (ADS)
Lossent, A.; Rodriguez Peon, A.; Wagner, A.
2017-10-01
The CERN Web Frameworks team has deployed OpenShift Origin to facilitate the deployment of web applications and to improve efficiency in terms of computing resource usage. OpenShift leverages Docker containers and Kubernetes orchestration to provide a Platform-as-a-Service solution oriented toward web applications. We will review use cases and how OpenShift was integrated with other services such as source control, web site management and authentication services.
WIRM: An Open Source Toolkit for Building Biomedical Web Applications
Jakobovits, Rex M.; Rosse, Cornelius; Brinkley, James F.
2002-01-01
This article describes an innovative software toolkit that allows the creation of web applications that facilitate the acquisition, integration, and dissemination of multimedia biomedical data over the web, thereby reducing the cost of knowledge sharing. There is a lack of high-level web application development tools suitable for use by researchers, clinicians, and educators who are not skilled programmers. Our Web Interfacing Repository Manager (WIRM) is a software toolkit that reduces the complexity of building custom biomedical web applications. WIRM’s visual modeling tools enable domain experts to describe the structure of their knowledge, from which WIRM automatically generates full-featured, customizable content management systems. PMID:12386108
NASA Astrophysics Data System (ADS)
Zhai, Dongsheng; Liu, Chen
Since 2005, the term Web 2.0 has gradually become a hot topic on the Internet. Web 2.0 lets users, as distinct from webmasters or web coders, create web content. Web 2.0 has entered our work and our lives and has become an indispensable part of our web life. Its applications are already widespread in many fields on the Internet. So far, China has about 137 million netizens [1]; its Web 2.0 market is therefore so attractive that many sources of venture capital are flowing into the Chinese Web 2.0 market, and a lot of new Web 2.0 companies have been founded in China. However, the development of Web 2.0 in China is accompanied by some problems and obstacles. In this paper, we will mainly discuss Web 2.0 applications in China, with their current problems and future development trends.
78 FR 42775 - CGI Federal, Inc., and Custom Applications Management; Transfer of Data
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-17
... develop applications, Web sites, Web pages, web-based applications and databases, in accordance with EPA policies and related Federal standards and procedures. The Contractor will provide ...
Caught in the Web: How Online Advertising Exploits Children.
ERIC Educational Resources Information Center
Pasnik, Shelley
1997-01-01
Principals should be aware of Internet advertising targeted at children. According to a recent Center for Media Education study, many companies design online sites for children as a way to bypass adult authority and prey on children's vulnerabilities. Some companies use their online sites to develop brand loyalties or collect market-segment data…
76 FR 22625 - Reporting of Security Issues
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-22
...) Accessing the Government Printing Office's Web page at http://www.gpoaccess.gov/fr/index.html ; or (3... violations, threat information or criminal activities, vulnerabilities and intelligence was put in place...://data.bls.gov/cgi-bin/print.pl/oes/2009/may/naics2_48-49.htm and http://www.bls.gov/cpi/cpid1012.pdf...
Lehmann, Anthony; Guigoz, Yaniss; Ray, Nicolas; Mancosu, Emanuele; Abbaspour, Karim C.; Rouholahnejad Freund, Elham; Allenbach, Karin; De Bono, Andrea; Fasel, Marc; Gago-Silva, Ana; Bär, Roger; Lacroix, Pierre; Giuliani, Gregory
2017-01-01
The Black Sea catchment (BSC) is facing important demographic, climatic and landuse changes that may increase pollution, vulnerability and scarcity of water resources, as well as beach erosion through sea level rise. Limited access to reliable time-series monitoring data from environmental, statistical, and socio-economic sources is a major barrier to policy development and decision-making. To address these issues, a web-based platform was developed to enable discovery and access to key environmental information for the region. This platform covers: landuse, climate, and demographic scenarios; hydrology and related water vulnerability and scarcity; as well as beach erosion. Each data set has been obtained with state-of-the-art modelling tools from available monitoring data using appropriate validation methods. These analyses were conducted using global and regional data sets. The data sets are intended for national to regional assessments, for instance for prioritizing environmental protection projects and investments. Together they form a unique set of information, which lays out plausible future change scenarios for the BSC, for both scientific and policy purposes. PMID:28675383
Lehmann, Anthony; Guigoz, Yaniss; Ray, Nicolas; Mancosu, Emanuele; Abbaspour, Karim C; Rouholahnejad Freund, Elham; Allenbach, Karin; De Bono, Andrea; Fasel, Marc; Gago-Silva, Ana; Bär, Roger; Lacroix, Pierre; Giuliani, Gregory
2017-07-04
The Black Sea catchment (BSC) is facing important demographic, climatic and landuse changes that may increase pollution, vulnerability and scarcity of water resources, as well as beach erosion through sea level rise. Limited access to reliable time-series monitoring data from environmental, statistical, and socio-economic sources is a major barrier to policy development and decision-making. To address these issues, a web-based platform was developed to enable discovery and access to key environmental information for the region. This platform covers: landuse, climate, and demographic scenarios; hydrology and related water vulnerability and scarcity; as well as beach erosion. Each data set has been obtained with state-of-the-art modelling tools from available monitoring data using appropriate validation methods. These analyses were conducted using global and regional data sets. The data sets are intended for national to regional assessments, for instance for prioritizing environmental protection projects and investments. Together they form a unique set of information, which lays out plausible future change scenarios for the BSC, for both scientific and policy purposes.
NASA Astrophysics Data System (ADS)
Andreeva, J.; Dzhunov, I.; Karavakis, E.; Kokoszkiewicz, L.; Nowotka, M.; Saiz, P.; Tuckett, D.
2012-12-01
Improvements in web browser performance and web standards compliance, as well as the availability of comprehensive JavaScript libraries, provide an opportunity to develop functionally rich yet intuitive web applications that allow users to access, render and analyse data in novel ways. However, the development of such large-scale JavaScript web applications presents new challenges, in particular with regard to code sustainability and team-based work. We present an approach that meets the challenges of large-scale JavaScript web application design and development, including client-side model-view-controller architecture, design patterns, and JavaScript libraries. Furthermore, we show how the approach leads naturally to the encapsulation of the data source as a web API, allowing applications to be easily ported to new data sources. The Experiment Dashboard framework is used for the development of applications for monitoring the distributed computing activities of virtual organisations on the Worldwide LHC Computing Grid. We demonstrate the benefits of the approach for large-scale JavaScript web applications in this context by examining the design of several Experiment Dashboard applications for data processing, data transfer and site status monitoring, and by showing how they have been ported for different virtual organisations and technologies.
Development of a web application for water resources based on open source software
NASA Astrophysics Data System (ADS)
Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri P.
2014-01-01
This article presents research and development of a prototype web application for water resources using the latest advancements in Information and Communication Technologies (ICT), open source software and web GIS. The web application has three web services for: (1) managing, presenting and storing geospatial data, (2) supporting water resources modeling and (3) water resources optimization. The web application is developed using several programming languages (PHP, Ajax, JavaScript, Java), libraries (OpenLayers, JQuery) and open source software components (GeoServer, PostgreSQL, PostGIS). The presented web application has several main advantages: it is available all the time, it is accessible from everywhere, it creates a real-time multi-user collaboration platform, the programming language code and components are interoperable and designed to work in a distributed computer environment, it is flexible for adding additional components and services, and it is scalable depending on the workload. The application was successfully tested on a case study with concurrent multi-user access.
A Tactical Framework for Cyberspace Situational Awareness
2010-06-01
Command & Control: 1. VOIP Telephone; 2. Internet Chat; 3. Web App (TBMCS); 4. Email; 5. Web App (PEX); 6. Database (CAMS); 7. Database (ARMS); 8. Database (LogMod); 9. Resource (WWW); 10. Application (PFPS). Mission Planning: 1. Application (PFPS); 2. Email; 3. Web App (TBMCS); 4. Internet Chat ... 1. Web App (PEX); 2. Database (ARMS); 3. Web App (TBMCS); 4. Email; 5. Database (CAMS); 6. VOIP Telephone; 7. Application (PFPS); 8. Internet Chat; 9 ...
A topological framework for interactive queries on 3D models in the Web.
Figueiredo, Mauro; Rodrigues, José I; Silvestre, Ivo; Veiga-Pires, Cristina
2014-01-01
Several technologies exist to create 3D content for the web. With X3D, WebGL, and X3DOM, it is possible to visualize and interact with 3D models in a web browser. Frequently, three-dimensional objects are stored using the X3D file format for the web. However, there is no explicit topological information, which makes it difficult to design fast algorithms for applications that require adjacency and incidence data. This paper presents a new open source toolkit TopTri (Topological model for Triangle meshes) for Web3D servers that builds the topological model for triangular meshes of manifold or nonmanifold models. Web3D client applications using this toolkit make queries to the web server to get adjacent and incidence information of vertices, edges, and faces. This paper shows the application of the topological information to get minimal local points and iso-lines in a 3D mesh in a web browser. As an application, we present also the interactive identification of stalactites in a cave chamber in a 3D web browser. Several tests show that even for large triangular meshes with millions of triangles, the adjacency and incidence information is returned in real time making the presented toolkit appropriate for interactive Web3D applications.
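A minimal sketch of the kind of topological bookkeeping such a toolkit maintains, assuming a triangle mesh given as a list of vertex-index triples; this is a generic illustration, not the TopTri data structure or its web API:

```python
from collections import defaultdict

def build_topology(triangles):
    """Build vertex->faces incidence and edge->faces adjacency tables
    for a triangle mesh given as (v0, v1, v2) index triples."""
    vertex_faces = defaultdict(set)   # incidence: faces touching each vertex
    edge_faces = defaultdict(set)     # adjacency: faces sharing each edge
    for f, (a, b, c) in enumerate(triangles):
        for v in (a, b, c):
            vertex_faces[v].add(f)
        for edge in ((a, b), (b, c), (c, a)):
            edge_faces[tuple(sorted(edge))].add(f)
    return vertex_faces, edge_faces

# Two triangles sharing the edge (1, 2).
vf, ef = build_topology([(0, 1, 2), (1, 3, 2)])
print(vf[1])       # faces incident to vertex 1 -> {0, 1}
print(ef[(1, 2)])  # faces adjacent across edge (1, 2) -> {0, 1}
```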
A Topological Framework for Interactive Queries on 3D Models in the Web
Figueiredo, Mauro; Rodrigues, José I.; Silvestre, Ivo; Veiga-Pires, Cristina
2014-01-01
Several technologies exist to create 3D content for the web. With X3D, WebGL, and X3DOM, it is possible to visualize and interact with 3D models in a web browser. Frequently, three-dimensional objects are stored using the X3D file format for the web. However, there is no explicit topological information, which makes it difficult to design fast algorithms for applications that require adjacency and incidence data. This paper presents a new open source toolkit TopTri (Topological model for Triangle meshes) for Web3D servers that builds the topological model for triangular meshes of manifold or nonmanifold models. Web3D client applications using this toolkit make queries to the web server to get adjacent and incidence information of vertices, edges, and faces. This paper shows the application of the topological information to get minimal local points and iso-lines in a 3D mesh in a web browser. As an application, we present also the interactive identification of stalactites in a cave chamber in a 3D web browser. Several tests show that even for large triangular meshes with millions of triangles, the adjacency and incidence information is returned in real time making the presented toolkit appropriate for interactive Web3D applications. PMID:24977236
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumar, Prashant, E-mail: prashantkumar@csio.res.in; Academy of Scientific and Innovative Research—CSIO, Chandigarh 160030; Bansod, Baban K.S.
2015-02-15
Groundwater vulnerability maps are useful for decision making in land use planning and water resource management. This paper reviews the various groundwater vulnerability assessment models developed across the world. Each model has been evaluated in terms of its pros and cons and the environmental conditions of its application. The paper further discusses the validation techniques used for the vulnerability maps generated by the various models. Implicit challenges associated with the development of groundwater vulnerability assessment models have also been identified, with scientific consideration given to parameter relations and their selection. - Highlights: • Various index-based groundwater vulnerability assessment models have been discussed. • A comparative analysis of the models and their applicability in different hydrogeological settings has been discussed. • Research problems of the underlying vulnerability assessment models are also reported in this review paper.
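A minimal sketch of the index-based approach these models share, using a DRASTIC-style weighted sum of seven rated hydrogeological factors; the weights follow the commonly quoted generic DRASTIC scheme and the ratings are made-up values, so treat both as assumptions rather than a definitive implementation:

```python
# Generic DRASTIC-style index: DI = sum(weight_i * rating_i).
# Weights follow the commonly quoted generic DRASTIC scheme; ratings (1-10)
# below are invented for a single example cell.
WEIGHTS = {
    "Depth_to_water": 5, "Recharge": 4, "Aquifer_media": 3, "Soil_media": 2,
    "Topography": 1, "Impact_of_vadose_zone": 5, "Hydraulic_conductivity": 3,
}

def drastic_index(ratings, weights=WEIGHTS):
    """Weighted additive vulnerability index for one map cell."""
    return sum(weights[k] * ratings[k] for k in weights)

cell_ratings = {
    "Depth_to_water": 7, "Recharge": 6, "Aquifer_media": 8, "Soil_media": 5,
    "Topography": 9, "Impact_of_vadose_zone": 6, "Hydraulic_conductivity": 4,
}
print(drastic_index(cell_ratings))  # higher index -> higher intrinsic vulnerability
```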
Use of Web Technology to Access and Update College Plans
ERIC Educational Resources Information Center
Valeau, Edward J.; Luan, Jing
2007-01-01
In this study, the process and outcome of a web-based planning application, called Ports of Call, are discussed. The application allows college management to create, edit, and report out activities relating to college plans, all through a web browser. Its design was based on best practices in modern web technology and the application can be easily…
Simmons, Vani Nath; Heckman, Bryan W.; Fink, Angelina C.; Small, Brent J.; Brandon, Thomas H.
2015-01-01
Objective: College represents a window of opportunity to reach the sizeable number of cigarette smokers who are vulnerable to lifelong smoking. The underutilization of typical cessation programs suggests the need for novel and more engaging approaches for reaching college smokers. The aim of the present study was to test the efficacy of a dissonance-enhancing, Web-based experiential intervention for increasing smoking cessation motivation and behavior. Method: We used a 4-arm, randomized design to examine the efficacy of a Web-based, experiential smoking intervention (Web-Smoke). The control conditions included a didactic smoking intervention (Didactic), a group-based experiential intervention (Group), and a Web-based nutrition experiential intervention (Web-Nutrition). We recruited 341 college smokers. Primary outcomes were motivation to quit, assessed immediately postintervention, and smoking abstinence at 1 and 6 months following the intervention. Results: As hypothesized, the Web-Smoke intervention was more effective than control groups in increasing motivation to quit. At 6-month follow-up, the Web-Smoke intervention produced higher rates of smoking cessation than the Web-Nutrition control intervention. Daily smoking moderated intervention outcomes. Among daily smokers, the Web-Smoke intervention produced greater abstinence rates than both the Web-Nutrition and Didactic control conditions. Conclusion: Findings demonstrate the efficacy of a theory-based intervention delivered over the Internet for increasing motivation to quit and smoking abstinence among college smokers. The intervention has potential for translation and implementation as a secondary prevention strategy for college-aged smokers. PMID:23668667
Simmons, Vani Nath; Heckman, Bryan W; Fink, Angelina C; Small, Brent J; Brandon, Thomas H
2013-10-01
College represents a window of opportunity to reach the sizeable number of cigarette smokers who are vulnerable to lifelong smoking. The underutilization of typical cessation programs suggests the need for novel and more engaging approaches for reaching college smokers. The aim of the present study was to test the efficacy of a dissonance-enhancing, Web-based experiential intervention for increasing smoking cessation motivation and behavior. We used a 4-arm, randomized design to examine the efficacy of a Web-based, experiential smoking intervention (Web-Smoke). The control conditions included a didactic smoking intervention (Didactic), a group-based experiential intervention (Group), and a Web-based nutrition experiential intervention (Web-Nutrition). We recruited 341 college smokers. Primary outcomes were motivation to quit, assessed immediately postintervention, and smoking abstinence at 1 and 6 months following the intervention. As hypothesized, the Web-Smoke intervention was more effective than control groups in increasing motivation to quit. At 6-month follow-up, the Web-Smoke intervention produced higher rates of smoking cessation than the Web-Nutrition control intervention. Daily smoking moderated intervention outcomes. Among daily smokers, the Web-Smoke intervention produced greater abstinence rates than both the Web-Nutrition and Didactic control conditions. Findings demonstrate the efficacy of a theory-based intervention delivered over the Internet for increasing motivation to quit and smoking abstinence among college smokers. The intervention has potential for translation and implementation as a secondary prevention strategy for college-aged smokers. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
Web 2.0 and Pharmacy Education
Fox, Brent I.
2009-01-01
New types of social Internet applications (often referred to as Web 2.0) are becoming increasingly popular within higher education environments. Although developed primarily for entertainment and social communication within the general population, applications such as blogs, social video sites, and virtual worlds are being adopted by higher education institutions. These newer applications differ from standard Web sites in that they involve the users in creating and distributing information, hence effectively changing how the Web is used for knowledge generation and dispersion. Although Web 2.0 applications offer exciting new ways to teach, they should not be the core of instructional planning, but rather selected only after learning objectives and instructional strategies have been identified. This paper provides an overview of prominent Web 2.0 applications, explains how they are being used within education environments, and elaborates on some of the potential opportunities and challenges that these applications present. PMID:19960079
Web 2.0 and pharmacy education.
Cain, Jeff; Fox, Brent I
2009-11-12
New types of social Internet applications (often referred to as Web 2.0) are becoming increasingly popular within higher education environments. Although developed primarily for entertainment and social communication within the general population, applications such as blogs, social video sites, and virtual worlds are being adopted by higher education institutions. These newer applications differ from standard Web sites in that they involve the users in creating and distributing information, hence effectively changing how the Web is used for knowledge generation and dispersion. Although Web 2.0 applications offer exciting new ways to teach, they should not be the core of instructional planning, but rather selected only after learning objectives and instructional strategies have been identified. This paper provides an overview of prominent Web 2.0 applications, explains how they are being used within education environments, and elaborates on some of the potential opportunities and challenges that these applications present.
Using the World Wide Web: Applications for Marketing Educators.
ERIC Educational Resources Information Center
Stull, William A.; And Others
1996-01-01
This article introduces potential uses of the World Wide Web for marketing education, presents tips for navigating the web, and provides a sample of useful applications. Also provides suggestions for monitoring student use of the web. (JOW)
Web services as applications' integration tool: QikProp case study.
Laoui, Abdel; Polyakov, Valery R
2011-07-15
Web services are a technology that enables the integration of applications running on different platforms, primarily by using XML to enable communication among different computers over the Internet. A large number of applications were designed as stand-alone systems before the concept of Web services was introduced, and it is a challenge to integrate them into larger computational networks. A generally applicable method of wrapping stand-alone applications into Web services was developed and is described. To test the technology, it was applied to QikProp for DOS (Windows). Although the performance of the application did not change when it was delivered as a Web service, this form of deployment offered several advantages, such as simplified and centralized maintenance, a smaller number of licenses, and practically no training for the end user. Because by using the described approach almost any legacy application can be wrapped as a Web service, this form of delivery may be recommended as a global alternative to traditional deployment solutions. Copyright © 2011 Wiley Periodicals, Inc.
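A minimal sketch of the general wrapping idea, assuming a hypothetical command-line tool qikprop_cli that reads an input file and writes results to stdout; the endpoint, tool name and flags are placeholders, and this is not the wrapper described in the paper:

```python
import subprocess
import tempfile

from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/run", methods=["POST"])
def run_tool():
    """Accept an uploaded input file, run the legacy executable on it,
    and return its stdout as the web-service response."""
    with tempfile.NamedTemporaryFile(suffix=".in", delete=False) as tmp:
        tmp.write(request.files["input"].read())
        input_path = tmp.name
    # 'qikprop_cli' is a placeholder for the wrapped stand-alone program.
    result = subprocess.run(
        ["qikprop_cli", input_path], capture_output=True, text=True, timeout=300
    )
    return jsonify({"returncode": result.returncode, "output": result.stdout})

if __name__ == "__main__":
    app.run(port=8080)
```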
ERIC Educational Resources Information Center
Dehinbo, Johnson
2011-01-01
The widespread use of the Internet and the World Wide Web led to the availability of many platforms for developing dynamic Web application and the problem of choosing the most appropriate platform that will be easy to use for undergraduate students of web applications development in tertiary institutions. Students beginning to learn web…
Reducing Vulnerability of Ports and Harbors to Earthquake and Tsunami Hazards
Wood, Nathan J.; Good, James W.; Goodwin, Robert F.
2002-01-01
Recent scientific research suggests the Pacific Northwest could experience catastrophic earthquakes in the near future, both from distant and local sources, posing a significant threat to coastal communities. Damage could result from numerous earthquake-related hazards, such as severe ground shaking, soil liquefaction, landslides, land subsidence/uplift, and tsunami inundation. Because of their geographic location, ports and harbors are especially vulnerable to these hazards. Ports and harbors, however, are important components of many coastal communities, supporting numerous activities critical to the local and regional economy and possibly serving as vital post-event, response-recovery transportation links. A collaborative, multi-year initiative is underway to increase the resiliency of Pacific Northwest ports and harbors to earthquake and tsunami hazards, involving Oregon Sea Grant (OSG), Washington Sea Grant (WSG), the National Oceanic and Atmospheric Administration Coastal Services Center (CSC), and the U.S. Geological Survey Center for Science Policy (CSP). Specific products of this research, planning, and outreach initiative include a regional stakeholder issues and needs assessment, a community-based mitigation planning process, a Geographic Information System (GIS)-based vulnerability assessment methodology, an educational web site and a regional data archive. This paper summarizes these efforts, including results of two pilot port-harbor community projects, one in Yaquina Bay, Oregon and the other in Sinclair Inlet, Washington. Finally, plans are outlined for outreach to other port and harbor communities in the Pacific Northwest and beyond, using "getting started" workshops and a web-based tutorial.
Opal web services for biomedical applications.
Ren, Jingyuan; Williams, Nadya; Clementi, Luca; Krishnan, Sriram; Li, Wilfred W
2010-07-01
Biomedical applications have become increasingly complex, and they often require large-scale high-performance computing resources with a large number of processors and memory. The complexity of application deployment and the advances in cluster, grid and cloud computing require new modes of support for biomedical research. Scientific Software as a Service (sSaaS) enables scalable and transparent access to biomedical applications through simple standards-based Web interfaces. Towards this end, we built a production web server (http://ws.nbcr.net) in August 2007 to support the bioinformatics application called MEME. The server has since grown to include docking analysis with AutoDock and AutoDock Vina, electrostatic calculations using PDB2PQR and APBS, and off-target analysis using SMAP. All the applications on the servers are powered by Opal, a toolkit that allows users to wrap scientific applications easily as web services without any modification to the scientific codes, by writing simple XML configuration files. Opal allows both web form-based access and programmatic access to all our applications. The Opal toolkit currently supports SOAP-based Web service access to a number of popular applications from the National Biomedical Computation Resource (NBCR) and affiliated collaborative and service projects. In addition, Opal's programmatic access capability allows our applications to be accessed through many workflow tools, including Vision, Kepler, Nimrod/K and VisTrails. From mid-August 2007 to the end of 2009, we have successfully executed 239,814 jobs. The number of successfully executed jobs more than doubled from 205 to 411 per day between 2008 and 2009. The Opal-enabled service model is useful for a wide range of applications. It provides for interoperation with other applications with Web Service interfaces, and allows application developers to focus on the scientific tool and workflow development. Web server availability: http://ws.nbcr.net.
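A minimal sketch of programmatic SOAP access of the kind described above, using the third-party zeep client; the WSDL URL and the launchJob operation name and its argList parameter are assumptions for illustration, not a verified description of the Opal interface:

```python
from zeep import Client

# Placeholder WSDL for an Opal-style application service; the real service
# location and operation names should be taken from the server's documentation.
WSDL = "http://ws.example.org/opal2/services/SomeAppService?wsdl"

client = Client(WSDL)

# Assumed operation: submit a job with command-line arguments; the returned
# object would typically contain a job identifier and an initial status.
job = client.service.launchJob(argList="-i input.fasta")
print(job)
```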
Moore, D R; Feurer, I D; Zavala, E Y; Shaffer, D; Karp, S; Hoy, H; Moore, D E
2013-02-01
Most centers utilize phone or written surveys to screen candidates who self-refer to be living kidney donors. To increase efficiency and reduce resource utilization, we developed a web-based application to screen kidney donor candidates. The aim of this study was to evaluate the use of this web-based application. Method and time of referral were tabulated and descriptive statistics summarized demographic characteristics. Time series analyses evaluated use over time. Between January 1, 2011 and March 31, 2012, 1200 candidates self-referred to be living kidney donors at our center. Eight hundred one candidates (67%) completed the web-based survey and 399 (33%) completed a phone survey. Thirty-nine percent of donors accessed the application on nights and weekends. Post-implementation of the web-based application, there was a statistically significant increase (p < 0.001) in the number of self-referrals via the web-based application as opposed to telephone contact. There was also a significant increase (p = 0.025) in the total number of self-referrals post-implementation, from 61 to 116 per month. An interactive web-based application is an effective strategy for the initial screening of donor candidates. The web-based application increased the ability to interface with donors, process them efficiently and ultimately increased donor self-referral at our center. © Copyright 2012 The American Society of Transplantation and the American Society of Transplant Surgeons.
Calibration of groundwater vulnerability mapping using the generalized reduced gradient method.
Elçi, Alper
2017-12-01
Groundwater vulnerability assessment studies are essential in water resources management. Overlay-and-index methods such as DRASTIC are widely used for mapping groundwater vulnerability; however, these methods mainly suffer from a subjective selection of model parameters. The objective of this study is to introduce a calibration procedure that results in a more accurate assessment of groundwater vulnerability. The improvement of the assessment is formulated as a parameter optimization problem using an objective function that is based on the correlation between actual groundwater contamination and vulnerability index values. The non-linear optimization problem is solved with the generalized-reduced-gradient (GRG) method, a numerical gradient-based optimization algorithm. To demonstrate the applicability of the procedure, a vulnerability map for the Tahtali stream basin is calibrated using nitrate concentration data. The calibration procedure is easy to implement and aims to maximize the correlation between observed pollutant concentrations and groundwater vulnerability index values. The influence of each vulnerability parameter in the calculation of the vulnerability index is assessed by performing a single-parameter sensitivity analysis. Results of the sensitivity analysis show that all factors are effective on the final vulnerability index. Calibration of the vulnerability map improves the correlation between index values and measured nitrate concentrations by 19%. The regression coefficient increases from 0.280 to 0.485. It is evident that the spatial distribution and the proportions of vulnerability class areas are significantly altered by the calibration process. Although the applicability of the calibration method is demonstrated on the DRASTIC model, the approach is not specific to a certain model and can also be easily applied to other overlay-and-index methods. Copyright © 2017 Elsevier B.V. All rights reserved.
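A minimal sketch of the calibration idea, posed as maximizing the Pearson correlation between a weighted DRASTIC-style index and observed nitrate concentrations; scipy's general-purpose SLSQP optimizer stands in for the GRG solver used in the paper, and the data are invented:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
ratings = rng.uniform(1, 10, size=(50, 7))   # 7 DRASTIC factor ratings at 50 wells
nitrate = ratings @ np.array([5, 4, 3, 2, 1, 5, 3]) + rng.normal(0, 5, 50)

def neg_correlation(weights):
    """Objective: negative Pearson r between the index and observed nitrate."""
    index = ratings @ weights
    return -np.corrcoef(index, nitrate)[0, 1]

w0 = np.full(7, 3.0)                          # uninformed starting weights
res = minimize(neg_correlation, w0, method="SLSQP",
               bounds=[(1, 5)] * 7)           # keep weights in the usual 1-5 range
print("calibrated weights:", np.round(res.x, 2))
print("correlation:", -res.fun)
```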
Calibration of groundwater vulnerability mapping using the generalized reduced gradient method
NASA Astrophysics Data System (ADS)
Elçi, Alper
2017-12-01
Groundwater vulnerability assessment studies are essential in water resources management. Overlay-and-index methods such as DRASTIC are widely used for mapping groundwater vulnerability; however, these methods mainly suffer from a subjective selection of model parameters. The objective of this study is to introduce a calibration procedure that results in a more accurate assessment of groundwater vulnerability. The improvement of the assessment is formulated as a parameter optimization problem using an objective function that is based on the correlation between actual groundwater contamination and vulnerability index values. The non-linear optimization problem is solved with the generalized-reduced-gradient (GRG) method, a numerical gradient-based optimization algorithm. To demonstrate the applicability of the procedure, a vulnerability map for the Tahtali stream basin is calibrated using nitrate concentration data. The calibration procedure is easy to implement and aims to maximize the correlation between observed pollutant concentrations and groundwater vulnerability index values. The influence of each vulnerability parameter in the calculation of the vulnerability index is assessed by performing a single-parameter sensitivity analysis. Results of the sensitivity analysis show that all factors are effective on the final vulnerability index. Calibration of the vulnerability map improves the correlation between index values and measured nitrate concentrations by 19%. The regression coefficient increases from 0.280 to 0.485. It is evident that the spatial distribution and the proportions of vulnerability class areas are significantly altered by the calibration process. Although the applicability of the calibration method is demonstrated on the DRASTIC model, the approach is not specific to a certain model and can also be easily applied to other overlay-and-index methods.
Life Cycle Project Plan Outline: Web Sites and Web-based Applications
This tool is a guideline for planning and checking for 508 compliance on web sites and web based applications. Determine which EIT components are covered or excepted, which 508 standards and requirements apply, and how to implement them.
BOWS (bioinformatics open web services) to centralize bioinformatics tools in web services.
Velloso, Henrique; Vialle, Ricardo A; Ortega, J Miguel
2015-06-02
Bioinformaticians face a range of difficulties in getting locally installed tools running and producing results; they would greatly benefit from a system that could centralize most of the tools, using an easy interface for input and output. Web services, due to their universal nature and widely known interface, constitute a very good option to achieve this goal. Bioinformatics open web services (BOWS) is a system based on generic web services produced to allow programmatic access to applications running on high-performance computing (HPC) clusters. BOWS mediates access to registered tools by providing front-end and back-end web services. Programmers can install applications in HPC clusters in any programming language and use the back-end service to check for new jobs and their parameters, and then to send the results to BOWS. Programs running on ordinary computers consume the BOWS front-end service to submit new processes and read results. BOWS compiles Java clients, which encapsulate the front-end web service requests, and automatically creates a web page that makes the registered applications and clients available. Applications registered in Bioinformatics open web services can be accessed from virtually any programming language through web services, or using the standard Java clients. The back-end can run in HPC clusters, allowing bioinformaticians to remotely run high-processing-demand applications directly from their machines.
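A minimal sketch of the back-end polling loop the abstract describes, with placeholder BOWS-style endpoints; the URLs, payload fields and tool command are assumptions for illustration only:

```python
import subprocess
import time

import requests

BACKEND = "https://bows.example.org/backend"   # placeholder back-end service base URL

def poll_and_run(tool_cmd, interval=30):
    """Repeatedly ask the back-end service for new jobs, run the local tool
    on the supplied parameters, and post the results back."""
    while True:
        job = requests.get(f"{BACKEND}/next-job", timeout=10).json()
        if job:  # assumed shape: {"id": ..., "params": [...]}
            result = subprocess.run(tool_cmd + job["params"],
                                    capture_output=True, text=True)
            requests.post(f"{BACKEND}/results/{job['id']}",
                          json={"stdout": result.stdout}, timeout=10)
        time.sleep(interval)

if __name__ == "__main__":
    poll_and_run(["blastp", "-help"])  # any locally installed HPC tool
```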
The Food Web of Potter Cove (Antarctica): complexity, structure and function
NASA Astrophysics Data System (ADS)
Marina, Tomás I.; Salinas, Vanesa; Cordone, Georgina; Campana, Gabriela; Moreira, Eugenia; Deregibus, Dolores; Torre, Luciana; Sahade, Ricardo; Tatián, Marcos; Barrera Oro, Esteban; De Troch, Marleen; Doyle, Santiago; Quartino, María Liliana; Saravia, Leonardo A.; Momo, Fernando R.
2018-01-01
Knowledge of food web structure and complexity is central to a better understanding of ecosystem functioning. A food-web approach includes both species and the energy flows among them, providing a natural framework for characterizing species' ecological roles and the mechanisms through which biodiversity influences ecosystem dynamics. Here we present for the first time a high-resolution food web for a marine ecosystem at Potter Cove (northern Antarctic Peninsula). Eleven food web properties were analyzed in order to document network complexity, structure and topology. We found a low linkage density (3.4), connectance (0.04) and omnivory percentage (45), as well as a short path length (1.8) and a low clustering coefficient (0.08). Furthermore, relating the structure of the food web to its dynamics, an exponential degree distribution (in- and out-links) was found. This suggests that the Potter Cove food web may be vulnerable if the most connected species became locally extinct. For two of the three most connected functional groups, competition overlap graphs imply high trophic interaction between demersal fish and niche specialization according to feeding strategies in amphipods. On the other hand, the prey overlap graph also shows that multiple energy pathways of carbon flux exist across benthic and pelagic habitats in the Potter Cove ecosystem. Although alternative food sources might add robustness to the web, network properties (low linkage density, connectance and omnivory) suggest fragility and potential trophic cascade effects.
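A minimal sketch of how several of the reported network properties (linkage density, connectance, clustering) can be computed from a directed predator-prey graph; the four-node web below is a toy example, not the Potter Cove data:

```python
import networkx as nx

# Toy food web: edges point from resource (prey) to consumer (predator).
web = nx.DiGraph([
    ("algae", "amphipod"), ("algae", "limpet"),
    ("amphipod", "demersal_fish"), ("limpet", "demersal_fish"),
])

S = web.number_of_nodes()   # number of trophic nodes (species)
L = web.number_of_edges()   # number of feeding links
print("linkage density L/S:", L / S)
print("connectance L/S^2:", L / S ** 2)
# Clustering computed on the undirected projection for simplicity.
print("clustering coefficient:", nx.average_clustering(web.to_undirected()))
```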
Hoelzer, Simon; Schweiger, Ralf K; Rieger, Joerg; Meyer, Michael
2006-01-01
The organizational structures of web contents and electronic information resources must adapt to the demands of a growing volume of information and user requirements. Otherwise the information society will be threatened by disinformation. The biomedical sciences are especially vulnerable in this regard, since they are strongly oriented toward text-based knowledge sources. Here sustainable improvement can only be achieved by using a comprehensive, integrated approach that not only includes data management but also specifically incorporates the editorial processes, including structuring information sources and publication. The technical resources needed to effectively master these tasks are already available in the form of the data standards and tools of the Semantic Web. They include Rich Site Summaries (RSS), which have become an established means of distributing and syndicating conventional news messages and blogs. They can also provide access to the contents of the previously mentioned information sources, which are conventionally classified as 'deep web' content.
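A minimal sketch of consuming an RSS feed programmatically, the syndication mechanism the authors point to for structured access to such content; the feed URL is a placeholder and feedparser is a third-party library assumed to be installed:

```python
import feedparser

# Placeholder feed URL standing in for a structured biomedical information source.
feed = feedparser.parse("https://example.org/biomedical-updates.rss")

for entry in feed.entries[:5]:
    # Each RSS item carries at least a title and a link to the full resource.
    print(entry.title, "->", entry.link)
```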
Woodward, Guy; Brown, Lee E.; Edwards, Francois K.; Hudson, Lawrence N.; Milner, Alexander M.; Reuman, Daniel C.; Ledger, Mark E.
2012-01-01
Experimental data from intergenerational field manipulations of entire food webs are scarce, yet such approaches are essential for gauging impacts of environmental change in natural systems. We imposed 2 years of intermittent drought on stream channels in a replicated field trial, to measure food web responses to simulated climate change. Drought triggered widespread losses of species and links, with larger taxa and those that were rare for their size, many of which were predatory, being especially vulnerable. Many network properties, including size–scaling relationships within food chains, changed in response to drought. Other properties, such as connectance, were unaffected. These findings highlight the need for detailed experimental data from different organizational levels, from pairwise links to the entire food web. The loss of not only large species, but also those that were rare for their size, provides a newly refined way to gauge likely impacts that may be applied more generally to other systems and/or impacts. PMID:23007087
WebDB Component Builder - Lessons Learned
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macedo, C.
2000-02-15
Oracle WebDB is the easiest way to produce web-enabled lightweight and enterprise-centric applications. This concept from Oracle has tantalized our taste for simplistic web development by using a purely web based tool that lives nowhere else but in the database. Online wizards, templates, and query builders, which produce PL/SQL behind the curtains, can be used straight "out of the box" by both novice and seasoned developers. This presentation will introduce lessons learned by developing and deploying applications built using the WebDB Component Builder in conjunction with custom PL/SQL code to empower a hybrid application. There are two kinds of WebDB components: those that display data to end users via reporting, and those that let end users update data in the database via entry forms. The presentation will also discuss various methods within the Component Builder to enhance the applications pushed to the desktop. The demonstrated example is an application entitled HOME (Helping Others More Effectively) that was built to manage a yearly United Way Campaign effort. Our task was to build an end-to-end application which could manage approximately 900 non-profit agencies, an average of 4,100 individual contributions, and $1.2 million. Using WebDB, the shell of the application was put together in a matter of a few weeks. However, we did encounter some hurdles that WebDB, in its stage of infancy (v2.0), could not solve for us directly. Together with custom PL/SQL, WebDB's Component Builder became a powerful tool that enabled us to produce a very flexible hybrid application.
Graph Theory Approach for Studying Food Webs
NASA Astrophysics Data System (ADS)
Longjas, A.; Tejedor, A.; Foufoula-Georgiou, E.
2017-12-01
Food webs are complex networks of feeding interactions among species in ecological communities. Metrics describing food web structure have been proposed to compare and classify food webs ranging from food chain length, connectance, degree distribution, centrality measures, to the presence of motifs (distinct compartments), among others. However, formal methodologies for studying both food web topology and the dynamic processes operating on them are still lacking. Here, we utilize a quantitative framework using graph theory within which a food web is represented by a directed graph, i.e., a collection of vertices (species or trophic species defined as sets of species sharing the same predators and prey) and directed edges (predation links). This framework allows us to identify apex (environmental "source" node) to outlet (top predators) subnetworks and compute the steady-state flux (e.g., carbon, nutrients, energy etc.) in the food web. We use this framework to (1) construct vulnerability maps that quantify the relative change of flux delivery to the top predators in response to perturbations in prey species (2) identify keystone species, whose loss would precipitate further species extinction, and (3) introduce a suite of graph-theoretic metrics to quantify the topologic (imposed by food web connectivity) and dynamic (dictated by the flux partitioning and distribution) components of a food web's complexity. By projecting food webs into a 2D Topodynamic Complexity Space whose coordinates are given by Number of alternative paths (topologic) and Leakage Index (dynamic), we show that this space provides a basis for food web comparison and provide physical insights into their dynamic behavior.
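A minimal sketch of two of the graph-theoretic quantities described above (alternative apex-to-outlet paths, and a simple keystone screen by node removal), using networkx on a toy web; it illustrates the general idea rather than the authors' metric definitions:

```python
import networkx as nx

# Toy directed food web: edges follow the flux from the basal source to predators.
web = nx.DiGraph([
    ("source", "plankton"), ("source", "algae"),
    ("plankton", "krill"), ("algae", "krill"),
    ("krill", "penguin"), ("plankton", "penguin"),
])

# Topologic component: number of alternative paths from the apex (source)
# to a top predator (outlet).
paths = list(nx.all_simple_paths(web, "source", "penguin"))
print("alternative paths:", len(paths))

# Simple keystone screen: remove each intermediate node and check whether
# the top predator becomes disconnected from the source.
for node in ["plankton", "algae", "krill"]:
    reduced = web.copy()
    reduced.remove_node(node)
    reachable = nx.has_path(reduced, "source", "penguin")
    print(f"remove {node}: top predator still supported -> {reachable}")
```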
Espie, Colin A.; Kyle, Simon D.; Williams, Chris; Ong, Jason C.; Douglas, Neil J.; Hames, Peter; Brown, June S.L.
2012-01-01
Study Objectives: The internet provides a pervasive milieu for healthcare delivery. The purpose of this study was to determine the effectiveness of a novel web-based cognitive behavioral therapy (CBT) course delivered by an automated virtual therapist, when compared with a credible placebo; an approach required because web products may be intrinsically engaging, and vulnerable to placebo response. Design: Randomized, placebo-controlled trial comprising 3 arms: CBT, imagery relief therapy (IRT: placebo), treatment as usual (TAU). Setting: Online community of participants in the UK. Participants: One hundred sixty-four adults (120 F: [mean age 49y (18-78y)] meeting proposed DSM-5 criteria for Insomnia Disorder, randomly assigned to CBT (n = 55; 40 F), IRT placebo (n = 55; 42 F) or TAU (n = 54; 38 F). Interventions: CBT and IRT each comprised 6 online sessions delivered by an animated personal therapist, with automated web and email support. Participants also had access to a video library/back catalogue of session content and Wikipedia style articles. Online CBT users had access to a moderated social network/community of users. TAU comprised no restrictions on usual care and access to an online sleep diary. Measurements and Results: Major assessments at baseline, post-treatment, and at follow-up 8 weeks post-treatment; outcomes appraised by online sleep diaries and clinical status. On the primary endpoint of sleep efficiency (SE; total time asleep expressed as a percentage of the total time spent in bed), online CBT was associated with sustained improvement at post-treatment (+20%) relative to both TAU (+6%; d = 0.95) and IRT (+6%: d = 1.06), and at 8 weeks (+20%) relative to IRT (+7%: d = 1.00) and TAU (+9%: d = 0.69). These findings were mirrored across a range of sleep diary measures. Clinical benefits of CBT were evidenced by modest superiority over placebo on daytime outcomes (d = 0.23-0.37) and by substantially improved sleep-wake functioning on the Sleep Condition Indicator (range of d = 0.77-1.20). Three-quarters of CBT participants (76% [CBT] vs. 29% [IRT] and 18% [TAU]) completed treatment with SE > 80%, more than half (55% [CBT] vs. 17% [IRT] and 8% [TAU]) with SE > 85%, and over one-third (38% [CBT] vs. 6% [IRT] and 0% [TAU]) with SE > 90%; these improvements were largely maintained during follow-up. Conclusion: CBT delivered using a media-rich web application with automated support and a community forum is effective in improving the sleep and associated daytime functioning of adults with insomnia disorder. Clinical Trial Registration: ISRCTN – 44615689. Citation: Espie CA; Kyle SD; Williams C; Ong JC; Douglas NJ; Hames P; Brown JSL. A randomized, placebo-controlled trial of online cognitive behavioral therapy for chronic insomnia disorder delivered via an automated media-rich web application. SLEEP 2012;35(6):769-781. PMID:22654196
Flood AI: An Intelligent Systems for Discovery and Communication of Disaster Knowledge
NASA Astrophysics Data System (ADS)
Demir, I.; Sermet, M. Y.
2017-12-01
Communities are not immune to extreme events or natural disasters that can lead to large-scale consequences for the nation and the public. Improving resilience to better prepare for, plan for, recover from, and adapt to disasters is critical to reduce the impacts of extreme events. The National Research Council (NRC) report discusses how to increase resilience to extreme events through a vision of a resilient nation in the year 2030. The report highlights the importance of data and information, identifies gaps and knowledge challenges that need to be addressed, and suggests that every individual have access to risk and vulnerability information to make their communities more resilient. This project presents Flood AI, an intelligent system for flooding designed to improve societal preparedness by providing a knowledge engine that uses voice recognition, artificial intelligence, and natural language processing, based on a generalized ontology for disasters with a primary focus on flooding. The knowledge engine utilizes the flood ontology and concepts to connect user input to relevant knowledge discovery channels on flooding by means of a data acquisition and processing framework utilizing environmental observations, forecast models, and knowledge bases. Communication channels of the framework include web-based systems, agent-based chat bots, smartphone applications, automated web workflows, and smart home devices, opening knowledge discovery for flooding to many unique use cases.
Web 2.0 for Health Promotion: Reviewing the Current Evidence
Prestin, Abby; Lyons, Claire
2013-01-01
As Web 2.0 and social media make the communication landscape increasingly participatory, empirical evidence is needed regarding their impact on and utility for health promotion. Following Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, we searched 4 medical and social science databases for literature (2004–present) on the intersection of Web 2.0 and health. A total of 514 unique publications matched our criteria. We classified references as commentaries and reviews (n = 267), descriptive studies (n = 213), and pilot intervention studies (n = 34). The scarcity of empirical evidence points to the need for more interventions with participatory and user-generated features. Innovative study designs and measurement methods are needed to understand the communication landscape and to critically assess intervention effectiveness. To address health disparities, interventions must consider accessibility for vulnerable populations. PMID:23153164
DOE Office of Scientific and Technical Information (OSTI.GOV)
Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.
2008-05-04
This work presents the ScalaBLAST Web Application (SWA), a web-based application implemented using the PHP scripting language, the MySQL DBMS, and the Apache web server under a GNU/Linux platform. SWA is an application built as part of the Data Intensive Computer for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology such as ontology-based homology and multiple whole-genome comparisons which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web-based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel processing job on a dedicated cluster.
Open Marketplace for Simulation Software on the Basis of a Web Platform
NASA Astrophysics Data System (ADS)
Kryukov, A. P.; Demichev, A. P.
2016-02-01
The focus in the development of a new generation of middleware is shifting from global grid systems to building convenient and efficient web platforms for remote access to individual computing resources. A further line of their development, suggested in this work, is related not only to a quantitative increase in their number and to the expansion of the scientific, engineering, and manufacturing areas in which they are used, but also to improved technology for the remote deployment of application software on the resources interacting with the web platforms. Currently, services for providers of application software in the context of scientifically oriented web platforms are not sufficiently developed. The new web platforms for an application software marketplace proposed in this work should have all the features of the existing web platforms for submission of jobs to remote resources, plus the provision of specific web services for interaction on market principles between the providers and consumers of application packages. The suggested approach will be validated on the example of simulation applications in the field of nonlinear optics.
Critical Thinking of Young Citizens towards News Headlines in Chile
ERIC Educational Resources Information Center
Vernier, Matthieu; Cárcamo, Luis; Scheihing, Eliana
2018-01-01
Strengthening critical thinking abilities of citizens in the face of news published on the web represents a key challenge for education. Young citizens appear to be vulnerable in the face of poor quality news or those containing non-explicit ideologies. In the field of data science, computational and statistical techniques have been developed to…
Y2K Resources for Public Libraries.
ERIC Educational Resources Information Center
Foster, Janet
1999-01-01
Presents information for public libraries on computer-related vulnerabilities as the century turns from 1999 to 2000. Highlights include: general Y2K information; the Y2K Bug and PCs; Y2K sites for librarians; Online Computer Library Center (OCLC) and USMARC; technological developments in cyberspace; and a list of Web sites and Y2K resources. (AEF)
ERIC Educational Resources Information Center
Benotsch, Eric G.; Kalichman, Seth; Weinhardt, Lance S.
2004-01-01
Access to health information on the Internet has revolutionized how medical patients learn about their illnesses. Valuable information can be found online; however, many health Web sites contain inaccurate or misleading information. The authors surveyed 324 adults with HIV concerning their Internet use for obtaining health information. Health…
Toward Automating Web Protocol Configuration for a Programmable Logic Controller Emulator
2014-06-19
"Security Risks for Industrial Control Systems," VDE 2004 Congress, Berlin, Germany, October 2004, pp. 1-7. [Cis12] Cisco, NetFlow Configuration Guide ... AFIT-ENG-T-14-J-4 ... Industrial Control Systems (ICS) remain vulnerable through attack vectors that exist within programmable ...
Moraitou, Marina; Pateli, Adamantia; Fotiou, Sotiris
2017-01-01
As access to health care is important to people's health, especially for vulnerable groups that need nursing for a long period of time, new studies in the human sciences argue that the health of the population depends less on the quality of health care, or on the amount of spending that goes into health care, and more on the quality of everyday life. Smart home applications are designed to "sense" and monitor the health conditions of their residents through the use of a wide range of technological components (motion sensors, video cameras, wearable devices etc.), and web-based services that support their wish to stay at home. In this work, we provide a review of the main technological, psychosocial/ethical and economic challenges that the implementation of a Smart Health Caring Home raises.
Nixdorf, Erik; Sun, Yuanyuan; Lin, Mao; Kolditz, Olaf
2017-12-15
The main objective of this study is to quantify the groundwater contamination risk of the Songhua River Basin by applying a novel approach that integrates public datasets, web services and numerical modelling techniques. To our knowledge, this study is the first to establish groundwater risk maps for the entire Songhua River Basin, one of the largest and most contamination-endangered river basins in China. Index-based groundwater risk maps were created with GIS tools at a spatial resolution of 30 arc sec by combining the results of groundwater vulnerability and hazard assessment. Groundwater vulnerability was evaluated using the DRASTIC index method based on public datasets at the highest available resolution, in combination with numerical groundwater modelling. As a novel approach to overcome data scarcity at large scales, a data query based on a web mapping service was applied to obtain an inventory of potential hazardous sites within the basin. The groundwater risk assessment demonstrated that <1% of the Songhua River Basin is at high or very high contamination risk. These areas were mainly located in the vast plain areas, with hotspots particularly in the Changchun metropolitan area. Moreover, groundwater levels and pollution point sources were found to have a significantly larger impact in assessing these areas than originally assumed by the index scheme. Moderate contamination risk was assigned to 27% of the aquifers, predominantly associated with less densely populated agricultural areas. However, the majority of the aquifer area in the sparsely populated mountain ranges displayed low groundwater contamination risk. Sensitivity analysis demonstrated that this novel method is valid for regional assessments of groundwater contamination risk. Despite limitations in resolution and input data consistency, the obtained groundwater contamination risk maps will be beneficial for regional and local decision-making processes with regard to groundwater protection measures, particularly if other data availability is limited. Copyright © 2017 Elsevier B.V. All rights reserved.
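A minimal sketch of the final combination step described above, in which a normalized vulnerability grid and a hazard grid are combined cell by cell and classified into risk categories; the arrays, the multiplicative combination and the class breaks are illustrative assumptions, not the study's parameters:

```python
import numpy as np

# Toy normalized (0-1) grids at matching resolution.
vulnerability = np.array([[0.9, 0.3], [0.6, 0.1]])   # e.g. rescaled DRASTIC index
hazard = np.array([[0.8, 0.2], [0.7, 0.4]])          # e.g. density of hazardous sites

risk_index = vulnerability * hazard                    # cell-wise combination
classes = np.digitize(risk_index, [0.1, 0.3, 0.6])     # assumed class breaks
labels = np.array(["low", "moderate", "high", "very high"])
print(labels[classes])
```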
Identifying typical patterns of vulnerability: A 5-step approach based on cluster analysis
NASA Astrophysics Data System (ADS)
Sietz, Diana; Lüdeke, Matthias; Kok, Marcel; Lucas, Paul; Walther, Carsten; Janssen, Peter
2013-04-01
Specific processes that shape the vulnerability of socio-ecological systems to climate, market and other stresses derive from diverse background conditions. Within the multitude of vulnerability-creating mechanisms, distinct processes recur in various regions, inspiring research on typical patterns of vulnerability. The vulnerability patterns display typical combinations of the natural and socio-economic properties that shape a system's vulnerability to particular stresses. Based on the identification of a limited number of vulnerability patterns, pattern analysis provides an efficient approach to improving our understanding of vulnerability and decision-making for vulnerability reduction. However, current pattern analyses often miss explicit descriptions of their methods and pay insufficient attention to the validity of their groupings. Therefore, the question arises as to how we can identify typical vulnerability patterns in order to enhance our understanding of a system's vulnerability to stresses. A cluster-based pattern recognition applied at global and local levels is scrutinised with a focus on an applicable methodology and practicable insights. Taking the example of drylands, this presentation demonstrates the conditions necessary to identify typical vulnerability patterns. They are summarised in five methodological steps comprising the elicitation of relevant cause-effect hypotheses and the quantitative indication of mechanisms, as well as an evaluation of robustness, a validation and a ranking of the identified patterns. Reflecting scale-dependent opportunities, a global study is able to support decision-making with insights into the up-scaling of interventions when available funds are limited. In contrast, local investigations encourage an outcome-based validation. This constitutes a crucial step in establishing the credibility of the patterns and hence their suitability for informing extension services and individual decisions. In this respect, working at the local level provides a clear advantage since, to a large extent, limitations in globally available observational data constrain such a validation on the global scale. Overall, the five steps are outlined in detail in order to facilitate and motivate the application of pattern recognition in other research studies concerned with vulnerability analysis, including future applications to different vulnerability frameworks. Such applications could promote the refinement of mechanisms in specific contexts and advance methodological adjustments. This would further increase the value of identifying typical patterns in the properties of socio-ecological systems for an improved understanding and management of the relation between these systems and particular stresses.
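A minimal sketch of the cluster-based pattern recognition the abstract describes, using k-means on standardized indicator data; the indicators, the number of clusters and the data are invented for illustration:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# Rows: dryland regions; columns: illustrative indicators
# (e.g. water scarcity, soil degradation, income, market access).
indicators = rng.random((200, 4))

X = StandardScaler().fit_transform(indicators)      # put indicators on comparable scales
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

# Each cluster centre is one candidate "vulnerability pattern": a typical
# combination of indicator values shared by the regions assigned to it.
print(np.round(kmeans.cluster_centers_, 2))
print("regions per pattern:", np.bincount(kmeans.labels_))
```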
Demonstration of the Web-based Interspecies Correlation Estimation (Web-ICE) modeling application
The Web-based Interspecies Correlation Estimation (Web-ICE) modeling application is available to the risk assessment community through a user-friendly internet platform (http://epa.gov/ceampubl/fchain/webice/). ICE models are log-linear least-squares regressions that predict acute...
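A minimal sketch of what an interspecies correlation model of this log-linear form does: regress log10 acute toxicity for a predicted species on log10 toxicity for a surrogate species and use the fit to extrapolate; the toxicity values below are fabricated for illustration:

```python
import numpy as np
from scipy.stats import linregress

# Fabricated paired acute toxicity values (e.g. LC50 in mg/L) for a surrogate
# and a predicted species across several chemicals.
surrogate = np.array([0.5, 1.2, 3.4, 10.0, 25.0, 80.0])
predicted = np.array([0.8, 1.5, 5.0, 14.0, 40.0, 120.0])

# Log-linear least-squares fit: log10(predicted) = a + b * log10(surrogate).
fit = linregress(np.log10(surrogate), np.log10(predicted))
print(f"slope={fit.slope:.2f}, intercept={fit.intercept:.2f}, r^2={fit.rvalue**2:.2f}")

# Extrapolate toxicity for the predicted species from a new surrogate measurement.
new_surrogate = 5.0
estimate = 10 ** (fit.intercept + fit.slope * np.log10(new_surrogate))
print(f"estimated acute toxicity: {estimate:.2f} mg/L")
```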
NASA Technical Reports Server (NTRS)
Goseva-Popstojanova, Katerina; Tyo, Jacob
2017-01-01
While some prior research work exists on characteristics of software faults (i.e., bugs) and failures, very little work has been published on analysis of software application vulnerabilities. This paper aims to contribute towards filling that gap by presenting an empirical investigation of application vulnerabilities. The results are based on data extracted from issue tracking systems of two NASA missions. These data were organized in three datasets: Ground mission IVV issues, Flight mission IVV issues, and Flight mission Developers issues. In each dataset, we identified security related software bugs and classified them in specific vulnerability classes. Then, we created the security vulnerability profiles, i.e., determined where and when the security vulnerabilities were introduced and what the dominant vulnerability classes were. Our main findings include: (1) In the IVV issues datasets, the majority of vulnerabilities were code related and were introduced in the Implementation phase. (2) For all datasets, around 90% of the vulnerabilities were located in two to four subsystems. (3) Out of 21 primary classes, five dominated: Exception Management, Memory Access, Other, Risky Values, and Unused Entities. Together, they contributed from 80% to 90% of the vulnerabilities in each dataset.
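A vulnerability profile of the kind described here is, at its simplest, a classification of security-related issues followed by a tally of classes, subsystems and phases. The sketch below illustrates that bookkeeping only; the keyword rules and issue records are hypothetical and are not the paper's classification scheme.

```python
# Minimal sketch of building a security "vulnerability profile" from issue-tracker
# records: classify each security-related issue, then tally classes, subsystems
# and phases. Keyword rules and records below are hypothetical.
from collections import Counter

CLASS_KEYWORDS = {
    "Exception Management": ["unhandled exception", "uncaught"],
    "Memory Access": ["buffer overflow", "null pointer", "out of bounds"],
    "Risky Values": ["integer overflow", "unvalidated input"],
}

issues = [
    {"subsystem": "telemetry", "phase": "Implementation", "text": "uncaught exception on malformed packet"},
    {"subsystem": "command",   "phase": "Implementation", "text": "buffer overflow in parser"},
    {"subsystem": "telemetry", "phase": "Design",         "text": "unvalidated input accepted from ground"},
]

def classify(text):
    for cls, keywords in CLASS_KEYWORDS.items():
        if any(k in text.lower() for k in keywords):
            return cls
    return "Other"

classes = Counter(classify(i["text"]) for i in issues)
subsystems = Counter(i["subsystem"] for i in issues)
phases = Counter(i["phase"] for i in issues)

print("dominant classes:", classes.most_common())
print("where introduced:", subsystems.most_common())
print("when introduced:", phases.most_common())
```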
New and Improved Version of the ASDC MOPITT Search and Subset Web Application
Atmospheric Science Data Center
2016-07-06
A new and improved version of the ASDC MOPITT Search and Subset Web Application has been released (announced Friday, June 24, 2016). New features include: Versions 5 and 6 ...
Ludovici, Alessandro; Calveras, Anna
2015-01-01
In this paper, we present the design of a Constrained Application Protocol (CoAP) proxy able to interconnect Web applications based on Hypertext Transfer Protocol (HTTP) and WebSocket with CoAP based Wireless Sensor Networks. Sensor networks are commonly used to monitor and control physical objects or environments. Smart Cities represent applications of such a nature. Wireless Sensor Networks gather data from their surroundings and send them to a remote application. This data flow may be short or long lived. The traditional HTTP long-polling used by Web applications may not be adequate in long-term communications. To overcome this problem, we include the WebSocket protocol in the design of the CoAP proxy. We evaluate the performance of the CoAP proxy in terms of latency and memory consumption. The tests consider long and short-lived communications. In both cases, we evaluate the performance obtained by the CoAP proxy according to the use of WebSocket and HTTP long-polling. PMID:25585107
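The latency argument against HTTP long-polling for long-lived sensor streams can be illustrated without any CoAP or WebSocket code at all: the asyncio sketch below contrasts a consumer that is pushed each new reading with one that re-polls on a fixed interval. It is a conceptual illustration only, not the proxy described in the paper.

```python
# Minimal asyncio sketch contrasting push-style delivery (a consumer awaiting new
# sensor readings) with interval-based polling. This only illustrates why
# long-lived streams favour push (e.g. WebSocket) over HTTP long-polling.
import asyncio, time

async def sensor(queue, latest, n=5, period=1.0):
    for i in range(n):
        await asyncio.sleep(period)
        reading = (time.monotonic(), i)
        latest["value"] = reading      # visible to the polling consumer
        await queue.put(reading)       # pushed to the push consumer

async def push_consumer(queue, n=5):
    for _ in range(n):
        produced, i = await queue.get()
        print(f"push reading {i}: latency {time.monotonic() - produced:.3f}s")

async def polling_consumer(latest, n=5, interval=0.4):
    seen = -1
    while seen < n - 1:
        await asyncio.sleep(interval)  # the poll period adds delay
        if latest["value"] and latest["value"][1] > seen:
            produced, seen = latest["value"]
            print(f"poll reading {seen}: latency {time.monotonic() - produced:.3f}s")

async def main():
    queue, latest = asyncio.Queue(), {"value": None}
    await asyncio.gather(sensor(queue, latest), push_consumer(queue), polling_consumer(latest))

asyncio.run(main())
```

The pushed consumer sees readings almost immediately, while the poller's latency grows with the polling interval, which is the motivation for adding WebSocket support to the proxy.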
Review of Extracting Information From the Social Web for Health Personalization
Karlsen, Randi; Bonander, Jason
2011-01-01
In recent years the Web has come into its own as a social platform where health consumers are actively creating and consuming Web content. Moreover, as the Web matures, consumers are gaining access to personalized applications adapted to their health needs and interests. The creation of personalized Web applications relies on extracted information about the users and the content to personalize. The Social Web itself provides many sources of information that can be used to extract information for personalization apart from traditional Web forms and questionnaires. This paper provides a review of different approaches for extracting information from the Social Web for health personalization. We reviewed research literature across different fields addressing the disclosure of health information in the Social Web, techniques to extract that information, and examples of personalized health applications. In addition, the paper includes a discussion of technical and socioethical challenges related to the extraction of information for health personalization. PMID:21278049
Exploring the Role of Usability in the Software Process: A Study of Irish Software SMEs
NASA Astrophysics Data System (ADS)
O'Connor, Rory V.
This paper explores the software processes and usability techniques used by Small and Medium Enterprises (SMEs) that develop web applications. The significance of this research is that it looks at development processes used by SMEs in order to assess to what degree usability is integrated into the process. This study seeks to gain an understanding of the level of awareness of usability within SMEs today and their commitment to usability in practice. The motivation for this research is to explore the current development processes used by SMEs in developing web applications and to understand how usability is represented in those processes. The background for this research is provided by the growth of the web application industry beyond informational web sites to more sophisticated applications delivering a broad range of functionality. This paper presents an analysis of the practices of several Irish SMEs that develop web applications through a series of case studies, with the focus on SMEs that develop web applications as Management Information Systems rather than e-commerce sites, informational sites, online communities or web portals. This study gathered data about the usability techniques practiced by these companies and their awareness of usability in the context of the software process in those SMEs. The contribution of this study is to further the understanding of the current role of usability within the software development processes of SMEs that develop web applications.
Network-Based Learning and Assessment Applications on the Semantic Web
ERIC Educational Resources Information Center
Gibson, David
2005-01-01
Today's Web applications are already "aware" of the network of computers and data on the Internet, in the sense that they perceive, remember, and represent knowledge external to themselves. However, Web applications are generally not able to respond to the meaning and context of the information in their memories. As a result, most applications are…
ERIC Educational Resources Information Center
Medina-Dominguez, Fuensanta; Sanchez-Segura, Maria-Isabel; Mora-Soto, Arturo; Amescua, Antonio
2010-01-01
The development of collaborative Web applications does not follow a software engineering methodology. This is because when university students study Web applications in general, and collaborative Web portals in particular, they are not being trained in the use of software engineering techniques to develop collaborative Web portals. This paper…
Accountable Information Flow for Java-Based Web Applications
2010-01-01
[Figure 2: The Swift architecture — runtime library, Swift server runtime, Java servlet framework, HTTP, Web server, Web browser.] ... On the server, the Java application code links against Swift's server-side run-time library, which in turn sits on top of the standard Java servlet ... (AFRL-RI-RS-TR-2010-9, Final Technical Report, January 2010: Accountable Information Flow for Java-Based Web Applications.)
Teaching Web Security Using Portable Virtual Labs
ERIC Educational Resources Information Center
Chen, Li-Chiou; Tao, Lixin
2012-01-01
We have developed a tool called Secure WEb dEvelopment Teaching (SWEET) to introduce security concepts and practices for web application development. This tool provides introductory tutorials, teaching modules utilizing virtualized hands-on exercises, and project ideas in web application security. In addition, the tool provides pre-configured…
Web-enabling technologies for the factory floor: a web-enabling strategy for emanufacturing
NASA Astrophysics Data System (ADS)
Velez, Ricardo; Lastra, Jose L. M.; Tuokko, Reijo O.
2001-10-01
This paper is intended to address the different technologies available for Web-enabling of the factory floor. It will give an overview of the importance of Web-enabling of the factory floor, in the application of the concepts of flexible and intelligent manufacturing, in conjunction with e-commerce. As a last section, it will try to define a Web-enabling strategy for the application in eManufacturing. This is made under the scope of the electronics manufacturing industry, so every application, technology or related matter is presented under such scope.
NASA Technical Reports Server (NTRS)
Laakso, J. H.; Straayer, J. W.
1974-01-01
A final program summary is reported for test and evaluation activities that were conducted for space shuttle web selection. Large scale advanced composite shear web components were tested and analyzed to evaluate application of advanced composite shear web construction to a space shuttle orbiter thrust structure. The shear web design concept consisted of a titanium-clad + or - 45 deg boron/epoxy web laminate stiffened with vertical boron-epoxy reinforced aluminum stiffeners and longitudinal aluminum stiffening. The design concept was evaluated to be efficient and practical for the application that was studied. Because of the effects of buckling deflections, a requirement is identified for shear buckling resistant design to maximize the efficiency of highly-loaded advanced composite shear webs.
IsoWeb: A Bayesian Isotope Mixing Model for Diet Analysis of the Whole Food Web
Kadoya, Taku; Osada, Yutaka; Takimoto, Gaku
2012-01-01
Quantitative description of food webs provides fundamental information for the understanding of population, community, and ecosystem dynamics. Recently, stable isotope mixing models have been widely used to quantify dietary proportions of different food resources to a focal consumer. Here we propose a novel mixing model (IsoWeb) that estimates diet proportions of all consumers in a food web based on stable isotope information. IsoWeb requires a topological description of a food web, and stable isotope signatures of all consumers and resources in the web. A merit of IsoWeb is that it takes into account variation in trophic enrichment factors among different consumer-resource links. Sensitivity analysis using realistic hypothetical food webs suggests that IsoWeb is applicable to a wide variety of food webs differing in the number of species, connectance, sample size, and data variability. Sensitivity analysis based on real topological webs showed that IsoWeb can allow for a certain level of topological uncertainty in target food webs, including erroneously assuming false links, omission of existent links and species, and trophic aggregation into trophospecies. Moreover, using an illustrative application to a real food web, we demonstrated that IsoWeb can compare the plausibility of different candidate topologies for a focal web. These results suggest that IsoWeb provides a powerful tool to analyze food-web structure from stable isotope data. We provide R and BUGS codes to aid efficient applications of IsoWeb. PMID:22848427
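The mass-balance idea behind any isotope mixing model can be shown in a few lines. The sketch below solves the simplest two-source, single-isotope case with a fixed trophic enrichment factor; it is not the Bayesian IsoWeb model (which is distributed as R and BUGS code), and all δ15N values are hypothetical.

```python
# Minimal two-source, single-isotope mixing sketch (not the Bayesian IsoWeb model).
# Mass balance: delta_consumer - TEF = p * delta_source1 + (1 - p) * delta_source2,
# solved for the diet proportion p of source 1. Values below are hypothetical.

def diet_proportion(delta_consumer, delta_source1, delta_source2, tef):
    corrected = delta_consumer - tef            # remove trophic enrichment
    p = (corrected - delta_source2) / (delta_source1 - delta_source2)
    return min(max(p, 0.0), 1.0)                # clamp to a valid proportion

d15n_consumer = 9.5   # consumer signature (per mil)
d15n_algae = 4.0      # source 1
d15n_detritus = 7.0   # source 2
tef = 3.0             # trophic enrichment factor per trophic step

p = diet_proportion(d15n_consumer, d15n_algae, d15n_detritus, tef)
print(f"estimated diet: {p:.0%} algae, {1 - p:.0%} detritus")
```

IsoWeb generalises this to every consumer-resource link in a food web at once and lets the enrichment factors vary by link, which is why a Bayesian treatment is needed.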
Katherine Fleming, Database and Web Applications Engineer: ... and web application development in the Commercial Buildings Research group. Her projects include ... Katherine was pursuing a Ph.D. with a focus on robotics and working as a Web developer and Web accessibility ...
Web-Based Intelligent E-Learning Systems: Technologies and Applications
ERIC Educational Resources Information Center
Ma, Zongmin
2006-01-01
Collecting and presenting the latest research and development results from the leading researchers in the field of e-learning systems, Web-Based Intelligent E-Learning Systems: Technologies and Applications provides a single record of current research and practical applications in Web-based intelligent e-learning systems. This book includes major…
The design and implementation of web mining in web sites security
NASA Astrophysics Data System (ADS)
Li, Jian; Zhang, Guo-Yin; Gu, Guo-Chang; Li, Jian-Li
2003-06-01
Backdoors and information leaks in Web servers can be detected by applying Web mining techniques to abnormal Web log and Web application log data, so that the security of Web servers can be enhanced and the damage of illegal access avoided. First, a system for discovering patterns of information leakage in CGI scripts from Web log data is proposed. Second, these patterns are provided to system administrators so that they can modify their code and enhance Web site security. The following aspects are described: one is to combine the Web application log with the Web log to extract more information, so that Web data mining can discover information that firewalls and intrusion detection systems cannot find; another is to propose an operational module for the Web site to enhance its security. In the cluster-server session, a density-based clustering technique is used to reduce resource cost and obtain better efficiency.
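As a rough illustration of the density-based clustering step mentioned above, the sketch below applies scikit-learn's DBSCAN to a couple of hypothetical per-session log features and flags low-density sessions (noise points) for review; the feature choice and thresholds are assumptions, not the authors' design.

```python
# Minimal sketch of density-based clustering of per-session web-log features,
# flagging low-density sessions (DBSCAN noise points) as candidates for review.
# The feature choice and values are hypothetical.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

# One row per session: [requests per minute, fraction of 4xx/5xx responses]
sessions = np.array([
    [12, 0.02], [10, 0.01], [14, 0.03], [11, 0.02], [13, 0.01],   # normal browsing
    [180, 0.65],                                                  # scanner-like outlier
    [9, 0.00], [15, 0.04], [12, 0.02],
])

X = StandardScaler().fit_transform(sessions)
labels = DBSCAN(eps=0.8, min_samples=3).fit_predict(X)

for row, label in zip(sessions, labels):
    tag = "REVIEW (noise)" if label == -1 else f"cluster {label}"
    print(f"rate={row[0]:6.1f}/min  errors={row[1]:.2f}  -> {tag}")
```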
MedlinePlus Connect: Technical Information
MedlinePlus Connect offers two implementation options. Web Application: how does it work? Responds to requests ... (examples of MedlinePlus Connect Web Application response pages are provided). Web Service: how does it work? Responds to requests ...
ERIC Educational Resources Information Center
Pelham, Anabel, Ed.; Sills, Elizabeth, Ed.; Eisman, Gerald S., Ed.
2010-01-01
Starting from the premise that the health status, vulnerability to accidents and disease, and life spans are determined by the organization, delivery, and financing (or lack thereof) of health care, this book explores how educators and community caretakers teach the complex web of inter-connection between the micro level of individual health and…
Testing in Service-Oriented Environments
2010-03-01
software releases (versions, service packs, vulnerability patches) for one common ESB during the 13-month period from January 1, 2008 through ... impact on quality of service: unlike traditional software components, a single instance of a web service can be used by multiple consumers. Since the ... distributed, with heterogeneous hardware and software (SOA infrastructure, services, operating systems, and databases). Because of cost and security, it ...
Cybersecurity Workforce Development and the Protection of Critical Infrastructure
2017-03-31
communications products, and limited travel for site visits and conferencing. The CSCC contains a developed web-based coordination site, computer ... the CSCC. The Best Practices Analyst position maintains a list of best practices, computer-related patches, and standard operating procedures (SOP) ... involved in conducting vulnerability assessments of computer networks. To adequately exercise and experiment with industry standard software, it was ...
Error and attack tolerance of complex networks
NASA Astrophysics Data System (ADS)
Albert, Réka; Jeong, Hawoong; Barabási, Albert-László
2000-07-01
Many complex systems display a surprising degree of tolerance against errors. For example, relatively simple organisms grow, persist and reproduce despite drastic pharmaceutical or environmental interventions, an error tolerance attributed to the robustness of the underlying metabolic network. Complex communication networks display a surprising degree of robustness: although key components regularly malfunction, local failures rarely lead to the loss of the global information-carrying ability of the network. The stability of these and other complex systems is often attributed to the redundant wiring of the functional web defined by the systems' components. Here we demonstrate that error tolerance is not shared by all redundant systems: it is displayed only by a class of inhomogeneously wired networks, called scale-free networks, which include the World-Wide Web, the Internet, social networks and cells. We find that such networks display an unexpected degree of robustness, the ability of their nodes to communicate being unaffected even by unrealistically high failure rates. However, error tolerance comes at a high price in that these networks are extremely vulnerable to attacks (that is, to the selection and removal of a few nodes that play a vital role in maintaining the network's connectivity). Such error tolerance and attack vulnerability are generic properties of communication networks.
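The random-failure versus targeted-attack contrast described here can be reproduced in a few lines on a synthetic scale-free graph. The sketch below uses networkx to remove nodes either at random or in order of decreasing degree and tracks the relative size of the largest connected component; the graph size and removal fractions are arbitrary.

```python
# Minimal sketch of error vs. attack tolerance on a scale-free (Barabasi-Albert)
# network: remove nodes at random or by descending degree and track the size of
# the largest connected component. Parameters are arbitrary.
import random
import networkx as nx

def giant_fraction(g):
    if g.number_of_nodes() == 0:
        return 0.0
    return max(len(c) for c in nx.connected_components(g)) / g.number_of_nodes()

def remove_and_track(g, order, steps=5, frac=0.05):
    g = g.copy()
    batch = int(frac * g.number_of_nodes())
    sizes = []
    for _ in range(steps):
        for node in order(g)[:batch]:
            g.remove_node(node)
        sizes.append(round(giant_fraction(g), 2))
    return sizes

random.seed(1)
g = nx.barabasi_albert_graph(n=2000, m=2, seed=1)

random_order = lambda g: random.sample(list(g.nodes()), g.number_of_nodes())
attack_order = lambda g: [n for n, _ in sorted(g.degree, key=lambda kv: kv[1], reverse=True)]

print("random failures, giant-component fraction:", remove_and_track(g, random_order))
print("targeted attack, giant-component fraction:", remove_and_track(g, attack_order))
```

Random removals barely dent the giant component, whereas deleting the highest-degree hubs fragments it quickly, which is the attack vulnerability the abstract describes.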
Strategies for expanding health insurance coverage in vulnerable populations.
Jia, Liying; Yuan, Beibei; Huang, Fei; Lu, Ying; Garner, Paul; Meng, Qingyue
2014-11-26
Health insurance has the potential to improve access to health care and protect people from the financial risks of diseases. However, health insurance coverage is often low, particularly for people most in need of protection, including children and other vulnerable populations. To assess the effectiveness of strategies for expanding health insurance coverage in vulnerable populations. We searched the Cochrane Central Register of Controlled Trials (CENTRAL), part of The Cochrane Library, www.thecochranelibrary.com (searched 2 November 2012), PubMed (searched 1 November 2012), EMBASE (searched 6 July 2012), Global Health (searched 6 July 2012), IBSS (searched 6 July 2012), WHO Library Database (WHOLIS) (searched 1 November 2012), IDEAS (searched 1 November 2012), ISI-Proceedings (searched 1 November 2012), OpenGrey (changed from OpenSIGLE) (searched 1 November 2012), African Index Medicus (searched 1 November 2012), BLDS (searched 1 November 2012), Econlit (searched 1 November 2012), ELDIS (searched 1 November 2012), ERIC (searched 1 November 2012), HERDIN NeON Database (searched 1 November 2012), IndMED (searched 1 November 2012), JSTOR (searched 1 November 2012), LILACS (searched 1 November 2012), NTIS (searched 1 November 2012), PAIS (searched 6 July 2012), Popline (searched 1 November 2012), ProQuest Dissertation & Theses Database (searched 1 November 2012), PsycINFO (searched 6 July 2012), SSRN (searched 1 November 2012), Thai Index Medicus (searched 1 November 2012), World Bank (searched 2 November 2012), WanFang (searched 3 November 2012), China National Knowledge Infrastructure (CHKD-CNKI) (searched 2 November 2012). In addition, we searched the reference lists of included studies and carried out a citation search for the included studies via Web of Science to find other potentially relevant studies. Randomised controlled trials (RCTs), non-randomised controlled trials (NRCTs), controlled before-after (CBA) studies and interrupted time series (ITS) studies that evaluated the effects of strategies on increasing health insurance coverage for vulnerable populations. We defined strategies as measures to improve the enrolment of vulnerable populations into health insurance schemes. Two categories and six specified strategies were identified as the interventions. At least two review authors independently extracted data and assessed the risk of bias. We undertook a structured synthesis. We included two studies, both from the United States. People offered health insurance information and application support by community-based case managers were probably more likely to enrol their children into health insurance programmes (risk ratio (RR) 1.68, 95% confidence interval (CI) 1.44 to 1.96, moderate quality evidence) and were probably more likely to continue insuring their children (RR 2.59, 95% CI 1.95 to 3.44, moderate quality evidence). Of all the children that were insured, those in the intervention group may have been insured quicker (47.3 fewer days, 95% CI 20.6 to 74.0 fewer days, low quality evidence) and parents may have been more satisfied on average (satisfaction score average difference 1.07, 95% CI 0.72 to 1.42, low quality evidence). In the second study, applications were handed out in emergency departments at hospitals, compared to not handing out applications, and may have had an effect on enrolment (RR 1.5, 95% CI 1.03 to 2.18, low quality evidence).
Community-based case managers who provide health insurance information and application support, and who negotiate with the insurer, probably increase enrolment of children in health insurance schemes. However, the transferability of this intervention to other populations or other settings is uncertain. Handing out insurance application materials in hospital emergency departments may help increase the enrolment of children in health insurance schemes. Further studies evaluating the effectiveness of different strategies for expanding health insurance coverage in vulnerable populations are needed in different settings, with careful attention given to study design.
Prediction of individualized therapeutic vulnerabilities in cancer from genomic profiles
Aksoy, Bülent Arman; Demir, Emek; Babur, Özgün; Wang, Weiqing; Jing, Xiaohong; Schultz, Nikolaus; Sander, Chris
2014-01-01
Motivation: Somatic homozygous deletions of chromosomal regions in cancer, while not necessarily oncogenic, may lead to therapeutic vulnerabilities specific to cancer cells compared with normal cells. A recently reported example is the loss of one of the two isoenzymes in glioblastoma cancer cells such that the use of a specific inhibitor selectively inhibited growth of the cancer cells, which had become fully dependent on the second isoenzyme. We have now made use of the unprecedented conjunction of large-scale cancer genomics profiling of tumor samples in The Cancer Genome Atlas (TCGA) and of tumor-derived cell lines in the Cancer Cell Line Encyclopedia, as well as the availability of integrated pathway information systems, such as Pathway Commons, to systematically search for a comprehensive set of such epistatic vulnerabilities. Results: Based on homozygous deletions affecting metabolic enzymes in 16 TCGA cancer studies and 972 cancer cell lines, we identified 4104 candidate metabolic vulnerabilities present in 1019 tumor samples and 482 cell lines. Up to 44% of these vulnerabilities can be targeted with at least one Food and Drug Administration-approved drug. We suggest focused experiments to test these vulnerabilities and clinical trials based on personalized genomic profiles of those that pass preclinical filters. We conclude that genomic profiling will in the future provide a basis for network pharmacology of epistatic vulnerabilities as a promising therapeutic strategy. Availability and implementation: A web-based tool for exploring all vulnerabilities and their details is available at http://cbio.mskcc.org/cancergenomics/statius/ along with supplemental data files. Contact: statius@cbio.mskcc.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24665131
A Web-Based Platform for Educating Researchers About Bioethics and Biobanking.
Sehovic, Ivana; Gwede, Clement K; Meade, Cathy D; Sodeke, Stephen; Pentz, Rebecca; Quinn, Gwendolyn P
2016-06-01
Participation in biobanking among individuals with familial risk for hereditary cancer (IFRs) and underserved/minority populations is vital for biobanking research. To address gaps in researcher knowledge regarding ethical concerns of these populations, we developed a web-based curriculum. Based on formative research and expert panel assessments, a curriculum and website was developed in an integrative, systematic manner. Researchers were recruited to evaluate the curriculum. Public health graduate students were recruited to pilot test the curriculum. All 14 researchers agreed the curriculum was easy to understand, adequately addressed the domains, and contained appropriate post-test questions. The majority evaluated the dialogue animations as interesting and valuable. Twenty-two graduate students completed the curriculum, and 77% improved their overall test score. A web-based curriculum is an acceptable and effective way to provide information to researchers about vulnerable populations' biobanking concerns. Future goals are to incorporate the curriculum with larger organizations.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-17
...; Comment Request; NCI Cancer Genetics Services Directory Web-Based Application Form and Update Mailer... currently valid OMB control number. Proposed Collection: Title: NCI Cancer Genetics Services Directory Web... application form and the Web-based update mailer is to collect information about genetics professionals to be...
Enhancing UCSF Chimera through web services
Huang, Conrad C.; Meng, Elaine C.; Morris, John H.; Pettersen, Eric F.; Ferrin, Thomas E.
2014-01-01
Integrating access to web services with desktop applications allows for an expanded set of application features, including performing computationally intensive tasks and convenient searches of databases. We describe how we have enhanced UCSF Chimera (http://www.rbvi.ucsf.edu/chimera/), a program for the interactive visualization and analysis of molecular structures and related data, through the addition of several web services (http://www.rbvi.ucsf.edu/chimera/docs/webservices.html). By streamlining access to web services, including the entire job submission, monitoring and retrieval process, Chimera makes it simpler for users to focus on their science projects rather than data manipulation. Chimera uses Opal, a toolkit for wrapping scientific applications as web services, to provide scalable and transparent access to several popular software packages. We illustrate Chimera's use of web services with an example workflow that interleaves use of these services with interactive manipulation of molecular sequences and structures, and we provide an example Python program to demonstrate how easily Opal-based web services can be accessed from within an application. Web server availability: http://webservices.rbvi.ucsf.edu/opal2/dashboard?command=serviceList. PMID:24861624
pWeb: A High-Performance, Parallel-Computing Framework for Web-Browser-Based Medical Simulation.
Halic, Tansel; Ahn, Woojin; De, Suvranu
2014-01-01
This work presents pWeb, a new language and compiler for the parallelization of client-side compute-intensive web applications such as surgical simulations. The recently introduced HTML5 standard has enabled creating unprecedented applications on the web. The low performance of the web browser, however, remains the bottleneck of computationally intensive applications, including visualization of complex scenes, real-time physical simulations and image processing, compared to native ones. The new proposed language is built upon web workers for multithreaded programming in HTML5. The language provides fundamental functionalities of parallel programming languages as well as the fork/join parallel model, which is not supported by web workers. The language compiler automatically generates an equivalent parallel script that complies with the HTML5 standard. A case study on realistic rendering for surgical simulations demonstrates enhanced performance with a compact set of instructions.
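pWeb itself targets HTML5 web workers and JavaScript, but the fork/join model it adds can be illustrated in any language: split a frame into independent rows, process them concurrently, then join the results. The Python sketch below is only an analogue of that model, with a made-up per-row kernel.

```python
# Minimal fork/join analogue in Python (pWeb targets HTML5 web workers and
# JavaScript; this only illustrates the fork/join model it provides).
from concurrent.futures import ProcessPoolExecutor

def shade_row(args):
    """Stand-in for per-scanline work in a rendering or simulation kernel."""
    row, width = args
    return [((row * 31 + col * 17) % 256) for col in range(width)]  # hypothetical shading

def render(height=240, width=320, workers=4):
    with ProcessPoolExecutor(max_workers=workers) as pool:            # fork
        rows = list(pool.map(shade_row, ((r, width) for r in range(height))))
    return rows                                                       # join

if __name__ == "__main__":
    frame = render()
    print(f"rendered {len(frame)} x {len(frame[0])} frame; first pixels: {frame[0][:5]}")
```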
Design and Implementation of High Interaction Client Honeypot for Drive-by-Download Attacks
NASA Astrophysics Data System (ADS)
Akiyama, Mitsuaki; Iwamura, Makoto; Kawakoya, Yuhei; Aoki, Kazufumi; Itoh, Mitsutaka
Nowadays, the number of web-browser targeted attacks that lead users to adversaries' web sites and exploit web browser vulnerabilities is increasing, and a clarification of their methods and countermeasures is urgently needed. In this paper, we introduce the design and implementation of a new client honeypot for drive-by-download attacks that has the capacity to detect and investigate a variety of malicious web sites. On the basis of the problems of existing client honeypots, we enumerate the requirements of a client honeypot: 1) detection accuracy and variety, 2) collection variety, 3) performance efficiency, and 4) safety and stability. We improve our system with regard to these requirements. The key features of our developed system are stepwise detection focusing on exploit phases, multiple crawler processing, tracking of malware distribution networks, and malware infection prevention. Our evaluation of our developed system in a laboratory experiment and field experiment indicated that its detection variety and crawling performance are higher than those of existing client honeypots. In addition, our system is able to collect information for countermeasures and is secure and stable for continuous operation. We conclude that our system can investigate malicious web sites comprehensively and support countermeasures.
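The stepwise, exploit-phase-focused inspection described here can be sketched as a pipeline of checks over crawled page content, for example as a cheap pre-filter before escalating a URL to the high-interaction honeypot. The signatures and page contents below are hypothetical strings, not real detection rules or network code.

```python
# Minimal sketch of stepwise (phase-by-phase) inspection of crawled page content,
# in the spirit of a low-cost pre-filter ahead of a high-interaction client
# honeypot. Signatures and page contents are hypothetical.
import re

PHASES = [
    ("redirection", re.compile(r"<iframe[^>]+src=", re.I)),
    ("obfuscation", re.compile(r"(unescape|eval)\s*\(", re.I)),
    ("exploit",     re.compile(r"(shellcode|heap\s*spray)", re.I)),
]

def inspect(url, html):
    hits = [name for name, pattern in PHASES if pattern.search(html)]
    return {"url": url, "phases": hits, "suspicious": len(hits) >= 2}

crawled = {
    "http://example.test/benign": "<html><body>weather report</body></html>",
    "http://example.test/landing": "<iframe src='//bad.test/x'></iframe>"
                                   "<script>eval(unescape('%75...'))</script>",
}

for url, html in crawled.items():
    result = inspect(url, html)
    flag = "escalate to honeypot" if result["suspicious"] else "ok"
    print(f"{url}: phases={result['phases']} -> {flag}")
```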
Assessing application vulnerability to radiation-induced SEUs in memory
NASA Technical Reports Server (NTRS)
Springer, P. L.
2001-01-01
One of the goals of the Remote Exploration and Experimentation (REE) project at JPL is to determine how vulnerable applications are to single event upsets (SEUs) when run in low radiation space environments using commercial-off-the-shelf (COTS) components.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-15
... Request; NCI Cancer Genetics Services Directory Web-Based Application Form and Update Mailer. Summary: In ... Cancer Genetics Services Directory Web-based Application Form and Update Mailer. ...
Web services in the U.S. geological survey streamstats web application
Guthrie, J.D.; Dartiguenave, C.; Ries, Kernell G.
2009-01-01
StreamStats is a U.S. Geological Survey Web-based GIS application developed as a tool for water-resources planning and management, engineering design, and other applications. StreamStats' primary functionality allows users to obtain drainage-basin boundaries, basin characteristics, and streamflow statistics for gaged and ungaged sites. Recently, Web services have been developed that provide remote users and applications with the capability to access comprehensive GIS tools that are available in StreamStats, including delineating drainage-basin boundaries, computing basin characteristics, estimating streamflow statistics for user-selected locations, and determining point features that coincide with a National Hydrography Dataset (NHD) reach address. For the state of Kentucky, a web service has also been developed that provides users the ability to estimate daily time series of drainage-basin average values of daily precipitation and temperature. The use of web services allows the user to take full advantage of the datasets and processes behind the StreamStats application without having to develop and maintain them. © 2009 IEEE.
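Calling such a delineation service from a client application amounts to a parameterised HTTP request that returns GIS data. The sketch below is generic and hedged: the endpoint URL and parameter names are hypothetical placeholders and do not describe the actual USGS StreamStats API.

```python
# Generic sketch of calling a REST-style basin-delineation web service from a
# client. The endpoint URL and parameter names are hypothetical placeholders and
# do not describe the actual USGS StreamStats API.
import json
import urllib.error
import urllib.parse
import urllib.request

def delineate_basin(base_url, lon, lat, state_code):
    """Request a basin boundary (e.g. GeoJSON) for an outlet point."""
    params = urllib.parse.urlencode({"xlocation": lon, "ylocation": lat, "rcode": state_code})
    with urllib.request.urlopen(f"{base_url}?{params}", timeout=30) as resp:
        return json.load(resp)

if __name__ == "__main__":
    try:
        # Placeholder endpoint; substitute a real service URL before use.
        print(delineate_basin("https://example.invalid/delineate", -84.5, 38.0, "KY"))
    except urllib.error.URLError as err:
        print("service not reachable in this sketch:", err)
```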
A randomized controlled study about the use of eHealth in the home health care of premature infants.
Gund, Anna; Sjöqvist, Bengt Arne; Wigert, Helena; Hentz, Elisabet; Lindecrantz, Kaj; Bry, Kristina
2013-02-09
One area where the use of information and communication technology (ICT), or eHealth, could be developed is the home health care of premature infants. The aim of this randomized controlled study was to investigate whether the use of video conferencing or a web application improves parents' satisfaction in taking care of a premature infant at home and decreases the need of home visits. In addition, nurses' attitudes regarding the use of these tools were examined. Thirty-four families were randomized to one of three groups before their premature infant was discharged from the hospital to home health care: a control group receiving standard home health care (13 families); a web group receiving home health care supplemented with the use of a web application (12 families); a video group with home health care supplemented with video conferencing using Skype (9 families). Families and nursing staff answered questionnaires about the usefulness of ICT. In addition, semi-structured interviews were conducted with 16 families. All the parents in the web group found the web application easy to use. 83% of the families thought it was good to have access to their child's data through the application. All the families in the video group found Skype easy to use and were satisfied with the video calls. 88% of the families thought that video calls were better than ordinary phone calls. 33% of the families in the web group and 75% of those in the video group thought the need for home visits was decreased by the web application or Skype. 50% of the families in the web group and 100% of those in the video group thought the web application or the video calls had helped them feel more confident in caring for their child. Most of the nurses were motivated to use ICT but some were reluctant and avoided using the web application and video conferencing. The families were satisfied with both the web application and video conferencing. The families readily embraced the use of ICT, whereas motivating some of the nurses to accept and use ICT was a major challenge.
Enhancing promotional strategies within social marketing programs: use of Web 2.0 social media.
Thackeray, Rosemary; Neiger, Brad L; Hanson, Carl L; McKenzie, James F
2008-10-01
The second generation of Internet-based applications (i.e., Web 2.0), in which users control communication, holds promise to significantly enhance promotional efforts within social marketing campaigns. Web 2.0 applications can directly engage consumers in the creative process by both producing and distributing information through collaborative writing, content sharing, social networking, social bookmarking, and syndication. Web 2.0 can also enhance the power of viral marketing by increasing the speed at which consumers share experiences and opinions with progressively larger audiences. Because of the novelty and potential effectiveness of Web 2.0, social marketers may be enticed to prematurely incorporate related applications into promotional plans. However, as strategic issues such as priority audience preferences, selection of appropriate applications, tracking and evaluation, and related costs are carefully considered, Web 2.0 will expand to allow health promotion practitioners more direct access to consumers with less dependency on traditional communication channels.
NASA Astrophysics Data System (ADS)
Lykiardopoulos, A.; Iona, A.; Lakes, V.; Batis, A.; Balopoulos, E.
2009-04-01
The development of new technologies aimed at enhancing Web applications with dynamic data access was also the starting point for geospatial Web applications to be developed. By means of these technologies, Web applications embed the capability of presenting geographical representations of the geo-information. The introduction of the state-of-the-art technologies known as Web services enables Web applications to interoperate, i.e. to process requests from each other via a network. Within the oceanographic community in particular, modern geographical information systems based on geospatial Web services are now being developed, or will be developed in the near future, with the capability of managing the information fully through Web-based geographical interfaces. The exploitation of the HNODC database through a Web-based application enhanced with Web services and built with open-source tools may be considered an ideal case of such an implementation. The Hellenic National Oceanographic Data Center (HNODC), a national public oceanographic data provider and a member of the international network of oceanographic data centres (IOC/IODE), holds a very large volume of data and related information about the marine ecosystem. For the efficient management and exploitation of these data, a relational database has been constructed, storing more than 300,000 station records of physical, chemical and biological oceanographic information. A modern Web application that allows end users worldwide to explore and navigate the HNODC data through an interface capable of presenting geographical representations of the geo-information is today a fact. The application is built from state-of-the-art software components and tools such as:
• geospatial and non-spatial Web service mechanisms;
• geospatial open-source tools for the creation of dynamic geographical representations;
• communication protocols (messaging mechanisms) in all layers, such as XML and GML, together with the SOAP protocol via Apache Axis.
At the same time, the application may interact with any other SOA application, either sending or receiving geospatial data through geographical layers, since it inherits the big advantage of interoperability between Web services systems. Roughly, the architecture can be described as follows:
• At the back end, the open-source PostgreSQL DBMS serves as the data storage mechanism, with more than one database schema because geospatial and non-geospatial data are kept separate.
• UMN MapServer and GeoServer are the mechanisms for representing geospatial data via the Web Map Service (WMS), for querying and navigating geospatial and metadata information via the Web Feature Service (WFS), and, in the near future, for transacting and processing new or existing geospatial data via the Web Processing Service (WPS).
• Mapbender, geospatial portal site management software for OGC and OWS architectures, acts as the integration module between the geospatial mechanisms. Mapbender comes with an embedded data model capable of managing interfaces for displaying, navigating and querying OGC-compliant web map and feature services (WMS and transactional WFS).
• Apache and Tomcat stand as the Web service middle layers.
• Apache Axis, with its embedded implementation of SOAP ("Simple Object Access Protocol"), acts as the non-spatial Web services mechanism.
Some of these modules of the platform are still under development, but their implementation will be completed in the near future, together with a new Web user interface for the end user based on an enhanced and customized version of the Mapbender GUI, a powerful Web services client. For HNODC, the interoperability of Web services is the big advantage of the developed platform, since it will be capable of acting in the future as both a provider and a consumer of Web services:
• either as a data-products provider for external SOA platforms,
• or as a consumer of data products from external SOA platforms, for new applications to be developed or for existing applications to be enhanced.
A great example of data-management integration and dissemination via such technologies is the European Union research project SeaDataNet, whose main objective is to develop a standardized distributed system for managing and disseminating the large and diverse data sets and to enhance the currently existing infrastructures with Web services. Furthermore, when Web Processing Service (WPS) technology is mature enough and applicable for development, the derived data products will be able to offer any kind of GIS functionality to consumers across the network. From this point of view, HNODC joins the global scientific community by providing and consuming application-independent data products.
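The WMS interface named in this architecture is a plain HTTP GET protocol whose GetMap parameters are fixed by the OGC specification. The sketch below assembles such a request URL for a hypothetical server and layer; only the parameter names follow the standard, everything else is an assumption.

```python
# Minimal sketch of building an OGC WMS 1.1.1 GetMap request URL, as a client of
# services like those exposed by UMN MapServer or GeoServer. The server URL and
# layer name are hypothetical.
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, size=(800, 600), srs="EPSG:4326"):
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),   # minx,miny,maxx,maxy
        "WIDTH": size[0],
        "HEIGHT": size[1],
        "FORMAT": "image/png",
    }
    return f"{base_url}?{urlencode(params)}"

# Hypothetical endpoint and layer covering an Aegean Sea bounding box.
print(wms_getmap_url("https://example.invalid/wms", "hnodc:stations", (22.0, 35.0, 28.0, 41.0)))
```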
Employing WebGL to develop interactive stereoscopic 3D content for use in biomedical visualization
NASA Astrophysics Data System (ADS)
Johnston, Semay; Renambot, Luc; Sauter, Daniel
2013-03-01
Web Graphics Library (WebGL), the forthcoming web standard for rendering native 3D graphics in a browser, represents an important addition to the biomedical visualization toolset. It is projected to become a mainstream method of delivering 3D online content due to shrinking support for third-party plug-ins. Additionally, it provides a virtual reality (VR) experience to web users accommodated by the growing availability of stereoscopic displays (3D TV, desktop, and mobile). WebGL's value in biomedical visualization has been demonstrated by applications for interactive anatomical models, chemical and molecular visualization, and web-based volume rendering. However, a lack of instructional literature specific to the field prevents many from utilizing this technology. This project defines a WebGL design methodology for a target audience of biomedical artists with a basic understanding of web languages and 3D graphics. The methodology was informed by the development of an interactive web application depicting the anatomy and various pathologies of the human eye. The application supports several modes of stereoscopic displays for a better understanding of 3D anatomical structures.
Development of a Web-based financial application System
NASA Astrophysics Data System (ADS)
Hasan, M. R.; Ibrahimy, M. I.; Motakabber, S. M. A.; Ferdaus, M. M.; Khan, M. N. H.; Mostafa, M. G.
2013-12-01
The paper describes a technique for developing a web-based financial system, following the latest technology and business needs. In the development of a web-based application, both user friendliness and technology are very important. The ASP.NET MVC 4 platform and SQL Server 2008 are used for the development of the web-based financial system. The entry system and report monitoring of the application are shown to be user friendly. This paper also highlights critical situations in development, which will help to develop a quality product.
Chu, Larry F; Young, Chelsea A; Zamora, Abby K; Lowe, Derek; Hoang, Dan B; Pearl, Ronald G; Macario, Alex
2011-02-01
Despite the use of web-based information resources by both anesthesia departments and applicants, little research has been done to assess these resources and determine whether they are meeting applicant needs. Evidence is needed to guide anesthesia informatics research in developing high-quality anesthesia residency program Web sites (ARPWs). We used an anonymous web-based program (SurveyMonkey, Portland, OR) to distribute a survey investigating the information needs and perceived usefulness of ARPWs to all 572 Stanford anesthesia residency program applicants. A quantitative scoring system was then created to assess the quality of ARPWs in meeting the information needs of these applicants. Two researchers independently analyzed all 131 ARPWs in the United States to determine whether the ARPWs met the needs of applicants based on the scoring system. Finally, a qualitative assessment of the overall user experience of ARPWs was developed to account for the subjective elements of the Web site's presentation. Ninety-eight percent of respondents reported having used ARPWs during the application process. Fifty-six percent reported first visiting the Stanford ARPW when deciding whether to apply to Stanford's anesthesia residency program. Multimedia and Web 2.0 technologies were "very" or "most" useful in "learning intangible aspects of a program, like how happy people are" (42% multimedia and Web 2.0 versus 14% text and photos). ARPWs, on average, contained only 46% of the content items identified as important by applicants. The average (SD) quality scores among all ARPWs was 2.06 (0.59) of 4.0 maximum points. The mean overall qualitative score for all 131 ARPWs was 4.97 (1.92) of 10 points. Only 2% of applicants indicated that the majority (75%-100%) of Web sites they visited provided a complete experience. Anesthesia residency applicants rely heavily on ARPWs to research programs, prepare for interviews, and formulate a rank list. Anesthesia departments can improve their ARPWs by including information such as total hours worked and work hours by rotation (missing in 96% and 97% of ARPWs) and providing a valid web address on the Fellowship and Residency Electronic Interactive Database Access System (FREIDA) (missing in 28% of ARPWs).
ERIC Educational Resources Information Center
Kim, Minsung; Kim, Kamyoung; Lee, Sang-Il
2013-01-01
This article examines the pedagogical potential of a Web-based GIS application, Population Migration Web Service (PMWS), in which students can examine population geography in an interactive and exploratory manner. This article introduces PMWS, a tailored, unique Internet GIS application that provides functions for visualizing spatial interaction…
Modelling Safe Interface Interactions in Web Applications
NASA Astrophysics Data System (ADS)
Brambilla, Marco; Cabot, Jordi; Grossniklaus, Michael
Current Web applications embed sophisticated user interfaces and business logic. The original interaction paradigm of the Web based on static content pages that are browsed by hyperlinks is, therefore, not valid anymore. In this paper, we advocate a paradigm shift for browsers and Web applications, that improves the management of user interaction and browsing history. Pages are replaced by States as basic navigation nodes, and Back/Forward navigation along the browsing history is replaced by a full-fledged interactive application paradigm, supporting transactions at the interface level and featuring Undo/Redo capabilities. This new paradigm offers a safer and more precise interaction model, protecting the user from unexpected behaviours of the applications and the browser.
NASA Astrophysics Data System (ADS)
Tsagarakis, K.; Coll, M.; Giannoulaki, M.; Somarakis, S.; Papaconstantinou, C.; Machias, A.
2010-06-01
A mass-balance trophic model was built to describe the food-web traits of the North Aegean Sea (Strymonikos Gulf and Thracian Sea, Greece, Eastern Mediterranean) during the mid-2000s and to explore the impacts of fishing. This is the first food-web model representing the Aegean Sea, and results were presented and discussed in comparison to other previous ecosystems modelled from the western and the central areas of the basin (South Catalan and North-Central Adriatic Seas). Forty functional groups were defined, covering the entire trophic spectrum from lower to higher trophic levels. Emphasis was placed on commercial invertebrates and fish. The potential ecological role of the invasive ctenophore, Mnemiopsis leidyi, and several vulnerable groups (e.g., dolphins) was also explored. Results confirmed the spatial productivity patterns known for the Mediterranean Sea showing, for example, that the total biomass is highest in N.C. Adriatic and lowest in N. Aegean Sea. Accordingly, food-web flows and several ecosystem indicators like the mean transfer efficiency were influenced by these patterns. Nevertheless, all three systems shared some common features evidencing similarities of Mediterranean Sea ecosystems such as dominance of the pelagic fraction in terms of flows and strong benthic-pelagic coupling of zooplankton and benthic invertebrates through detritus. The importance of detritus highlighted the role of the microbial food-web, which was indirectly considered through detritus dynamics. Ciliates, mesozooplankton and several benthic invertebrate groups were shown as important elements of the ecosystem linking primary producers and detritus with higher trophic levels in the N. Aegean Sea. Adult anchovy was shown as the most important fish group in terms of production, consumption and overall effect on the rest of the ecological groups in the model, in line with results from the Western Mediterranean Sea. The five fishing fleets considered (both artisanal and industrial) had high impacts on vulnerable species and numerous targeted groups given the multispecies nature of the fisheries in the N. Aegean Sea. Several exploitation indices highlighted that the N. Aegean Sea ecosystem was highly exploited and unlikely to be sustainably fished, similarly to other Mediterranean marine ecosystems.
Gobe: an interactive, web-based tool for comparative genomic visualization.
Pedersen, Brent S; Tang, Haibao; Freeling, Michael
2011-04-01
Gobe is a web-based tool for viewing comparative genomic data. It supports viewing multiple genomic regions simultaneously. Its simple text format and flash-based rendering make it an interactive, exploratory research tool. Gobe can be used without installation through our web service, or downloaded and customized with stylesheets and javascript callback functions. Gobe is a flash application that runs in all modern web-browsers. The full source-code, including that for the online web application is available under the MIT license at: http://github.com/brentp/gobe. Sample applications are hosted at http://try-gobe.appspot.com/ and http://synteny.cnr.berkeley.edu/gobe-app/.
Chemozart: a web-based 3D molecular structure editor and visualizer platform.
Mohebifar, Mohamad; Sajadi, Fatemehsadat
2015-01-01
Chemozart is a 3D Molecule editor and visualizer built on top of native web components. It offers an easy to access service, user-friendly graphical interface and modular design. It is a client centric web application which communicates with the server via a representational state transfer style web service. Both client-side and server-side application are written in JavaScript. A combination of JavaScript and HTML is used to draw three-dimensional structures of molecules. With the help of WebGL, three-dimensional visualization tool is provided. Using CSS3 and HTML5, a user-friendly interface is composed. More than 30 packages are used to compose this application which adds enough flexibility to it to be extended. Molecule structures can be drawn on all types of platforms and is compatible with mobile devices. No installation is required in order to use this application and it can be accessed through the internet. This application can be extended on both server-side and client-side by implementing modules in JavaScript. Molecular compounds are drawn on the HTML5 Canvas element using WebGL context. Chemozart is a chemical platform which is powerful, flexible, and easy to access. It provides an online web-based tool used for chemical visualization along with result oriented optimization for cloud based API (application programming interface). JavaScript libraries which allow creation of web pages containing interactive three-dimensional molecular structures has also been made available. The application has been released under Apache 2 License and is available from the project website https://chemozart.com.
Web-based UMLS concept retrieval by automatic text scanning: a comparison of two methods.
Brandt, C; Nadkarni, P
2001-01-01
The Web is increasingly the medium of choice for multi-user application program delivery. Yet selection of an appropriate programming environment for rapid prototyping, code portability, and maintainability remains an issue. We summarize our experience with the conversion of a LISP Web application, Search/SR, to a new, functionally identical application, Search/SR-ASP, using a relational database and Active Server Pages (ASP) technology. Our results indicate that provision of easy access to database engines and external objects is almost essential for a development environment to be considered viable for rapid and robust application delivery. While LISP itself is a robust language, its use in Web applications may be hard to justify given that current vendor implementations do not provide such functionality. Alternative, currently available scripting environments for Web development appear to have most of LISP's advantages and few of its disadvantages.
A browser-based event display for the CMS Experiment at the LHC using WebGL
NASA Astrophysics Data System (ADS)
McCauley, T.
2017-10-01
Modern web browsers are powerful and sophisticated applications that support an ever-wider range of uses. One such use is rendering high-quality, GPU-accelerated, interactive 2D and 3D graphics in an HTML canvas. This can be done via WebGL, a JavaScript API based on OpenGL ES. Applications delivered via the browser have several distinct benefits for the developer and user. For example, they can be implemented using well-known and well-developed technologies, while distribution and use via a browser allows for rapid prototyping and deployment and ease of installation. In addition, delivery of applications via the browser allows for easy use on mobile, touch-enabled devices such as phones and tablets. iSpy WebGL is an application for visualization of events detected and reconstructed by the CMS Experiment at the Large Hadron Collider at CERN. The first event display developed for an LHC experiment to use WebGL, iSpy WebGL is a client-side application written in JavaScript, HTML, and CSS and uses three.js, a JavaScript library built on the WebGL API. iSpy WebGL is used for monitoring of CMS detector performance, for production of images and animations of CMS collision events for the public, as a virtual reality application using Google Cardboard, and as a tool available for public education and outreach such as in the CERN Open Data Portal and the CMS masterclasses. We describe here its design, development, and usage as well as future plans.
NASA Astrophysics Data System (ADS)
Salek, Mansour; Levison, Jana; Parker, Beth; Gharabaghi, Bahram
2018-06-01
Road salt is pervasively used throughout Canada and in other cold regions during winter. For cities relying exclusively on groundwater, it is important to plan and minimize the application of salt accordingly to mitigate the adverse effects of high chloride concentrations in water supply aquifers. The use of geospatial data (road network, land use, Quaternary and bedrock geology, average annual recharge, water-table depth, soil distribution, topography) in the DRASTIC methodology provides an efficient way of distinguishing salt-vulnerable areas associated with groundwater supply wells, to aid in the implementation of appropriate management practices for road salt application in urban areas. This research presents a GIS-based methodology to accomplish a vulnerability analysis for 12 municipal water supply wells within the City of Guelph, Ontario, Canada. The chloride application density (CAD) value at each supply well is calculated and related to the measured groundwater chloride concentrations and further combined with soil media and aquifer vadose- and saturated-zone properties used in DRASTIC. This combined approach, CAD-DRASTIC, is more accurate than existing groundwater vulnerability mapping methods and can be used by municipalities and other water managers to further improve groundwater protection related to road salt application.
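The abstract does not give the paper's exact CAD formulation, so the sketch below only illustrates one plausible reading of a chloride application density: chloride mass applied on road segments inside a well's contributing area, divided by that area. The formula's details and all values are assumptions, not the authors' method.

```python
# Illustrative sketch only: assumes a simple reading of "chloride application
# density" as chloride mass applied on road segments inside a well's contributing
# area, divided by that area. All values are hypothetical.

def chloride_application_density(road_segments, capture_area_m2):
    """road_segments: list of (length_m, lane_count, salt_kg_per_lane_m_per_year)."""
    chloride_fraction = 0.61  # approximate mass fraction of chloride in NaCl
    total_salt = sum(length * lanes * rate for length, lanes, rate in road_segments)
    return (total_salt * chloride_fraction) / capture_area_m2  # kg Cl / m^2 / year

segments = [
    (1200.0, 2, 2.5),   # arterial road inside the capture zone
    (800.0, 1, 1.0),    # residential street
]
cad = chloride_application_density(segments, capture_area_m2=2.5e6)
print(f"CAD ~= {cad:.4f} kg chloride per m^2 per year")
```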
Impacts of Urbanization on Precipitation and Storms: Physical Insights and Vulnerabilities
J. M. Shepherd
2013-01-01
On January 19, 2011, an unusual weather event occurred. The National Weather Service Web site (http://www.crh.noaa.gov/news/display_cmsstory.php?wfo=ddc&storyid=62980&source=0) described the event in the following way: "the atmosphere was cold and moist with low clouds and fog preceding the formation of the snow."
Securing SSL-VPN with LR-AKE to access personal health record.
Eizen, Kimura; Masato, Saito; Kazukuni, Kobara; Yoshihito, Nakato; Takuji, Kuroda; Ken, Ishihara
2013-01-01
Using SSL-VPN requires special considerations for well-known issues such as attackers exploiting web browser vulnerabilities and phishing sites using man-in-the-middle attacks. We used leakage-resilient authenticated key exchange (LR-AKE) to develop a comprehensive solution to SSL-VPN issues. Our results show that the LR-AKE should contribute to building a robust infrastructure for personal health records.
ERIC Educational Resources Information Center
Gomez, Fabinton Sotelo; Ordóñez, Armando
2016-01-01
Previously, a framework for integrating web resources providing educational services in dotLRN was presented. The present paper describes the application of this framework in a rural school in Cauca--Colombia. The case study includes two web resources on the topic of waves (physics), which is taught in secondary education. Web classes and…
Enhancing UCSF Chimera through web services.
Huang, Conrad C; Meng, Elaine C; Morris, John H; Pettersen, Eric F; Ferrin, Thomas E
2014-07-01
Integrating access to web services with desktop applications allows for an expanded set of application features, including performing computationally intensive tasks and convenient searches of databases. We describe how we have enhanced UCSF Chimera (http://www.rbvi.ucsf.edu/chimera/), a program for the interactive visualization and analysis of molecular structures and related data, through the addition of several web services (http://www.rbvi.ucsf.edu/chimera/docs/webservices.html). By streamlining access to web services, including the entire job submission, monitoring and retrieval process, Chimera makes it simpler for users to focus on their science projects rather than data manipulation. Chimera uses Opal, a toolkit for wrapping scientific applications as web services, to provide scalable and transparent access to several popular software packages. We illustrate Chimera's use of web services with an example workflow that interleaves use of these services with interactive manipulation of molecular sequences and structures, and we provide an example Python program to demonstrate how easily Opal-based web services can be accessed from within an application. Web server availability: http://webservices.rbvi.ucsf.edu/opal2/dashboard?command=serviceList. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
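The workflow the abstract describes, submitting a job to a remote service, monitoring it, and retrieving results, follows a common submit/poll/retrieve pattern. The Opal toolkit has its own client interfaces, so the sketch below is only a generic illustration of that pattern against a hypothetical REST-style job service; the URL, parameters, and response fields are assumptions, not Opal's actual API.

```typescript
// Generic job-submission pattern (hypothetical endpoint and fields, not Opal's API):
// submit a job, poll until it finishes, then fetch the output.
async function runRemoteJob(baseUrl: string, payload: object): Promise<string> {
  const submit = await fetch(`${baseUrl}/jobs`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  });
  const { jobId } = await submit.json();

  // Poll the (assumed) status endpoint until the job completes.
  for (;;) {
    const status = await fetch(`${baseUrl}/jobs/${jobId}/status`).then(r => r.json());
    if (status.state === 'DONE') break;
    if (status.state === 'FAILED') throw new Error('remote job failed');
    await new Promise(resolve => setTimeout(resolve, 5000));
  }
  return fetch(`${baseUrl}/jobs/${jobId}/output`).then(r => r.text());
}
```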
Just tell me what you want!: the promise and perils of rapid prototyping with the World Wide Web.
Cimino, J J; Socratous, S A
1996-01-01
Construction of applications using the World Wide Web architecture and Hypertext Markup Language (HTML) documents is relatively simple. We are exploring this approach with an application, called PolyMed, now in use by surgical residents for one year. We monitored use and obtained user feedback to develop new features and eliminate undesirable ones. The system has been used to keep track of over 4,200 patients. We predicted several advantages and disadvantages to this approach to prototyping clinical applications. Our experience confirms some advantages (ease of development and customization, ability to exploit non-Web system components, and simplified user interface design) and disadvantages (lack of database management services). Some predicted disadvantages failed to materialize (difficulty modeling a clinical application with hypertext and inconveniences associated with the "connectionless" nature of the Web). We were disappointed to find that while integration of external Web applications (such as Medline) into our application was easy, our users did not find it useful.
Secure Web-Site Access with Tickets and Message-Dependent Digests
Donato, David I.
2008-01-01
Although there are various methods for restricting access to documents stored on a World Wide Web (WWW) site (a Web site), none of the widely used methods is completely suitable for restricting access to Web applications hosted on an otherwise publicly accessible Web site. A new technique, however, provides a mix of features well suited for restricting Web-site or Web-application access to authorized users, including the following: secure user authentication, tamper-resistant sessions, simple access to user state variables by server-side applications, and clean session terminations. This technique, called message-dependent digests with tickets, or MDDT, maintains secure user sessions by passing single-use nonces (tickets) and message-dependent digests of user credentials back and forth between client and server. Appendix 2 provides a working implementation of MDDT with PHP server-side code and JavaScript client-side code.
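As a rough illustration of combining a single-use ticket with a message-dependent digest of credentials, the sketch below computes an HMAC over the ticket and the message body. It is only a simplified sketch of the general pattern, not the MDDT protocol defined in the report (whose reference implementation uses PHP and JavaScript).

```typescript
// Simplified sketch of a ticket + message-dependent digest exchange.
// Not the MDDT scheme from the report; key handling and ticket bookkeeping omitted.
import { createHmac, randomBytes } from 'crypto';

function issueTicket(): string {
  return randomBytes(16).toString('hex'); // single-use nonce sent to the client
}

function messageDigest(secret: string, ticket: string, message: string): string {
  // The digest depends on both the ticket and the message, so replaying either fails.
  return createHmac('sha256', secret).update(ticket).update(message).digest('hex');
}

// Server side: check that the client's digest matches, then retire the ticket.
// A production version would use a constant-time comparison (crypto.timingSafeEqual).
function verify(secret: string, ticket: string, message: string, digest: string): boolean {
  return messageDigest(secret, ticket, message) === digest;
}
```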
Young, Bradley L; Cantrell, Colin K; Patt, Joshua C; Ponce, Brent A
2018-06-01
Accessible, adequate online information is important to fellowship applicants. Program web sites can affect which programs applicants apply to, subsequently altering interview costs incurred by both parties and ultimately impacting rank lists. Web site analyses have been performed for all orthopaedic subspecialties other than those involved in the combined adult reconstruction and musculoskeletal (MSK) oncology fellowship match. A complete list of active programs was obtained from the official adult reconstruction and MSK oncology society web sites. Web site accessibility was assessed using a structured Google search. Accessible web sites were evaluated based on 21 previously reported content criteria. Seventy-four adult reconstruction programs and 11 MSK oncology programs were listed on the official society web sites. Web sites were identified and accessible for 58 (78%) adult reconstruction and 9 (82%) MSK oncology fellowship programs. No web site contained all content criteria and more than half of both adult reconstruction and MSK oncology web sites failed to include 12 of the 21 criteria. Several programs participating in the combined Adult Reconstructive Hip and Knee/Musculoskeletal Oncology Fellowship Match did not have accessible web sites. Of the web sites that were accessible, none contained comprehensive information and the majority lacked information that has been previously identified as being important to prospective applicants.
Browser-Based Online Applications: Something for Everyone!
ERIC Educational Resources Information Center
Descy, Don E.
2007-01-01
Just as many people log onto a Web mail site (Gmail, Yahoo, MSN, etc.) to read, write and store their email, there are Web sites out there with word processing, database, and a myriad of other software applications that are not downloadable but used on the site through a Web browser. The user does not have to download the applications to a…
Documenting clinical pharmacist intervention before and after the introduction of a web-based tool.
Nurgat, Zubeir A; Al-Jazairi, Abdulrazaq S; Abu-Shraie, Nada; Al-Jedai, Ahmed
2011-04-01
To develop a database for documenting pharmacist interventions through a web-based application. The secondary endpoint was to determine if the new, web-based application provides any benefits with regards to documentation compliance by clinical pharmacists and ease of calculating cost savings compared with our previous method of documenting pharmacist interventions. A tertiary care hospital in Saudi Arabia. The documentation of interventions using a web-based documentation application was retrospectively compared with previous methods of documentation of clinical pharmacists' interventions (multi-user PC software). The number and types of interventions recorded by pharmacists, data mining of archived data, efficiency, cost savings, and the accuracy of the data generated. The number of documented clinical interventions increased from 4,926, using the multi-user PC software, to 6,840 for the web-based application. On average, we observed 653 interventions per clinical pharmacist using the web-based application, which showed an increase compared to an average of 493 interventions using the old multi-user PC software. However, using a paired Student's t-test there was no statistically significant difference between the two means (P = 0.201). Using a χ² test, which captured management level and the type of system used, we found a strong effect of management level (P < 2.2 × 10⁻¹⁶) on the number of documented interventions. We also found a moderately significant relationship between educational level and the number of interventions documented (P = 0.045). The mean ± SD time required to document an intervention using the web-based application was 66.55 ± 8.98 s. Using the web-based application, 29.06% of documented interventions resulted in cost-savings, while using the multi-user PC software only 4.75% of interventions did so. The majority of cost savings across both platforms resulted from the discontinuation of unnecessary drugs and a change in dosage regimen. Data collection using the web-based application was consistently more complete when compared to the multi-user PC software. The web-based application is an efficient system for documenting pharmacist interventions. Its flexibility and accessibility, as well as its detailed report functionality, make it a useful tool that will hopefully encourage other primary and secondary care facilities to adopt similar applications.
eSciMart: Web Platform for Scientific Software Marketplace
NASA Astrophysics Data System (ADS)
Kryukov, A. P.; Demichev, A. P.
2016-10-01
In this paper we suggest a design for a web marketplace where users of scientific application software and databases, presented in the form of web services, and their providers are present simultaneously. The model on which the web marketplace will be based is close to the customer-to-customer (C2C) model, which has been used successfully, for example, on auction sites such as eBay (ebay.com). Unlike the classical C2C model, the suggested marketplace focuses on application software in the form of web services and on standardization of the API through which application software will be integrated into the web marketplace. A prototype of such a platform, entitled eSciMart, is currently being developed at SINP MSU.
NGL Viewer: a web application for molecular visualization
Rose, Alexander S.; Hildebrand, Peter W.
2015-01-01
The NGL Viewer (http://proteinformatics.charite.de/ngl) is a web application for the visualization of macromolecular structures. By fully adopting capabilities of modern web browsers, such as WebGL, for molecular graphics, the viewer can interactively display large molecular complexes and is also unaffected by the retirement of third-party plug-ins like Flash and Java Applets. Generally, the web application offers comprehensive molecular visualization through a graphical user interface so that life scientists can easily access and profit from available structural data. It supports common structural file-formats (e.g. PDB, mmCIF) and a variety of molecular representations (e.g. ‘cartoon, spacefill, licorice’). Moreover, the viewer can be embedded in other web sites to provide specialized visualizations of entries in structural databases or results of structure-related calculations. PMID:25925569
Evaluating Application Resilience with XRay
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Sui; Bronevetsky, Greg; Li, Bin
2015-05-07
The rising count and shrinking feature size of transistors within modern computers are making them increasingly vulnerable to various types of soft faults. This problem is especially acute in high-performance computing (HPC) systems used for scientific computing, because these systems include many thousands of compute cores and nodes, all of which may be utilized in a single large-scale run. The increasing vulnerability of HPC applications to errors induced by soft faults is motivating extensive work on techniques to make these applications more resilient to such faults, ranging from generic techniques such as replication or checkpoint/restart to algorithm-specific error detection and tolerance techniques. Effective use of such techniques requires a detailed understanding of how a given application is affected by soft faults to ensure that (i) efforts to improve application resilience are spent in the code regions most vulnerable to faults and (ii) the appropriate resilience technique is applied to each code region. This paper presents XRay, a tool to view the application vulnerability to soft errors, and illustrates how XRay can be used in the context of a representative application. In addition to providing actionable insights into application behavior, XRay automatically selects the number of fault injection experiments required to provide an informative view of application behavior, ensuring that the information is statistically well-grounded without performing unnecessary experiments.
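Fault-injection campaigns of the kind the abstract alludes to are often built on flipping individual bits in program state and checking whether the output diverges from a fault-free run. The sketch below is not XRay itself, only a minimal illustration of injecting a single bit flip into a double-precision value and detecting the resulting silent data corruption.

```typescript
// Minimal single-bit-flip fault injection into a 64-bit float (illustrative only).
function flipBit(value: number, bit: number): number {
  const view = new DataView(new ArrayBuffer(8));
  view.setFloat64(0, value);
  const byteIndex = Math.floor(bit / 8);
  view.setUint8(byteIndex, view.getUint8(byteIndex) ^ (1 << (bit % 8)));
  return view.getFloat64(0);
}

// Toy "application": sum an array, with an optional fault injected into one element.
function run(data: number[], faultAt?: { index: number; bit: number }): number {
  return data.reduce((sum, x, i) =>
    sum + (faultAt && i === faultAt.index ? flipBit(x, faultAt.bit) : x), 0);
}

const input = [1.5, 2.25, 3.0];
const golden = run(input);                         // fault-free reference result
const faulty = run(input, { index: 1, bit: 62 });  // inject one bit flip into the second element
console.log(`silent data corruption: ${faulty !== golden}`);
```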
Chilean geo client application for disasters
NASA Astrophysics Data System (ADS)
Suárez, Rodrigo F.; Lovison, Lucia; Potters, Martinus
2018-05-01
The global network of the Group on Earth Observation, GEO, connects all kinds of professionals from public and private institutions with data providers, sharing information to face the challenges of global changes and human development, and they are creating a Global Earth Observation System of Systems (GEOSS) to connect existing data infrastructures. A GEOSS Architecture Implementation Pilot Project for Disasters in Chile (AIP-8) was created as part of a capacity building initiative, and representatives of different national agencies in Chile, along with international experts, formed a GEOSS Capacity Building Working Group (Lovison et al, 2016). Consistent with the objectives of GEOSS AIP-8 Chile, we developed and implemented a prototype service based on web services, mobile applications and other communication channels, which allows connecting different sources of information, aiming to reduce population vulnerability to natural disasters such as earthquakes, flooding, wildfires and tsunamis, which is presented here. The GEO Chile client application is a JavaScript application using GEODAB brokering services, GIS technology and disaster information provided by national and international disaster services, including public and private organizations, where cartography becomes fundamental as a tool to provide realism and ubiquity to the information. Seven hotspots are targeted: the Calbuco, Copahue and Villarrica volcano areas; Valparaíso city, which is frequently a victim of wildfires in the zone where population meets forest; and Iquique, Illapel and Talcahuano, areas frequently struck by earthquakes and tsunamis.
Haugen, Hans Morten
2010-08-01
The article analyses the three terms autonomy, dignity and vulnerability. The relevance and practical application of the terms is tested in two spheres. First, as guiding principles in the area of ethics of medicines and science. Second, as human rights principles, serving to guide the conduct of public policies for an effective realization of human rights. The article argues that all human beings have the same dignity, but that the autonomy--and therefore vulnerability--differs considerably. Simply said, with reduced autonomy comes increased vulnerability, implying extra attention to the protective dimensions. The article finds that the three terms approach the protection of human beings in different ways and that all are relevant and applicable in both spheres, but that an isolated notion of autonomy and a 'group-based' notion of vulnerability are not adequate.
NASA Astrophysics Data System (ADS)
Demir, I.; Sermet, M. Y.
2016-12-01
Nobody is immune from extreme events or natural hazards that can lead to large-scale consequences for the nation and public. One of the solutions to reduce the impacts of extreme events is to invest in improving resilience with the ability to better prepare, plan, recover, and adapt to disasters. The National Research Council (NRC) report discusses the topic of how to increase resilience to extreme events through a vision of a resilient nation in the year 2030. The report highlights the importance of data, information, gaps and knowledge challenges that need to be addressed, and suggests that every individual access risk and vulnerability information to make their communities more resilient. This abstract presents our project on developing a resilience framework for flooding to improve societal preparedness, with the following objectives: (a) develop a generalized ontology for extreme events with primary focus on flooding; (b) develop a knowledge engine with voice recognition, artificial intelligence, natural language processing, and an inference engine. The knowledge engine will utilize the flood ontology and concepts to connect user input to relevant knowledge discovery outputs on flooding; (c) develop a data acquisition and processing framework from existing environmental observations, forecast models, and social networks. The system will utilize the framework, capabilities and user base of the Iowa Flood Information System (IFIS) to populate and test the system; (d) develop a communication framework to support user interaction and delivery of information to users. The interaction and delivery channels will include voice and text input via web-based system (e.g. IFIS), agent-based bots (e.g. Microsoft Skype, Facebook Messenger), smartphone and augmented reality applications (e.g. smart assistant), and automated web workflows (e.g. IFTTT, CloudWork) to open knowledge discovery for flooding to thousands of community extensible web workflows.
NASA Astrophysics Data System (ADS)
Yu, Weishui; Luo, Changshou; Zheng, Yaming; Wei, Qingfeng; Cao, Chengzhong
2017-09-01
To address the “last kilometer” problem in agricultural science and technology information services, we analyzed the feasibility, necessity and advantages of WebApps for agricultural information services and discussed modes of using WebApps in such services, based on a requirements analysis and the functions of WebApps. To overcome existing apps’ drawbacks of difficult installation and weak compatibility across mobile operating systems, the Beijing Agricultural Sci-tech Service Hotline WebApp was developed based on HTML and JAVA technology. The WebApp offers greater compatibility and simpler operation than a native app; moreover, it can be linked to the WeChat public platform, which makes it easy to distribute and to run directly without an installation process. The WebApp was used to provide agricultural expert consulting services and agricultural information push notifications, and obtained good preliminary results in practice. Finally, we summarized the innovative application of WebApps in agricultural consulting services and discussed prospects for their development in agricultural information services.
Value of Information Web Application
2015-04-01
their understanding of VoI attributes (source reliability, information content, and latency). The VoI web application emulates many features of a...only when using the Firefox web browser on those computers (Internet Explorer was not viable due to unchangeable user settings). During testing, the
SSE Announcement - New GIS Web Mapping Applications and Services
Atmospheric Science Data Center
2016-06-30
Dear SSE Users, We are excited to announce SSE-GIS v1.0.3 is now available! If you haven’t already noticed the link to the new SSE-GIS web application on the SSE homepage entitled “GIS Web Mapping ...
COEUS: “semantic web in a box” for biomedical applications
2012-01-01
Background As the “omics” revolution unfolds, the growth in data quantity and diversity is bringing about the need for pioneering bioinformatics software, capable of significantly improving the research workflow. To cope with these computer science demands, biomedical software engineers are adopting emerging semantic web technologies that better suit the life sciences domain. The latter’s complex relationships are easily mapped into semantic web graphs, enabling a superior understanding of collected knowledge. Despite increased awareness of semantic web technologies in bioinformatics, their use is still limited. Results COEUS is a new semantic web framework, aiming at a streamlined application development cycle and following a “semantic web in a box” approach. The framework provides a single package including advanced data integration and triplification tools, base ontologies, a web-oriented engine and a flexible exploration API. Resources can be integrated from heterogeneous sources, including CSV and XML files or SQL and SPARQL query results, and mapped directly to one or more ontologies. Advanced interoperability features include REST services, a SPARQL endpoint and LinkedData publication. These enable the creation of multiple applications for web, desktop or mobile environments, and empower a new knowledge federation layer. Conclusions The platform, targeted at biomedical application developers, provides a complete skeleton ready for rapid application deployment, enhancing the creation of new semantic information systems. COEUS is available as open source at http://bioinformatics.ua.pt/coeus/. PMID:23244467
COEUS: "semantic web in a box" for biomedical applications.
Lopes, Pedro; Oliveira, José Luís
2012-12-17
As the "omics" revolution unfolds, the growth in data quantity and diversity is bringing about the need for pioneering bioinformatics software, capable of significantly improving the research workflow. To cope with these computer science demands, biomedical software engineers are adopting emerging semantic web technologies that better suit the life sciences domain. The latter's complex relationships are easily mapped into semantic web graphs, enabling a superior understanding of collected knowledge. Despite increased awareness of semantic web technologies in bioinformatics, their use is still limited. COEUS is a new semantic web framework, aiming at a streamlined application development cycle and following a "semantic web in a box" approach. The framework provides a single package including advanced data integration and triplification tools, base ontologies, a web-oriented engine and a flexible exploration API. Resources can be integrated from heterogeneous sources, including CSV and XML files or SQL and SPARQL query results, and mapped directly to one or more ontologies. Advanced interoperability features include REST services, a SPARQL endpoint and LinkedData publication. These enable the creation of multiple applications for web, desktop or mobile environments, and empower a new knowledge federation layer. The platform, targeted at biomedical application developers, provides a complete skeleton ready for rapid application deployment, enhancing the creation of new semantic information systems. COEUS is available as open source at http://bioinformatics.ua.pt/coeus/.
Optimizing real-time Web-based user interfaces for observatories
NASA Astrophysics Data System (ADS)
Gibson, J. Duane; Pickering, Timothy E.; Porter, Dallan; Schaller, Skip
2008-08-01
In using common HTML/Ajax approaches for web-based data presentation and telescope control user interfaces at the MMT Observatory (MMTO), we rapidly were confronted with web browser performance issues. Much of the operational data at the MMTO is highly dynamic and is constantly changing during normal operations. Status of telescope subsystems must be displayed with minimal latency to telescope operators and other users. A major motivation of migrating toward web-based applications at the MMTO is to provide easy access to current and past observatory subsystem data for a wide variety of users on their favorite operating system through a familiar interface, their web browser. Performance issues, especially for user interfaces that control telescope subsystems, led to investigations of more efficient use of HTML/Ajax and web server technologies as well as other web-based technologies, such as Java and Flash/Flex. The results presented here focus on techniques for optimizing HTML/Ajax web applications with near real-time data display. This study indicates that direct modification of the contents or "nodeValue" attribute of text nodes is the most efficient method of updating data values displayed on a web page. Other optimization techniques are discussed for web-based applications that display highly dynamic data.
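The optimization the study singles out, writing new values directly into a text node's nodeValue rather than replacing element markup, looks roughly like the sketch below. The element id, JSON field, and polling endpoint are invented for illustration and are not the MMTO's actual interfaces.

```typescript
// Update a status readout by touching only the text node, avoiding the cost of
// re-parsing markup via innerHTML. Element id and endpoint are hypothetical.
function updateReadout(id: string, value: string): void {
  const el = document.getElementById(id);
  if (el && el.firstChild) {
    el.firstChild.nodeValue = value; // cheapest DOM write for a pure text update
  }
}

// Poll a (hypothetical) telemetry endpoint and refresh the display.
async function poll(): Promise<void> {
  const data = await fetch('/telemetry/azimuth').then(r => r.json());
  updateReadout('azimuth', Number(data.azimuth).toFixed(2));
  setTimeout(poll, 1000); // near real-time refresh without hammering the server
}
poll();
```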
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-08-21
NREL's Developer Network, developer.nrel.gov, provides data that users can access and feed into their own analyses and mobile and web applications. Developers can retrieve the data through a Web services API (application programming interface). The Developer Network handles the overhead of serving web services, such as key management, authentication, analytics, reporting, documentation standards, and throttling, in a common architecture, while allowing web services and APIs to be maintained and managed independently.
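A typical interaction with such an API gateway is an HTTP request carrying a per-developer key, which the gateway uses for authentication, throttling, and analytics. The endpoint path and response handling below are placeholders rather than a documented NREL service definition.

```typescript
// Hedged sketch of calling a key-managed Web services API. The path and query
// parameters are placeholders, not an actual NREL endpoint.
async function fetchDataset(apiKey: string): Promise<unknown> {
  const url = new URL('https://developer.nrel.gov/api/example/v1/data.json');
  url.searchParams.set('api_key', apiKey); // the gateway meters usage per key
  url.searchParams.set('format', 'json');

  const response = await fetch(url.toString());
  if (response.status === 429) {
    throw new Error('throttled: request rate for this key exceeded');
  }
  return response.json();
}
```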
ERIC Educational Resources Information Center
Diacopoulos, Mark M.
2015-01-01
The potential for social studies to embrace instructional technology and Web 2.0 applications has become a growing trend in recent social studies research. As part of an ongoing process of collaborative enquiry between an instructional specialist and social studies teachers in a Professional Learning Community, a table of Web 2.0 applications was…
Web Application Design Using Server-Side JavaScript
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hampton, J.; Simons, R.
1999-02-01
This document describes the application design philosophy for the Comprehensive Nuclear Test Ban Treaty Research & Development Web Site. This design incorporates object-oriented techniques to produce a flexible and maintainable system of applications that support the web site. These techniques will be discussed at length along with the issues they address. The overall structure of the applications and their relationships with one another will also be described. The current problems and future design changes will be discussed as well.
Accredited hand surgery fellowship Web sites: analysis of content and accessibility.
Trehan, Samir K; Morrell, Nathan T; Akelman, Edward
2015-04-01
To assess the accessibility and content of accredited hand surgery fellowship Web sites. A list of all accredited hand surgery fellowships was obtained from the online database of the American Society for Surgery of the Hand (ASSH). Fellowship program information on the ASSH Web site was recorded. All fellowship program Web sites were located via Google search. Fellowship program Web sites were analyzed for accessibility and content in 3 domains: program overview, application information/recruitment, and education. At the time of this study, there were 81 accredited hand surgery fellowships with 169 available positions. Thirty of 81 programs (37%) had a functional link on the ASSH online hand surgery fellowship directory; however, Google search identified 78 Web sites. Three programs did not have a Web site. Analysis of content revealed that most Web sites contained contact information, whereas information regarding the anticipated clinical, research, and educational experiences during fellowship was less often present. Furthermore, information regarding past and present fellows, salary, application process/requirements, call responsibilities, and case volume was frequently lacking. Overall, 52 of 81 programs (64%) had the minimal online information required for residents to independently complete the fellowship application process. Hand fellowship program Web sites could be accessed either via the ASSH online directory or Google search, except for 3 programs that did not have Web sites. Although most fellowship program Web sites contained contact information, other content such as application information/recruitment and education, was less frequently present. This study provides comparative data regarding the clinical and educational experiences outlined on hand fellowship program Web sites that are of relevance to residents, fellows, and academic hand surgeons. This study also draws attention to various ways in which the hand surgery fellowship application process can be made more user-friendly and efficient. Copyright © 2015 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
Developing web-based data analysis tools for precision farming using R and Shiny
NASA Astrophysics Data System (ADS)
Jahanshiri, Ebrahim; Mohd Shariff, Abdul Rashid
2014-06-01
Technologies that are set to increase the productivity of agricultural practices require more and more data. At the same time, farming data is becoming increasingly cheap to collect and maintain. The bulk of the data collected from sensors and samples needs to be analysed in an efficient and transparent manner. Web technologies have long been used to develop applications that can assist farmers and managers. However, until recently, analysing data in an online environment has not been an easy task, especially in the eyes of data analysts. This barrier is now overcome by the availability of new application programming interfaces that provide real-time web-based data analysis. In this paper, the development of a prototype web-based application for data analysis using new facilities of the R statistical package and its web development framework, Shiny, is explored. The pros and cons of this type of data analysis environment for precision farming are enumerated and future directions in web application development for agricultural data are discussed.
Clinical software development for the Web: lessons learned from the BOADICEA project
2012-01-01
Background In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. Results We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. Conclusions We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback. PMID:22490389
Clinical software development for the Web: lessons learned from the BOADICEA project.
Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F
2012-04-10
In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback.
Development and evaluation of a dynamic web-based application.
Hsieh, Yichuan; Brennan, Patricia Flatley
2007-10-11
Traditional consumer health informatics (CHI) applications developed for the lay public on the Web were commonly written in Hypertext Markup Language (HTML). As genetics knowledge advances rapidly and information must be updated in a timely fashion, a different content structure is needed to facilitate information delivery. This poster presents the process of developing a dynamic, database-driven Web CHI application.
Marine Air Ground Task Force Distribution In The Battlespace
2016-09-01
benefit of this research is a proposed systemic structure with an associated web application that provides the MAGTF commander with critical information for supporting operations. ... web analytics in order to support the decision making process. The potential benefit of this research is a methodology with associated application
Transitioning Client Based NALCOMIS to a Multi Function Web Based Application
2016-09-23
Thesis by Aaron P. Schnetzler. NALCOMIS has two configurations that are used by organizational and intermediate-level maintenance activities, Optimized Organizational
Web2Quests: Updating a Popular Web-Based Inquiry-Oriented Activity
ERIC Educational Resources Information Center
Kurt, Serhat
2009-01-01
WebQuest is a popular inquiry-oriented activity in which learners use Web resources. Since the creation of the innovation, almost 15 years ago, the Web has changed significantly, while the WebQuest technique has changed little. This article examines possible applications of new Web trends on WebQuest instructional strategy. Some possible…
web cellHTS2: a web-application for the analysis of high-throughput screening data.
Pelz, Oliver; Gilsdorf, Moritz; Boutros, Michael
2010-04-12
The analysis of high-throughput screening data sets is an expanding field in bioinformatics. High-throughput screens by RNAi generate large primary data sets which need to be analyzed and annotated to identify relevant phenotypic hits. Large-scale RNAi screens are frequently used to identify novel factors that influence a broad range of cellular processes, including signaling pathway activity, cell proliferation, and host cell infection. Here, we present a web-based application utility for the end-to-end analysis of large cell-based screening experiments by cellHTS2. The software guides the user through the configuration steps that are required for the analysis of single or multi-channel experiments. The web-application provides options for various standardization and normalization methods, annotation of data sets and a comprehensive HTML report of the screening data analysis, including a ranked hit list. Sessions can be saved and restored for later re-analysis. The web frontend for the cellHTS2 R/Bioconductor package interacts with it through an R-server implementation that enables highly parallel analysis of screening data sets. web cellHTS2 further provides a file import and configuration module for common file formats. The implemented web-application facilitates the analysis of high-throughput data sets and provides a user-friendly interface. web cellHTS2 is accessible online at http://web-cellHTS2.dkfz.de. A standalone version as a virtual appliance and source code for platforms supporting Java 1.5.0 can be downloaded from the web cellHTS2 page. web cellHTS2 is freely distributed under GPL.
Session management for web-based healthcare applications.
Wei, L.; Sengupta, S.
1999-01-01
In health care systems, users may access multiple applications during one session of interaction with the system. However, users must sign on to each application individually, and it is difficult to maintain a common context among these applications. We are developing a session management system for web-based applications using LDAP directory service, which will allow single sign-on to multiple web-based applications, and maintain a common context among those applications for the user. This paper discusses the motivations for building this system, the system architecture, and the challenges of our approach, such as the session objects management for the user, and session security. PMID:10566511
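The single sign-on idea in the abstract, one session carrying a shared context across several web applications, can be sketched as a server-side session store keyed by an opaque cookie value, with user attributes (for example, those looked up once from an LDAP directory) attached to the session rather than to any one application. The store below is an in-memory stand-in and the field names are illustrative; it is not the system described in the paper.

```typescript
// In-memory sketch of a shared session with a common context across applications.
// A real deployment would back this with a directory service and a persistent store.
import { randomUUID } from 'crypto';

interface SessionContext {
  userId: string;
  roles: string[];                  // e.g. attributes resolved from LDAP at sign-on
  shared: Record<string, string>;   // context visible to every participating application
  expires: number;
}

const sessions = new Map<string, SessionContext>();

function signOn(userId: string, roles: string[]): string {
  const sessionId = randomUUID();   // opaque value handed to the browser as a cookie
  sessions.set(sessionId, { userId, roles, shared: {}, expires: Date.now() + 30 * 60_000 });
  return sessionId;
}

// Any application presented with the cookie can read and extend the common context.
function getContext(sessionId: string): SessionContext | undefined {
  const ctx = sessions.get(sessionId);
  if (!ctx || ctx.expires < Date.now()) {
    sessions.delete(sessionId);
    return undefined;
  }
  return ctx;
}
```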
NASA Astrophysics Data System (ADS)
Tellman, B.; Schwarz, B.
2014-12-01
This talk describes the development of a web application to predict and communicate vulnerability to floods given publicly available data, disaster science, and geotech cloud capabilities. The proof of concept in the Google Earth Engine API, with initial testing on case studies in New York and Uttarakhand, India, demonstrates the potential of highly parallelized cloud computing to model socio-ecological disaster vulnerability at high spatial and temporal resolution and in near real time. Cloud computing facilitates statistical modeling with variables derived from large public social and ecological data sets, including census data, nighttime lights (NTL), and World Pop to derive social parameters, together with elevation, satellite imagery, rainfall, and observed flood data from the Dartmouth Flood Observatory to derive biophysical parameters. While more traditional, physically based hydrological models that rely on flow algorithms and numerical methods are currently unavailable in parallelized computing platforms like Google Earth Engine, there is high potential to explore "data driven" modeling that trades physics for statistics in a parallelized environment. A data-driven approach to flood modeling with geographically weighted logistic regression has been initially tested on Hurricane Irene in southeastern New York. Comparison of model results with observed flood data reveals 97% accuracy of the model in predicting flooded pixels. Testing on multiple storms is required to further validate this initial promising approach. A statistical social-ecological flood model that could produce rapid vulnerability assessments to predict who might require immediate evacuation and where could serve as an early warning. This type of early warning system would be especially relevant in data-poor places lacking the computing power, high-resolution data such as LiDAR and stream gauges, or hydrologic expertise to run physically based models in real time. As the data-driven model presented relies on globally available data, the only real time data input required would be typical data from a weather service, e.g. precipitation or coarse resolution flood prediction. However, model uncertainty will vary locally depending upon the resolution and frequency of observed flood and socio-economic damage impact data.
HTSstation: A Web Application and Open-Access Libraries for High-Throughput Sequencing Data Analysis
David, Fabrice P. A.; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J.; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion
2014-01-01
The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of High-Throughput Sequencing including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation offers biologists the possibility to rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. Besides, our programming framework empowers developers with the possibility to design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch. PMID:24475057
NASA Astrophysics Data System (ADS)
Spaulding, M. L.
2015-12-01
The vision for STORMTOOLS is to provide access to a suite of coastal planning tools (numerical models, etc.), available as a web service, that allows widespread accessibility and applicability at high resolution for user-selected coastal areas of interest. The first products developed under this framework were flood inundation maps, with and without sea level rise, for varying return periods for RI coastal waters. The flood mapping methodology is based on using the water level versus return period relationship at a primary NOAA water level gauging station and then spatially scaling the values, based on the predictions of high resolution storm and wave simulations performed by the Army Corps of Engineers' North Atlantic Comprehensive Coastal Study (NACCS) for tropical and extratropical storms on an unstructured grid, to estimate inundation levels for varying return periods. The scaling for the RI application used Newport, RI water levels as the reference point. Predictions are provided for once in 25, 50, and 100 yr return periods (at the upper 95% confidence level), with sea level rises of 1, 2, 3, and 5 ft. Simulations have also been performed for historical hurricane events including 1938, Carol (1954), Bob (1991), and Sandy (2012) and nuisance flooding events with return periods of 1, 3, 5, and 10 yr. Access to the flooding maps is via a web-based map viewer that seamlessly covers all coastal waters of the state at one meter resolution. The GIS structure of the map viewer allows overlays of additional relevant data sets (roads and highways, wastewater treatment facilities, schools, hospitals, emergency evacuation routes, etc.) as desired by the user. The simplified flooding maps are publicly available and are now being implemented for state and community resilience planning and vulnerability assessment activities in response to climate change impacts.
7 CFR 1776.8 - Methods for submitting applications.
Code of Federal Regulations, 2010 CFR
2010-01-01
... applications may be filed through Grants.gov, the official Federal Government Web site at http://www.grants.gov. The applicant must be registered with Grants.gov before they can submit a grant application. The applicant should refer to instructions found on the Grants.gov Web site for procedures for registering and...
7 CFR 1776.8 - Methods for submitting applications.
Code of Federal Regulations, 2012 CFR
2012-01-01
... applications may be filed through Grants.gov, the official Federal Government Web site at http://www.grants.gov. The applicant must be registered with Grants.gov before they can submit a grant application. The applicant should refer to instructions found on the Grants.gov Web site for procedures for registering and...
7 CFR 1776.8 - Methods for submitting applications.
Code of Federal Regulations, 2013 CFR
2013-01-01
... applications may be filed through Grants.gov, the official Federal Government Web site at http://www.grants.gov. The applicant must be registered with Grants.gov before they can submit a grant application. The applicant should refer to instructions found on the Grants.gov Web site for procedures for registering and...
7 CFR 1776.8 - Methods for submitting applications.
Code of Federal Regulations, 2014 CFR
2014-01-01
... applications may be filed through Grants.gov, the official Federal Government Web site at http://www.grants.gov. The applicant must be registered with Grants.gov before they can submit a grant application. The applicant should refer to instructions found on the Grants.gov Web site for procedures for registering and...
7 CFR 1776.8 - Methods for submitting applications.
Code of Federal Regulations, 2011 CFR
2011-01-01
... applications may be filed through Grants.gov, the official Federal Government Web site at http://www.grants.gov. The applicant must be registered with Grants.gov before they can submit a grant application. The applicant should refer to instructions found on the Grants.gov Web site for procedures for registering and...
Free Chlorine and Cyanuric Acid Simulator Application ...
A web-based application designed to simulate the free chlorine in systems adding free chlorine and cyanuric acid, including the application of Dichlor and Trichlor.
Water Sustainability Assessment for Ten Army Installations
2011-03-26
World Wide Web (WWW) at the following public URL: http://www.cecer.Army.mil ... National water...portions of water include paper and pulp, commercial laundries, and schools. The Residential Program is meant to achieve 39 percent reduction in use in...using recycled and reclaimed water for cooling and other processes. Other industries that could lower water use by large percentages include paper
Using Semantic Templates to Study Vulnerabilities Recorded in Large Software Repositories
ERIC Educational Resources Information Center
Wu, Yan
2011-01-01
Software vulnerabilities allow an attacker to reduce a system's Confidentiality, Availability, and Integrity by exposing information, executing malicious code, and undermine system functionalities that contribute to the overall system purpose and need. With new vulnerabilities discovered everyday in a variety of applications and user environments,…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-07
... Rehabilitation Research--Disability and Rehabilitation Research Projects--Inclusive Cloud and Web Computing... Rehabilitation Research Projects (DRRPs)--Inclusive Cloud and Web Computing Notice inviting applications for new...#DRRP . Priorities: Priority 1--DRRP on Inclusive Cloud and Web Computing-- is from the notice of final...
U.S. EPA National Stormwater Calculator Mobile Web Application
This presentation gives a brief overview of the new mobile web application version of EPA's National Stormwater Calculator. It is meant to give an overview of the development of the mobile web app and to demonstrate potential uses of the new version of the National Stormwater Cal...
Web 2.0 and Nigerian Academic Librarians
ERIC Educational Resources Information Center
Adekunmisi, Sowemimo Ronke; Odunewu, Abiodun Olusegun
2016-01-01
Web 2.0 applications to library services are aimed at enhancing the provision of relevant and cost-effective information resources for quality education and research. Despite the richness of these web applications and their enormous impact on library and information services as recorded in the developed world, Nigerian academic libraries are yet…
Ajax and Firefox: New Web Applications and Browsers
ERIC Educational Resources Information Center
Godwin-Jones, Bob
2005-01-01
Alternative browsers are gaining significant market share, and both Apple and Microsoft are releasing OS upgrades which portend some interesting changes in Web development. Of particular interest for language learning professionals may be new developments in the area of Web browser based applications, particularly using an approach dubbed "Ajax."…
Computational Intelligence in Web-Based Education: A Tutorial
ERIC Educational Resources Information Center
Vasilakos, Thanos; Devedzic, Vladan; Kinshuk; Pedrycz, Witold
2004-01-01
This article discusses some important aspects of Web Intelligence (WI) in the context of educational applications. Some of the key components of WI have already attracted developers of web-based educational systems for quite some time- ontologies, adaptivity and personalization, and agents. The paper focuses on the application of Computational…
SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications
Kalinin, Alexandr A.; Palanimalai, Selvam; Dinov, Ivo D.
2018-01-01
The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis. PMID:29630069
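One way to read the platform's emphasis on multi-level modularity is that every visualization or analysis component conforms to a small module contract so it can be registered, wired to others, and replaced independently. The interfaces below are only a sketch of that idea with invented names; they are not SOCRAT's actual API.

```typescript
// Sketch of a module contract for a modular visual analytics shell.
// Names are illustrative, not SOCRAT's actual interfaces.
interface VAModule {
  id: string;
  init(container: HTMLElement): void;                // mount this module's UI
  onData(dataset: Record<string, unknown>[]): void;  // react to shared data updates
  dispose(): void;                                   // release resources on removal
}

class ModuleRegistry {
  private modules = new Map<string, VAModule>();

  register(m: VAModule): void {
    this.modules.set(m.id, m);
  }

  // Broadcast a dataset so visualization and analysis modules stay in sync.
  broadcast(dataset: Record<string, unknown>[]): void {
    this.modules.forEach(m => m.onData(dataset));
  }
}
```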
eWaterCycle visualisation. combining the strength of NetCDF and Web Map Service: ncWMS
NASA Astrophysics Data System (ADS)
Hut, R.; van Meersbergen, M.; Drost, N.; Van De Giesen, N.
2016-12-01
As a result of the eWatercycle global hydrological forecast we have created Cesium-ncWMS, a web application based on ncWMS and Cesium. ncWMS is a server side application capable of reading any NetCDF file written using the Climate and Forecasting (CF) conventions, and making the data available as a Web Map Service(WMS). ncWMS automatically determines available variables in a file, and creates maps colored according to map data and a user selected color scale. Cesium is a Javascript 3D virtual Globe library. It uses WebGL for rendering, which makes it very fast, and it is capable of displaying a wide variety of data types such as vectors, 3D models, and 2D maps. The forecast results are automatically uploaded to our web server running ncWMS. In turn, the web application can be used to change the settings for color maps and displayed data. The server uses the settings provided by the web application, together with the data in NetCDF to provide WMS image tiles, time series data and legend graphics to the Cesium-NcWMS web application. The user can simultaneously zoom in to the very high resolution forecast results anywhere on the world, and get time series data for any point on the globe. The Cesium-ncWMS visualisation combines a global overview with local relevant information in any browser. See the visualisation live at forecast.ewatercycle.org
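In CesiumJS, a WMS layer such as one served by ncWMS is attached to the globe through an imagery provider. The sketch below uses Cesium's standard WebMapServiceImageryProvider; the server URL, layer name, and style are placeholders rather than the eWaterCycle deployment's actual configuration.

```typescript
// Attach a WMS layer (e.g. served by ncWMS) to a Cesium globe.
// The URL, layer name, and style are placeholders, not the eWaterCycle setup.
import * as Cesium from 'cesium';

const viewer = new Cesium.Viewer('cesiumContainer');

const provider = new Cesium.WebMapServiceImageryProvider({
  url: 'https://example.org/ncWMS/wms',
  layers: 'forecast/discharge',
  parameters: {
    transparent: true,
    format: 'image/png',
    styles: 'boxfill/rainbow', // ncWMS-style palette name; illustrative
  },
});

viewer.imageryLayers.addImageryProvider(provider);
```

The same page can then request legend graphics and point time series from the WMS server, which is how the web application combines a global map view with local detail.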
Informality and employment vulnerability: application in sellers with subsistence work.
Garzón-Duque, María Osley; Cardona-Arango, María Doris; Rodríguez-Ospina, Fabio León; Segura-Cardona, Angela María
2017-10-05
To describe the origin, evolution, and application of the concept of employment vulnerability in workers who subsist on street sales. We have carried out an analysis of the literature in database in Spanish, Portuguese, and English, without restriction by country. This is a review of the gray literature of government reports, articles, and documents from Latin America and the Caribbean. We have analyzed information on the informal economy, social-employment vulnerability, and subsistence workers. The concept of informal economy is dispersed and suggested as synonymous with employment vulnerability. As a polysemic term, it generates confusion and difficulty in identifying defined profiles of employment vulnerability in informal subsistence workers, who sell their products on the streets and sidewalks of cities. The lack of a clear concept and profile of employment vulnerability for this type of workers generates a restriction on defined actions to reduce employment vulnerability. The profiles could facilitate access to the acquisition of assets that support their structure of opportunities, facilitating and mediating in the passage from vulnerability to social mobility with opportunities. We propose as a concept of employment vulnerability for subsistence workers in the informal sector, the condition of those who must work by day to eat at night, who have little or no ownership of assets, and who have a minimum structure of opportunities to prevent, face, and resist the critical situations that occur daily, putting at risk their subsistence and that of the persons who are their responsibility, thus making the connection between social and employment vulnerability.
Capturing Trust in Social Web Applications
NASA Astrophysics Data System (ADS)
O'Donovan, John
The Social Web constitutes a shift in information flow from the traditional Web. Previously, content was provided by the owners of a website, for consumption by the end-user. Nowadays, these websites are being replaced by Social Web applications which are frameworks for the publication of user-provided content. Traditionally, Web content could be `trusted' to some extent based on the site it originated from. Algorithms such as Google's PageRank were (and still are) used to compute the importance of a website, based on analysis of underlying link topology. In the Social Web, analysis of link topology merely tells us about the importance of the information framework which hosts the content. Consumers of information still need to know about the importance/reliability of the content they are reading, and therefore about the reliability of the producers of that content. Research into trust and reputation of the producers of information in the Social Web is still very much in its infancy. Every day, people are forced to make trusting decisions about strangers on the Web based on a very limited amount of information. For example, purchasing a product from an eBay seller with a `reputation' of 99%, downloading a file from a peer-to-peer application such as Bit-Torrent, or allowing Amazon.com tell you what products you will like. Even something as simple as reading comments on a Web-blog requires the consumer to make a trusting decision about the quality of that information. In all of these example cases, and indeed throughout the Social Web, there is a pressing demand for increased information upon which we can make trusting decisions. This chapter examines the diversity of sources from which trust information can be harnessed within Social Web applications and discusses a high level classification of those sources. Three different techniques for harnessing and using trust from a range of sources are presented. These techniques are deployed in two sample Social Web applications—a recommender system and an online auction. In all cases, it is shown that harnessing an increased amount of information upon which to make trust decisions greatly enhances the user experience with the Social Web application.
Berrouiguet, Sofian; Barrigón, Maria Luisa; Brandt, Sara A.; Ovejero-García, Santiago; Álvarez-García, Raquel; Carballo, Juan Jose; Lenca, Philippe; Courtet, Philippe; Baca-García, Enrique
2016-01-01
Purpose The emergence of electronic prescribing devices with clinical decision support systems (CDSS) can significantly improve the management of pharmacological treatments. We developed a web application, available on smartphones, to help clinicians monitor prescriptions and, in further steps, to propose CDSS. Method A web application (www.MEmind.net) was developed to assess patients and collect data regarding gender, age, diagnosis and treatment. We analyzed antipsychotic prescriptions in 4345 patients seen in five Psychiatric Community Mental Health Centers from June 2014 to October 2014. The web application reported the average daily dose prescribed for antipsychotics, the prescribed daily dose (PDD), and the PDD to defined daily dose (DDD) ratio. Results The MEmind web application reported that antipsychotics were used in 1116 patients out of the total sample, mostly in 486 (44%) patients with schizophrenia-related disorders but also in other diagnoses. Second-generation antipsychotics (quetiapine, aripiprazole and long-acting paliperidone) were preferentially employed. Low doses were used more frequently than high doses. Long-acting paliperidone and ziprasidone, however, were the only two antipsychotics used at excessive dosing. Antipsychotic polypharmacy was used in 287 (26%) patients, with classic depot drugs, clotiapine, amisulpride and clozapine. Conclusions In this study we describe the first step in the development of a web application that makes polypharmacy, high-dose usage and off-label usage of antipsychotics visible to clinicians. Ongoing development of the MEmind web application may help to improve prescription safety via real-time feedback on prescriptions and clinical decision support. PMID:27764107
QuickEval: a web application for psychometric scaling experiments
NASA Astrophysics Data System (ADS)
Van Ngo, Khai; Storvik, Jehans J.; Dokkeberg, Christopher A.; Farup, Ivar; Pedersen, Marius
2015-01-01
QuickEval is a web application for carrying out psychometric scaling experiments. It offers the possibility of running controlled experiments in a laboratory, or large-scale experiments over the web with participants from all over the world. To the best of our knowledge, it is the first software of its kind in the image quality field to support the three most common scaling methods: paired comparison, rank order, and category judgement. Hopefully, a side effect of this newly created software is that it will lower the threshold to perform psychometric experiments, improve the quality of the experiments being carried out, make it easier to reproduce experiments, and increase research on image quality both in academia and industry. The web application is available at www.colourlab.no/quickeval.
Project Assessment Skills Web Application
NASA Technical Reports Server (NTRS)
Goff, Samuel J.
2013-01-01
The purpose of this project is to utilize Ruby on Rails to create a web application that will replace a spreadsheet used to keep track of training courses and tasks. The goal is to create a fast, easy-to-use web application that allows users to track their progress on training courses. This application will allow users to update and keep track of all of the training required of them. The training courses will be organized by group and by user, making them easier to read. This will also allow group leads and administrators to get a sense of how everyone is progressing in training. Currently, updating and finding information from this spreadsheet is a long and tedious task. By upgrading to a web application, finding and updating information, as well as adding new training courses and tasks, will be easier than ever. Accessing this data will also be much easier: users just have to go to a website and log in with NDC credentials rather than request the relevant spreadsheet from its holder. In addition to Ruby on Rails, I will be using JavaScript, CSS, and jQuery to help add functionality and ease of use to the web application. The web application will include a number of features that help update and track progress on training. For example, one feature will track the progress of a whole group of users, making it possible to see how the group as a whole is progressing. Another feature will assign tasks to either a user or a group of users. All of these together will create a user-friendly and functional web application.
Dominkovics, Pau; Granell, Carlos; Pérez-Navarro, Antoni; Casals, Martí; Orcau, Angels; Caylà, Joan A
2011-11-29
Background Health professionals and authorities strive to cope with heterogeneous data, services, and statistical models to support decision making on public health. Sophisticated analysis and distributed processing capabilities over geocoded epidemiological data are seen as driving factors to speed up control and decision making in these health risk situations. In this context, recent Web technologies and standards-based web services deployed on geospatial information infrastructures have rapidly become an efficient way to access, share, process, and visualize geocoded health-related information. Methods Data used in this study are based on Tuberculosis (TB) cases registered in Barcelona city during 2009. Residential addresses are geocoded and loaded into a spatial database that acts as a backend database. The web-based application architecture and geoprocessing web services are designed according to the Representational State Transfer (REST) principles. These web processing services produce spatial density maps against the backend database. Results The results are focused on the use of the proposed web-based application for the analysis of TB cases in Barcelona. The application produces spatial density maps to ease the monitoring and decision making process by health professionals. We also include a discussion of how spatial density maps may be useful for health practitioners in such contexts. Conclusions In this paper, we developed a web-based client application and a set of geoprocessing web services to support specific health-spatial requirements. Spatial density maps of TB incidence were generated to help health professionals in analysis and decision-making tasks. The combined use of geographic information tools, map viewers, and geoprocessing services leads to interesting possibilities in handling health data in a spatial manner. In particular, the use of spatial density maps has been effective in identifying the most affected areas and their spatial impact. This study is an attempt to demonstrate how web processing services together with web-based mapping capabilities suit the needs of health practitioners in epidemiological analysis scenarios. PMID:22126392
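A client of such REST geoprocessing services might request a density surface roughly as in the sketch below; the endpoint, parameters, and response shape are hypothetical and only illustrate the style of interaction described in the abstract.

```typescript
// Hypothetical client call to a RESTful geoprocessing service that returns a
// kernel-density surface for geocoded TB cases. The endpoint, parameter names,
// and response shape are illustrative only.
interface DensityCell { lon: number; lat: number; density: number }

async function fetchDensityMap(year: number, bandwidthMeters: number): Promise<DensityCell[]> {
  const url = new URL("https://example.org/geoprocessing/tb/density");
  url.searchParams.set("year", String(year));
  url.searchParams.set("bandwidth", String(bandwidthMeters));
  const response = await fetch(url, { headers: { Accept: "application/json" } });
  if (!response.ok) throw new Error(`Geoprocessing service error: ${response.status}`);
  return (await response.json()) as DensityCell[];
}

// Usage: fetchDensityMap(2009, 500).then((cells) => console.log(cells.length));
```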
Hopkins, Debbie
2015-03-01
Conceptualisations of 'vulnerability' vary amongst scholarly communities, contributing to a wide variety of applications. Research investigating vulnerability to climate change has often excluded non-climatic changes which may contribute to degrees of vulnerability perceived or experienced. This paper introduces a comprehensive contextual vulnerability framework which incorporates physical, social, economic and political factors which could amplify or reduce vulnerability. The framework is applied to New Zealand's tourism industry to explore its value in interpreting a complex, human-natural environment system with multiple competing vulnerabilities. The comprehensive contextual framework can inform government policy and industry decision making, integrating understandings of climate change within the broader context of internal and external social, physical, economic, and institutional stressors.
2013-01-01
Background As a result of changes in climatic conditions and greater resistance to insecticides, many regions across the globe, including Colombia, have been facing a resurgence of vector-borne diseases, and dengue fever in particular. Timely information on both (1) the spatial distribution of the disease, and (2) prevailing vulnerabilities of the population are needed to adequately plan targeted preventive intervention. We propose a methodology for the spatial assessment of current socioeconomic vulnerabilities to dengue fever in Cali, a tropical urban environment of Colombia. Methods Based on a set of socioeconomic and demographic indicators derived from census data and ancillary geospatial datasets, we develop a spatial approach for both expert-based and purely statistical-based modeling of current vulnerability levels across 340 neighborhoods of the city using a Geographic Information System (GIS). The results of both approaches are comparatively evaluated by means of spatial statistics. A web-based approach is proposed to facilitate the visualization and the dissemination of the output vulnerability index to the community. Results The statistical and the expert-based modeling approach exhibit a high concordance, globally, and spatially. The expert-based approach indicates a slightly higher vulnerability mean (0.53) and vulnerability median (0.56) across all neighborhoods, compared to the purely statistical approach (mean = 0.48; median = 0.49). Both approaches reveal that high values of vulnerability tend to cluster in the eastern, north-eastern, and western part of the city. These are poor neighborhoods with high percentages of young (i.e., < 15 years) and illiterate residents, as well as a high proportion of individuals being either unemployed or doing housework. Conclusions Both modeling approaches reveal similar outputs, indicating that in the absence of local expertise, statistical approaches could be used, with caution. By decomposing identified vulnerability “hotspots” into their underlying factors, our approach provides valuable information on both (1) the location of neighborhoods, and (2) vulnerability factors that should be given priority in the context of targeted intervention strategies. The results support decision makers to allocate resources in a manner that may reduce existing susceptibilities and strengthen resilience, and thus help to reduce the burden of vector-borne diseases. PMID:23945265
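The expert-based approach described above amounts to combining normalized indicators with weights into a composite index. The sketch below shows that arithmetic with hypothetical indicators and weights; it is not the study's actual model.

```typescript
// Illustrative computation of a composite vulnerability index per neighbourhood
// as a weighted sum of min-max normalized indicators. The indicator names and
// expert weights are hypothetical, not the study's actual model.
type Indicators = Record<string, number>;

function minMaxNormalize(values: number[]): number[] {
  const min = Math.min(...values);
  const max = Math.max(...values);
  return values.map((v) => (max === min ? 0 : (v - min) / (max - min)));
}

function vulnerabilityIndex(
  neighbourhoods: Indicators[],
  weights: Indicators, // expert weights, assumed to sum to 1
): number[] {
  const keys = Object.keys(weights);
  // Normalize each indicator across all neighbourhoods, then weight and sum.
  const normalized = keys.map((k) => minMaxNormalize(neighbourhoods.map((n) => n[k])));
  return neighbourhoods.map((_, i) =>
    keys.reduce((sum, k, j) => sum + weights[k] * normalized[j][i], 0),
  );
}

// Example with three toy neighbourhoods:
const index = vulnerabilityIndex(
  [
    { pctUnder15: 30, pctIlliterate: 12, pctUnemployed: 18 },
    { pctUnder15: 18, pctIlliterate: 4, pctUnemployed: 9 },
    { pctUnder15: 25, pctIlliterate: 9, pctUnemployed: 20 },
  ],
  { pctUnder15: 0.3, pctIlliterate: 0.3, pctUnemployed: 0.4 },
);
console.log(index.map((v) => v.toFixed(2))); // values in [0, 1]
```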
Vulnerability and cosusceptibility determine the size of network cascades
Yang, Yang; Nishikawa, Takashi; Motter, Adilson E.
2017-01-27
In a network, a local disturbance can propagate and eventually cause a substantial part of the system to fail in cascade events that are easy to conceptualize but extraordinarily difficult to predict. Here, we develop a statistical framework that can predict cascade size distributions by incorporating two ingredients only: the vulnerability of individual components and the cosusceptibility of groups of components (i.e., their tendency to fail together). Using cascades in power grids as a representative example, we show that correlations between component failures define structured and often surprisingly large groups of cosusceptible components. Aside from their implications for blackout studies, these results provide insights and a new modeling framework for understanding cascades in financial systems, food webs, and complex networks in general.
NGL Viewer: a web application for molecular visualization.
Rose, Alexander S; Hildebrand, Peter W
2015-07-01
The NGL Viewer (http://proteinformatics.charite.de/ngl) is a web application for the visualization of macromolecular structures. By fully adopting capabilities of modern web browsers, such as WebGL, for molecular graphics, the viewer can interactively display large molecular complexes and is also unaffected by the retirement of third-party plug-ins like Flash and Java Applets. Generally, the web application offers comprehensive molecular visualization through a graphical user interface so that life scientists can easily access and profit from available structural data. It supports common structural file-formats (e.g. PDB, mmCIF) and a variety of molecular representations (e.g. 'cartoon, spacefill, licorice'). Moreover, the viewer can be embedded in other web sites to provide specialized visualizations of entries in structural databases or results of structure-related calculations. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
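Embedding the viewer in another web site, as mentioned above, typically takes only a few lines. The sketch below assumes the "ngl" npm package, a container element with id "viewport", and an arbitrary PDB entry; the calls follow the public NGL API as documented, but should be checked against the current release.

```typescript
// Minimal embedding sketch for the NGL Viewer (assumes the "ngl" package and a
// <div id="viewport"> element in the page). The PDB entry is just an example.
import { Stage } from "ngl";

const stage = new Stage("viewport", { backgroundColor: "white" });

stage.loadFile("rcsb://4hhb").then((component: any) => {
  component.addRepresentation("cartoon", { colorScheme: "chainname" });
  component.autoView();
});
```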
AMP: a science-driven web-based application for the TeraGrid
NASA Astrophysics Data System (ADS)
Woitaszek, M.; Metcalfe, T.; Shorrock, I.
The Asteroseismic Modeling Portal (AMP) provides a web-based interface for astronomers to run and view simulations that derive the properties of Sun-like stars from observations of their pulsation frequencies. In this paper, we describe the architecture and implementation of AMP, highlighting the lightweight design principles and tools used to produce a functional fully-custom web-based science application in less than a year. Targeted as a TeraGrid science gateway, AMP's architecture and implementation are intended to simplify its orchestration of TeraGrid computational resources. AMP's web-based interface was developed as a traditional standalone database-backed web application using the Python-based Django web development framework, allowing us to leverage the Django framework's capabilities while cleanly separating the user interface development from the grid interface development. We have found this combination of tools flexible and effective for rapid gateway development and deployment.
Using Open Web APIs in Teaching Web Mining
ERIC Educational Resources Information Center
Chen, Hsinchun; Li, Xin; Chau, M.; Ho, Yi-Jen; Tseng, Chunju
2009-01-01
With the advent of the World Wide Web, many business applications that utilize data mining and text mining techniques to extract useful business information on the Web have evolved from Web searching to Web mining. It is important for students to acquire knowledge and hands-on experience in Web mining during their education in information systems…
Concept Mapping Your Web Searches: A Design Rationale and Web-Enabled Application
ERIC Educational Resources Information Center
Lee, Y.-J.
2004-01-01
Although it has become very common to use World Wide Web-based information in many educational settings, there has been little research on how to better search and organize Web-based information. This paper discusses the shortcomings of Web search engines and Web browsers as learning environments and describes an alternative Web search environment…
Therapeutic uses of the WebCam in child psychiatry.
Chlebowski, Susan; Fremont, Wanda
2011-01-01
The authors provide examples of the use of the WebCam as a therapeutic tool in child psychiatry, discussing cases that demonstrate its application; the WebCam is most often used in psychiatry training programs during resident supervision and for case presentations. Six cases illustrate the use of the WebCam in individual and family therapy. The WebCam, used during individual sessions, can facilitate the development of prosocial skills. Comparing individual WebCam video sessions can help to evaluate the effectiveness of medication and progress in therapy. The WebCam has proven to be useful in psycho-education, facilitating communication, and treating children and families. Applications of this technology may include cognitive-behavioral, dialectical-behavioral, and group therapy.
A Prototype Web-based system for GOES-R Space Weather Data
NASA Astrophysics Data System (ADS)
Sundaravel, A.; Wilkinson, D. C.
2010-12-01
The Geostationary Operational Environmental Satellite-R Series (GOES-R) makes use of advanced instruments and technologies to monitor the Earth's surface and provide accurate space weather data. The first GOES-R series satellite is scheduled to be launched in 2015. The data from the satellite will be widely used by scientists for space weather modeling and predictions. This project looks into how these datasets can be made available to scientists on the Web and how to assist them in their research. We are developing a prototype web-based system that allows users to browse, search and download these data. The GOES-R datasets will be archived in NetCDF (Network Common Data Form) and CSV (Comma Separated Values) format. NetCDF is a self-describing data format that contains both the metadata information and the data. The data is stored in an array-oriented fashion. The web-based system will offer services in two ways: via a web application (portal) and via web services. Using the web application, users can download data in NetCDF or CSV format and can also plot a graph of the data. The web page displays the various categories of data and the time intervals for which the data is available. The web application (client) sends the user query to the server, which then connects to the data sources to retrieve the data and delivers it to the users. Data access will also be provided via SOAP (Simple Object Access Protocol) and REST (Representational State Transfer) web services. These provide functions which can be used by other applications to fetch data and use the data for further processing. To build the prototype system, we are making use of proxy data from existing GOES and POES space weather datasets. Java is the programming language used in developing tools that format data to NetCDF and CSV. For the web technology we have chosen Grails to develop both the web application and the services. Grails is an open source web application framework based on the Groovy language. We are also making use of the THREDDS (Thematic Realtime Environmental Distributed Data Services) server to publish and access the NetCDF files. We have completed developing software tools to generate NetCDF and CSV data files, as well as tools to translate NetCDF to CSV. The current phase of the project involves designing and developing the web interface.
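A client of the planned REST services might retrieve and parse a CSV time series roughly as sketched below; the endpoint, dataset name, and query parameters are assumptions for illustration, not the prototype's actual interface.

```typescript
// Hypothetical REST request for a CSV time series from the prototype web
// service; the endpoint path and query parameters are illustrative only.
interface Sample { time: string; value: number }

async function fetchSpaceWeatherCsv(
  dataset: string,
  start: string,
  end: string,
): Promise<Sample[]> {
  const url = new URL("https://example.gov/goes-r/data.csv");
  url.searchParams.set("dataset", dataset); // e.g. "proton-flux" (assumed name)
  url.searchParams.set("start", start);
  url.searchParams.set("end", end);

  const text = await (await fetch(url)).text();
  // Expect a header row "time,value" followed by data rows.
  return text
    .trim()
    .split("\n")
    .slice(1)
    .map((line) => {
      const [time, value] = line.split(",");
      return { time, value: Number(value) };
    });
}

// fetchSpaceWeatherCsv("proton-flux", "2010-01-01", "2010-01-02").then(console.log);
```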
A Sample WebQuest Applicable in Teaching Topological Concepts
ERIC Educational Resources Information Center
Yildiz, Sevda Goktepe; Korpeoglu, Seda Goktepe
2016-01-01
In recent years, WebQuests have received a great deal of attention and have been used effectively in teaching-learning process in various courses. In this study, a WebQuest that can be applicable in teaching topological concepts for undergraduate level students was prepared. A number of topological concepts, such as countability, infinity, and…
Knowledge Base for Automatic Generation of Online IMS LD Compliant Course Structures
ERIC Educational Resources Information Center
Pacurar, Ecaterina Giacomini; Trigano, Philippe; Alupoaie, Sorin
2006-01-01
Our article presents a pedagogical scenarios-based web application that allows the automatic generation and development of pedagogical websites. These pedagogical scenarios are represented in the IMS Learning Design standard. Our application is a web portal helping teachers to dynamically generate web course structures, to edit pedagogical content…
DOT National Transportation Integrated Search
2010-03-01
Web 2.0 is an umbrella term for websites or online applications that are user-driven and emphasize collaboration and user interactivity. The trend away from static web pages to a more user-driven Internet model has also occurred in the public s...
Experience on Mashup Development with End User Programming Environment
ERIC Educational Resources Information Center
Yue, Kwok-Bun
2010-01-01
Mashups, Web applications integrating data and functionality from other Web sources to provide a new service, have quickly become ubiquitous. Because of their role as a focal point in three important trends (Web 2.0, situational software applications, and end user development), mashups are a crucial emerging technology for information systems…
Application of Mobile Agents in Web-Based Learning Environment.
ERIC Educational Resources Information Center
Hong Hong, Kinshuk; He, Xiaoqin; Patel, Ashok; Jesshope, Chris
Web-based learning environments are strongly driven by the information revolution and the Internet, but they have a number of common deficiencies, such as slow access, no adaptivity to the individual student, limitation by bandwidth, and more. This paper outlines the benefits of mobile agents technology, and describes its application in Web-based…
[Brazilian psychosocial and operational research vis-à-vis the UNGASS targets].
Bastos, Francisco Inácio; Hacker, Mariana A
2006-04-01
Items from the UNGASS Draft Declaration of Commitment on HIV/AIDS (2001) are analyzed. The Brazilian experience of new methods for testing and counseling among vulnerable populations, preventive methods controlled by women, prevention, psychosocial support for people living with HIV/AIDS, and mother-child transmission, is discussed. These items were put into operation in the form of keywords, in systematic searches within the standard biomedicine databases, also including the subdivisions of the Web of Science relating to natural and social sciences. The Brazilian experience relating to testing and counseling strategies has been consolidated through the utilization of algorithms aimed at estimating incidence rates and identifying recently infected individuals, testing and counseling for pregnant women, and application of quick tests. The introduction of alternative methods and new technologies for collecting data from vulnerable populations has been allowing speedy monitoring of the epidemic. Psychosocial support assessments for people living with HIV/AIDS have gained impetus in Brazil, probably as a result of increased survival and quality of life among these individuals. Substantial advances in controlling mother-child transmission have been observed. This is one of the most important victories within the field of HIV/AIDS in Brazil, but deficiencies in prenatal care still constitute a challenge. With regard to prevention methods for women, Brazil has only shown a halting response. Widespread implementation of new technologies for data gathering and management depends on investments in infrastructure and professional skills acquisition.
Colin M. Beier; Trista M. Patterson; F. Stuart Chapin III
2008-01-01
Managed ecosystems experience vulnerabilities when ecological resilience declines and key flows of ecosystem services become depleted or lost. Drivers of vulnerability often include local management actions in conjunction with other external, larger scale factors. To translate these concepts to management applications, we developed a conceptual model of feedbacks...
The GB/3D Type Fossils Online Web Portal
NASA Astrophysics Data System (ADS)
McCormick, T.; Howe, M. P.
2013-12-01
Fossils are the remains of once-living organisms that existed and played out their lives in 3-dimensional environments. The information content provided by a 3d representation of a fossil is much greater than that provided by a traditional photograph, and can grab the attention and imagination of the younger and older general public alike. The British Geological Survey has been leading a consortium of UK natural history museums including the Oxford University Museum of Natural History, the Sedgwick Museum Cambridge, the National Museum of Wales Cardiff, and a number of smaller regional British museums to construct a web portal giving access to metadata, high resolution images and interactive 3d models of type fossils from the UK. The web portal at www.3d-fossils.ac.uk was officially launched in August 2013. It can be used to discover metadata describing the provenance, taxonomy, and stratigraphy of the specimens. Zoom-able high resolution digital photographs are available, including for many specimens 'anaglyph' stereo images that can be viewed in 3d using red-cyan stereo spectacles. For many of the specimens interactive 3d models were generated by scanning with portable 'NextEngine 3D HD' 3d scanners. These models can be downloaded in zipped .OBJ and .PLY format from the web portal, or may be viewed and manipulated directly in certain web browsers. The images and scans may be freely downloaded subject to a Creative Commons Attribution ShareAlike Non-Commercial license. There is a simple application programming interface (API) allowing metadata to be downloaded, with links to the images and models, in a standardised format for use in data mash-ups and third party applications. The web portal also hosts 'open educational resources' explaining the process of fossilization and the importance of type specimens in taxonomy, as well as providing introductions to the most important fossil groups. We have experimented with using a 3d printer to create replicas of the fossils which can be used in education and public outreach. The audience for the web portal includes both professional paleontologists and the general public. The professional paleontologist can use the portal to discover the whereabouts of the type material for a taxon they are studying, and can use the pictures and 3d models to assess the completeness and preservation quality of the material. This may reduce or negate the need to send specimens (which are often fragile and always irreplaceable) to researchers through the post, or for researchers to make possibly long, expensive and environmentally damaging journeys to visit far-off collections. We hope that the pictures and 3d models will help to stimulate public interest in paleontology and natural history. The ability to digitally image and scan specimens in 3d enables institutions to have an archive record in case specimens are lost or destroyed by accident or warfare. Recent events in Cairo and Baghdad remind us that museum collections are vulnerable to civil and military strife.
DARKDROID: Exposing the Dark Side of Android Marketplaces
2016-06-01
Moreover, our approaches can detect apps containing both intentional and unintentional vulnerabilities, such as unsafe code loading mechanisms ... Subject terms: Security, Static Analysis, Dynamic Analysis, Malware Detection, Vulnerability Scanning. ... applications in a DoD context ... develop sophisticated whole-system static analyses to detect malicious Android applications.
Open-Source web-based geographical information system for health exposure assessment
2012-01-01
This paper presents the design and development of an open source web-based Geographical Information System allowing users to visualise, customise and interact with spatial data within their web browser. The developed application shows that by using solely Open Source software it was possible to develop a customisable web based GIS application that provides functions necessary to convey health and environmental data to experts and non-experts alike without the requirement of proprietary software. PMID:22233606
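As an illustration of what a purely open-source web-GIS front end can look like, the sketch below uses Leaflet and OpenStreetMap tiles with a couple of made-up exposure measurements; the paper does not name its specific libraries, so this is only indicative of the approach.

```typescript
// Illustrative open-source web-GIS front end using Leaflet (the paper does not
// name its stack; this only shows what is possible with open-source components).
import * as L from "leaflet";

const map = L.map("map").setView([51.5, -0.1], 11); // <div id="map"> assumed

// OpenStreetMap base layer.
L.tileLayer("https://tile.openstreetmap.org/{z}/{x}/{y}.png", {
  attribution: "&copy; OpenStreetMap contributors",
}).addTo(map);

// Hypothetical exposure measurements rendered as graduated circles.
const samples = [
  { lat: 51.52, lon: -0.08, no2: 48 },
  { lat: 51.49, lon: -0.12, no2: 31 },
];
for (const s of samples) {
  L.circleMarker([s.lat, s.lon], { radius: 4 + s.no2 / 10 })
    .bindPopup(`NO2: ${s.no2} µg/m³`)
    .addTo(map);
}
```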
Kolt, Gregory S; Rosenkranz, Richard R; Savage, Trevor N; Maeder, Anthony J; Vandelanotte, Corneel; Duncan, Mitch J; Caperchione, Cristina M; Tague, Rhys; Hooker, Cindy; Mummery, W Kerry
2013-05-03
Physical inactivity is one of the leading modifiable causes of death and disease in Australia. National surveys indicate less than half of the Australian adult population are sufficiently active to obtain health benefits. The Internet is a potentially important medium for successfully communicating health messages to the general population and enabling individual behaviour change. Internet-based interventions have proven efficacy; however, intervention studies describing website usage objectively have reported a strong decline in usage, and high attrition rate, over the course of the interventions. Web 2.0 applications give users control over web content generated and present innovative possibilities to improve user engagement. There is, however, a need to assess the effectiveness of these applications in the general population. The Walk 2.0 project is a 3-arm randomised controlled trial investigating the effects of "next generation" web-based applications on engagement, retention, and subsequent physical activity behaviour change. 504 individuals will be recruited from two sites in Australia, randomly allocated to one of two web-based interventions (Web 1.0 or Web 2.0) or a control group, and provided with a pedometer to monitor physical activity. The Web 1.0 intervention will provide participants with access to an existing physical activity website with limited interactivity. The Web 2.0 intervention will provide access to a website featuring Web 2.0 content, including social networking, blogs, and virtual walking groups. Control participants will receive a logbook to record their steps. All groups will receive similar educational material on setting goals and increasing physical activity. The primary outcomes are objectively measured physical activity and website engagement and retention. Other outcomes measured include quality of life, psychosocial correlates, and anthropometric measurements. Outcomes will be measured at baseline, 3, 12 and 18 months. The findings of this study will provide increased understanding of the benefit of new web-based technologies and applications in engaging and retaining participants on web-based intervention sites, with the aim of improved health behaviour change outcomes. Australian New Zealand Clinical Trials Registry, ACTRN12611000157976.
’Pushing a Big Rock Up a Steep Hill’: Acquisition Lessons Learned from DoD Applications Storefront
2014-04-30
The envisioned application store will deliver software from a central ..., providing automated delivery of software patches, web applications, widgets, and mobile application packages. ... mobile technologies, hoping to enhance warfighter situational awareness and access to information. Unfortunately, the Defense Acquisition System has not ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
The system is developed to collect, process, store and present the information provided by the radio frequency identification (RFID) devices. The system contains three parts, the application software, the database and the web page. The application software manages multiple RFID devices, such as readers and portals, simultaneously. It communicates with the devices through an application programming interface (API) provided by the device vendor. The application software converts data collected by the RFID readers and portals to readable information. It is capable of encrypting data using the 256-bit advanced encryption standard (AES). The application software has a graphical user interface (GUI). The GUI mimics the configurations of the nuclear material storage sites or transport vehicles. The GUI gives the user and system administrator an intuitive way to read the information and/or configure the devices. The application software is capable of sending the information to a remote, dedicated and secured web and database server. Two captured screen samples, one for storage and one for transport, are attached. The database is constructed to handle a large number of RFID tag readers and portals. A SQL server is employed for this purpose. An XML script is used to update the database once the information is sent from the application software. The design of the web page imitates the design of the application software. The web page retrieves data from the database and presents it in different panels. The user needs a user name combined with a password to access the web page. The web page is capable of sending e-mail and text messages based on preset criteria, such as when alarm thresholds are exceeded. A captured screen sample is attached. The application software is designed to be installed on a local computer. The local computer is directly connected to the RFID devices and can be controlled locally or remotely. There are multiple local computers managing different sites or transport vehicles. Control from remote sites and transmission of information to a central database server take place over the secured internet. The information stored in the central database server is shown on the web page. The users can view the web page on the internet. A dedicated and secured web and database server (https) is used to provide information security.
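The record states that data can be encrypted with 256-bit AES before transmission; a minimal sketch of that step is shown below using Node's crypto module. The GCM mode, key handling, and payload structure are assumptions for illustration, not details taken from the system.

```typescript
// Sketch of encrypting an RFID reading with 256-bit AES before it is sent to
// the central server. The record only says "256-bit AES"; the GCM mode, key
// handling, and payload shape here are assumptions for illustration.
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

const key = randomBytes(32); // in practice a provisioned, securely stored key

function encryptReading(reading: object): { iv: string; tag: string; data: string } {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(JSON.stringify(reading), "utf8"), cipher.final()]);
  return {
    iv: iv.toString("base64"),
    tag: cipher.getAuthTag().toString("base64"),
    data: data.toString("base64"),
  };
}

function decryptReading(payload: { iv: string; tag: string; data: string }): object {
  const decipher = createDecipheriv("aes-256-gcm", key, Buffer.from(payload.iv, "base64"));
  decipher.setAuthTag(Buffer.from(payload.tag, "base64"));
  const plain = Buffer.concat([decipher.update(Buffer.from(payload.data, "base64")), decipher.final()]);
  return JSON.parse(plain.toString("utf8"));
}

const packet = encryptReading({ tagId: "E200-1234", portal: "vault-door-1", time: Date.now() });
console.log(decryptReading(packet)); // round-trips the original reading
```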
Source Update Capture in Information Agents
NASA Technical Reports Server (NTRS)
Ashish, Naveen; Kulkarni, Deepak; Wang, Yao
2003-01-01
In this paper we present strategies for successfully capturing updates at Web sources. Web-based information agents provide integrated access to autonomous Web sources that can get updated. For many information agent applications we are interested in knowing when a Web source to which the application provides access, has been updated. We may also be interested in capturing all the updates at a Web source over a period of time i.e., detecting the updates and, for each update retrieving and storing the new version of data. Previous work on update and change detection by polling does not adequately address this problem. We present strategies for intelligently polling a Web source for efficiently capturing changes at the source.
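One common way to capture updates by polling is a conditional GET loop that re-fetches a source only when the server signals a change; the sketch below illustrates that generic pattern and is not the specific strategies proposed in the paper.

```typescript
// A simple conditional-GET poller: it re-fetches a source only when the server
// reports a change via ETag, and hands each new version to a callback. This is
// a generic illustration, not the paper's strategies.
async function pollForUpdates(
  url: string,
  intervalMs: number,
  onUpdate: (body: string) => void,
): Promise<void> {
  let etag: string | null = null;
  for (;;) {
    const headers: Record<string, string> = {};
    if (etag) headers["If-None-Match"] = etag;
    const response = await fetch(url, { headers });
    if (response.status !== 304) {       // 304 Not Modified => nothing changed
      etag = response.headers.get("ETag");
      onUpdate(await response.text());   // store/process the new version
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}

// pollForUpdates("https://example.org/source.html", 60_000, (v) => console.log(v.length));
```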
NASA Astrophysics Data System (ADS)
Auermuller, L. M.; Gatto, J.; Huch, C.
2015-12-01
The highly developed nature of New Jersey's coastline, barrier island and lagoon communities make them particularly vulnerable to storm surge, sea level rise and flooding. The impacts of Hurricane Sandy have enlightened coastal communities to these realities. Recognizing these vulnerabilities, the Jacques Cousteau National Research Reserve (JC NERR), Rutgers Center for Remote Sensing and Spatial Analysis (CRSSA), Rutgers Bloustein School and the Barnegat Bay Partnership (BBP) have developed web-based tools to assist NJ's coastal communities in visualizing and planning for future local impacts. NJFloodMapper and NJAdapt are two complementary interactive mapping websites that visualize different current and future flood hazards. These hazard layers can be combined with additional data including critical facilities, evacuation routes, socioeconomic and environmental data. Getting to Resilience is an online self-assessment tool developed to assist communities reduce vulnerability and increase preparedness by linking planning, mitigation, and adaptation. Through this interactive process communities will learn how their preparedness can yield valuable points through voluntary programs like FEMA's Community Rating System and Sustainable Jersey. The assessment process can also increase the community's understanding of where future vulnerabilities should be addressed through hazard mitigation planning. Since Superstorm Sandy, more than thirty communities in New Jersey have been provided technical assistance in assessing their risks and vulnerabilities to coastal hazards, and have begun to understand how to better plan and prepare for short and long-term changes along their shorelines.
Information-Flow-Based Access Control for Web Browsers
NASA Astrophysics Data System (ADS)
Yoshihama, Sachiko; Tateishi, Takaaki; Tabuchi, Naoshi; Matsumoto, Tsutomu
The emergence of Web 2.0 technologies such as Ajax and Mashup has revealed the weakness of the same-origin policy[1], the current de facto standard for the Web browser security model. We propose a new browser security model to allow fine-grained access control in the client-side Web applications for secure mashup and user-generated contents. We propose a browser security model that is based on information-flow-based access control (IBAC) to overcome the dynamic nature of the client-side Web applications and to accurately determine the privilege of scripts in the event-driven programming model.
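A toy version of information-flow tracking can be sketched as labels (sets of tainting origins) with a join operation and a flow check; the snippet below is a simplified illustration of the idea, not the IBAC model defined by the authors.

```typescript
// Toy illustration of information-flow labels for browser data: each value is
// tagged with the set of origins that influenced it, and a flow is allowed only
// if the destination is trusted for every origin in that set. This is a
// simplified sketch, not the IBAC model defined in the paper.
type Label = ReadonlySet<string>; // set of origins that tainted a value

const join = (a: Label, b: Label): Label => new Set([...a, ...b]);

function mayFlow(valueLabel: Label, destinationOrigin: string, trust: Map<string, Set<string>>): boolean {
  // The destination must be trusted to receive data from every tainting origin.
  return [...valueLabel].every(
    (origin) => trust.get(origin)?.has(destinationOrigin) ?? origin === destinationOrigin,
  );
}

// Example: a mashup combines data from two origins, then tries to send it out.
const trust = new Map([["https://maps.example", new Set(["https://mashup.example"])]]);
const combined = join(new Set(["https://maps.example"]), new Set(["https://mashup.example"]));
console.log(mayFlow(combined, "https://mashup.example", trust)); // true
console.log(mayFlow(combined, "https://evil.example", trust));   // false
```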
2017-06-01
... for GIFT Cloud, the web-based application version of the Generalized Intelligent Framework for Tutoring (GIFT). GIFT is a modular, open-source ... external applications. GIFT is available to users with a GIFT Account at no cost. GIFT Cloud is an implementation of GIFT. This web-based application ... GIFT Cloud is accessed via a web browser.
Workflow and web application for annotating NCBI BioProject transcriptome data
Vera Alvarez, Roberto; Medeiros Vidal, Newton; Garzón-Martínez, Gina A.; Barrero, Luz S.; Landsman, David
2017-01-01
Abstract The volume of transcriptome data is growing exponentially due to rapid improvement of experimental technologies. In response, large central resources such as those of the National Center for Biotechnology Information (NCBI) are continually adapting their computational infrastructure to accommodate this large influx of data. New and specialized databases, such as Transcriptome Shotgun Assembly Sequence Database (TSA) and Sequence Read Archive (SRA), have been created to aid the development and expansion of centralized repositories. Although the central resource databases are under continual development, they do not include automatic pipelines to increase annotation of newly deposited data. Therefore, third-party applications are required to achieve that aim. Here, we present an automatic workflow and web application for the annotation of transcriptome data. The workflow creates secondary data such as sequencing reads and BLAST alignments, which are available through the web application. They are based on freely available bioinformatics tools and scripts developed in-house. The interactive web application provides a search engine and several browser utilities. Graphical views of transcript alignments are available through SeqViewer, an embedded tool developed by NCBI for viewing biological sequence data. The web application is tightly integrated with other NCBI web applications and tools to extend the functionality of data processing and interconnectivity. We present a case study for the species Physalis peruviana with data generated from BioProject ID 67621. Database URL: http://www.ncbi.nlm.nih.gov/projects/physalis/ PMID:28605765
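Programmatic access to the central NCBI resources that the workflow builds on can be illustrated with an E-utilities query; the sketch below looks up BioProject identifiers by search term and is only an example of such access, not part of the described pipeline.

```typescript
// Illustrative lookup of BioProject records through NCBI E-utilities (esearch,
// JSON output). This is not the annotation workflow itself, only an example of
// programmatic access to the central NCBI resources it builds on.
async function findBioProjectIds(term: string): Promise<string[]> {
  const url = new URL("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi");
  url.searchParams.set("db", "bioproject");
  url.searchParams.set("term", term);
  url.searchParams.set("retmode", "json");
  const json = await (await fetch(url)).json();
  return json.esearchresult?.idlist ?? [];
}

// findBioProjectIds("Physalis peruviana[Organism]").then((ids) => console.log(ids));
```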
Informality and employment vulnerability: application in sellers with subsistence work
Garzón-Duque, María Osley; Cardona-Arango, María Doris; Rodríguez-Ospina, Fabio León; Segura-Cardona, Angela María
2017-01-01
ABSTRACT OBJECTIVE To describe the origin, evolution, and application of the concept of employment vulnerability in workers who subsist on street sales. METHODS We have carried out an analysis of the literature in database in Spanish, Portuguese, and English, without restriction by country. This is a review of the gray literature of government reports, articles, and documents from Latin America and the Caribbean. We have analyzed information on the informal economy, social-employment vulnerability, and subsistence workers. RESULTS AND CONCLUSIONS The concept of informal economy is dispersed and suggested as synonymous with employment vulnerability. As a polysemic term, it generates confusion and difficulty in identifying defined profiles of employment vulnerability in informal subsistence workers, who sell their products on the streets and sidewalks of cities. The lack of a clear concept and profile of employment vulnerability for this type of workers generates a restriction on defined actions to reduce employment vulnerability. The profiles could facilitate access to the acquisition of assets that support their structure of opportunities, facilitating and mediating in the passage from vulnerability to social mobility with opportunities. We propose as a concept of employment vulnerability for subsistence workers in the informal sector, the condition of those who must work by day to eat at night, who have little or no ownership of assets, and who have a minimum structure of opportunities to prevent, face, and resist the critical situations that occur daily, putting at risk their subsistence and that of the persons who are their responsibility, thus making the connection between social and employment vulnerability. PMID:29020122
Network Monitoring for Web-Based Threats
2011-02-01
... string to access files or folders that were not intended (see Figure 4-1): http://example.com/getUserProfile.jsp?item=../../../../etc/passwd ... applied to vulnerable fields within a cookie (see Figure 4-2): Cookie: USER=1826cc8f:PSTYLE=../../../../etc/passwd ... further privileges. For example: http://host/cgi-bin/lame.cgi?file=../../../../etc/passwd ... %00 requests: this is the hexadecimal value of a ...
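Following the request patterns quoted above, a monitoring rule can be approximated by a naive signature check for traversal and null-byte sequences; the sketch below is illustrative only, and real monitoring would decode and normalize requests before matching.

```typescript
// Naive signature check for the patterns discussed above (directory traversal
// and %00 null-byte injection) in a request URL or cookie value. Real network
// monitoring would also decode and normalize the input before matching.
const TRAVERSAL = /(\.\.[\/\\])+/;      // ../ or ..\ sequences
const ENCODED_TRAVERSAL = /%2e%2e%2f/i; // URL-encoded ../
const NULL_BYTE = /%00/;

export function looksLikeTraversal(requestPart: string): boolean {
  return TRAVERSAL.test(requestPart) || ENCODED_TRAVERSAL.test(requestPart) || NULL_BYTE.test(requestPart);
}

// Examples taken from the report text:
console.log(looksLikeTraversal("/getUserProfile.jsp?item=../../../../etc/passwd")); // true
console.log(looksLikeTraversal("USER=1826cc8f:PSTYLE=../../../../etc/passwd"));     // true
console.log(looksLikeTraversal("/getUserProfile.jsp?item=profile.html"));           // false
```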
Multifractal Internet Traffic Model and Active Queue Management
2003-01-01
... dropped by the Adaptive RED, ssthresh decreases from 64 KB to 4 KB and the new congestion window cwnd is decreased from 8 KB to 1 KB (Tahoe). The situation ... method to predict the queuing behavior of FIFO and RED queues, in order to satisfy a given delay and jitter requirement for real-time connections, and to ... (Report sections: 5.2 Vulnerability of Adaptive RED to Web-mice; 5.3 A Parallel Virtual Queues Structure.)
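For context on the RED behaviour discussed above, the classic textbook RED drop-probability rule is sketched below; this is the standard formulation, not the thesis's multifractal traffic model or its Adaptive RED variant.

```typescript
// Classic RED (Random Early Detection) drop probability as background for the
// queueing behaviour discussed above; not the thesis's Adaptive RED variant.
interface RedParams {
  minThreshold: number;   // packets
  maxThreshold: number;   // packets
  maxProbability: number; // e.g. 0.1
  weight: number;         // EWMA weight for the average queue length
}

// Exponentially weighted moving average of the queue length.
function updateAverage(avg: number, currentQueue: number, p: RedParams): number {
  return (1 - p.weight) * avg + p.weight * currentQueue;
}

// Early-drop probability as a function of the average queue length.
function dropProbability(avgQueue: number, p: RedParams): number {
  if (avgQueue < p.minThreshold) return 0;  // below min_th: never drop early
  if (avgQueue >= p.maxThreshold) return 1; // at or above max_th: always drop
  return (p.maxProbability * (avgQueue - p.minThreshold)) / (p.maxThreshold - p.minThreshold);
}

// Toy trace (the weight is far larger than a typical 0.002 just to make it visible).
const params: RedParams = { minThreshold: 5, maxThreshold: 15, maxProbability: 0.1, weight: 0.25 };
let avg = 0;
for (const queue of [2, 8, 12, 20, 20]) {
  avg = updateAverage(avg, queue, params);
  console.log(`queue=${queue} avg=${avg.toFixed(2)} pDrop=${dropProbability(avg, params).toFixed(3)}`);
}
```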
On Representative Spaceflight Instrument and Associated Instrument Sensor Web Framework
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Patel, Umeshkumar; Vootukuru, Meg
2007-01-01
Sensor Web-based adaptation and sharing of space flight mission resources, including those of the Space-Ground and Control-User communication segment, could greatly benefit from utilization of heritage Internet Protocols and devices applied for Spaceflight (SpaceIP). This has been successfully demonstrated by a few recent spaceflight experiments. However, while terrestrial applications of Internet protocols are well developed and understood (mostly due to billions of dollars in investments by the military and industry), the spaceflight application of Internet protocols is still in its infancy. Progress in the development of SpaceIP-enabled instrument components will largely determine the utilization of those investments and the acceptance of SpaceIP in years to come. Like SpaceIP, commercial real-time and instrument-colocated computational resources, data compression and storage can be enabled on board a spacecraft and, in turn, support a powerful application to Sensor Web-based design of a spaceflight instrument. Sensor Web-enabled reconfiguration and adaptation of structures for hardware resources and information systems will apply Field Programmable Gate Arrays (FPGA) and other aerospace programmable logic devices for the purposes this technology was intended. These are a few obvious potential benefits of Sensor Web technologies for spaceflight applications. However, they are still waiting to be explored, because a new approach to spaceflight instrumentation is needed to make these mature Sensor Web technologies applicable for spaceflight. In this paper we present an approach to developing related and enabling spaceflight instrument-level technologies based on the new concept of a representative spaceflight Instrument Sensor Web (ISW).
Web Application Software for Ground Operations Planning Database (GOPDb) Management
NASA Technical Reports Server (NTRS)
Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey
2013-01-01
A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.
Web Program for Development of GUIs for Cluster Computers
NASA Technical Reports Server (NTRS)
Czikmantory, Akos; Cwik, Thomas; Klimeck, Gerhard; Hua, Hook; Oyafuso, Fabiano; Vinyard, Edward
2003-01-01
WIGLAF (a Web Interface Generator and Legacy Application Facade) is a computer program that provides a Web-based, distributed, graphical-user-interface (GUI) framework that can be adapted to any of a broad range of application programs, written in any programming language, that are executed remotely on any cluster computer system. WIGLAF enables the rapid development of a GUI for controlling and monitoring a specific application program running on the cluster and for transferring data to and from the application program. The only prerequisite for the execution of WIGLAF is a Web-browser program on a user's personal computer connected with the cluster via the Internet. WIGLAF has a client/server architecture: The server component is executed on the cluster system, where it controls the application program and serves data to the client component. The client component is an applet that runs in the Web browser. WIGLAF utilizes the Extensible Markup Language to hold all data associated with the application software, Java to enable platform-independent execution on the cluster system and the display of a GUI generator through the browser, and the Java Remote Method Invocation software package to provide simple, effective client/server networking.
Gender Divide and Acceptance of Collaborative Web 2.0 Applications for Learning in Higher Education
ERIC Educational Resources Information Center
Huang, Wen-Hao David; Hood, Denice Ward; Yoo, Sun Joo
2013-01-01
Situated in the gender digital divide framework, this survey study investigated the role of computer anxiety in influencing female college students' perceptions toward Web 2.0 applications for learning. Based on 432 college students' "Web 2.0 for learning" perception ratings collected by relevant categories of "Unified Theory of Acceptance and Use…
System Testing of Desktop and Web Applications
ERIC Educational Resources Information Center
Slack, James M.
2011-01-01
We want our students to experience system testing of both desktop and web applications, but the cost of professional system-testing tools is far too high. We evaluate several free tools and find that AutoIt makes an ideal educational system-testing tool. We show several examples of desktop and web testing with AutoIt, starting with simple…
Robust image obfuscation for privacy protection in Web 2.0 applications
NASA Astrophysics Data System (ADS)
Poller, Andreas; Steinebach, Martin; Liu, Huajian
2012-03-01
We present two approaches to robust image obfuscation based on permutation of image regions and channel intensity modulation. The proposed concept of robust image obfuscation is a step towards end-to-end security in Web 2.0 applications. It helps to protect the privacy of the users against threats caused by internet bots and web applications that extract biometric and other features from images for data-linkage purposes. The approaches described in this paper consider that images uploaded to Web 2.0 applications pass several transformations, such as scaling and JPEG compression, until the receiver downloads them. In contrast to existing approaches, our focus is on usability, therefore the primary goal is not a maximum of security but an acceptable trade-off between security and resulting image quality.
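The block-permutation part of the idea can be illustrated with a keyed shuffle of fixed-size blocks in a grayscale buffer, as sketched below; the paper's actual scheme also modulates channel intensities and is designed to survive scaling and JPEG recompression, which this toy version is not.

```typescript
// Simplified sketch of region permutation: shuffle fixed-size blocks of a
// grayscale image buffer with a keyed pseudo-random permutation so that only
// someone with the key can restore the original arrangement.
function seededRandom(seed: number): () => number {
  // mulberry32-style PRNG; the key is just the numeric seed here
  return () => {
    seed |= 0; seed = (seed + 0x6d2b79f5) | 0;
    let t = Math.imul(seed ^ (seed >>> 15), 1 | seed);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

function keyedPermutation(n: number, key: number): number[] {
  const rand = seededRandom(key);
  const perm = Array.from({ length: n }, (_, i) => i);
  for (let i = n - 1; i > 0; i--) { // Fisher-Yates with the keyed PRNG
    const j = Math.floor(rand() * (i + 1));
    [perm[i], perm[j]] = [perm[j], perm[i]];
  }
  return perm;
}

// Permute 8x8 pixel blocks of a width*height grayscale image (1 byte per pixel).
function permuteBlocks(pixels: Uint8Array, width: number, height: number, key: number): Uint8Array {
  const b = 8;
  const cols = Math.floor(width / b), rows = Math.floor(height / b);
  const perm = keyedPermutation(cols * rows, key);
  const out = new Uint8Array(pixels); // copy; pixels outside full blocks stay put
  for (let block = 0; block < perm.length; block++) {
    const [sx, sy] = [(block % cols) * b, Math.floor(block / cols) * b];
    const [dx, dy] = [(perm[block] % cols) * b, Math.floor(perm[block] / cols) * b];
    for (let y = 0; y < b; y++) {
      for (let x = 0; x < b; x++) {
        out[(dy + y) * width + dx + x] = pixels[(sy + y) * width + sx + x];
      }
    }
  }
  return out;
}
```

Restoring the image amounts to applying the inverse of the same keyed permutation, which only a holder of the key can regenerate.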
The Other Infrastructure: Distance Education's Digital Plant.
ERIC Educational Resources Information Center
Boettcher, Judith V.; Kumar, M. S. Vijay
2000-01-01
Suggests a new infrastructure--the digital plant--for supporting flexible Web campus environments. Describes four categories which make up the infrastructure: personal communication tools and applications; network of networks for the Web campus; dedicated servers and software applications; software applications and services from external…
WEBCAP: Web Scheduler for Distance Learning Multimedia Documents with Web Workload Considerations
ERIC Educational Resources Information Center
Habib, Sami; Safar, Maytham
2008-01-01
In many web applications, such as the distance learning, the frequency of refreshing multimedia web documents places a heavy burden on the WWW resources. Moreover, the updated web documents may encounter inordinate delays, which make it difficult to retrieve web documents in time. Here, we present an Internet tool called WEBCAP that can schedule…
Applying Web Usage Mining for Personalizing Hyperlinks in Web-Based Adaptive Educational Systems
ERIC Educational Resources Information Center
Romero, Cristobal; Ventura, Sebastian; Zafra, Amelia; de Bra, Paul
2009-01-01
Nowadays, the application of Web mining techniques in e-learning and Web-based adaptive educational systems is increasing exponentially. In this paper, we propose an advanced architecture for a personalization system to facilitate Web mining. A specific Web mining tool is developed and a recommender engine is integrated into the AHA! system in…
New framework of NGN web-based management system
NASA Astrophysics Data System (ADS)
Nian, Zhou; Jie, Yin; Qian, Mao
2007-11-01
This paper introduces the basic concepts and key technologies of Ajax and some popular frameworks in the J2EE architecture, and tries to integrate these frameworks into a new one. Developers can build web applications much more conveniently using this framework, and the resulting web applications can provide a more friendly and interactive platform to end users. Finally, an example is given to explain how to use the new framework to build a web-based management system for a softswitch network.
A web implementation: the good and the not-so-good.
Bergsneider, C; Piraino, D; Fuerst, M
2001-06-01
E-commerce, e-mail, e-greeting, e-this, and e-that: everywhere you turn there is a new "e" word for an internet or Web application. We, at the Cleveland Clinic Foundation, have been "e-nlightened" and will discuss in this report the implementation of a web-based radiology information system (RIS) in our radiology division, or "e-radiology" division. The application, IDXRad Version 10.0 from IDX Corp, Burlington, VT, is in use at the Cleveland Clinic Foundation and has both intranet (for use in Radiology) and internet (referring physician viewing) modules. We will concentrate on the features of using a web browser for the application's front end, including easy prototyping for screen review, easier mock-ups of demonstrations by vendors and developers, and easier training as more people become web-addicted. Project communication can be facilitated with an internal project web page, and use of the web browser can accommodate quicker turnaround of software upgrades because the software code is centrally located. Compared with other technologies, including client/server, there is a smaller rollout cost when using a standard web browser. However, the new technology requires change, and changes are never implemented without challenges. A seasoned technologist using a legacy system can enter data more quickly with function keys than with a graphical user interface, pointing and clicking through a series of pop-up windows. Also, effective use of a web browser depends on intuitive design for it to be easily implemented and accepted by the user. Some software packages will not work on both of the popular web browsers and are instead tailored to specific release levels. As computer-based patient records become a standard, patient confidentiality must be enforced. The technical design and application security features that support the web-based software package will be discussed, as will the implementation issues that web technologies bring with them.
JavaScript Access to DICOM Network and Objects in Web Browser.
Drnasin, Ivan; Grgić, Mislav; Gogić, Goran
2017-10-01
Digital imaging and communications in medicine (DICOM) 3.0 standard provides the baseline for the picture archiving and communication systems (PACS). The development of Internet and various communication media initiated demand for non-DICOM access to PACS systems. Ever-increasing utilization of the web browsers, laptops and handheld devices, as opposed to desktop applications and static organizational computers, lead to development of different web technologies. The DICOM standard officials accepted those subsequently as tools of alternative access. This paper provides an overview of the current state of development of the web access technology to the DICOM repositories. It presents a different approach of using HTML5 features of the web browsers through the JavaScript language and the WebSocket protocol by enabling real-time communication with DICOM repositories. JavaScript DICOM network library, DICOM to WebSocket proxy and a proof-of-concept web application that qualifies as a DICOM 3.0 device were developed.
U.S. Geological Survey (USGS) Earthquake Web Applications
NASA Astrophysics Data System (ADS)
Fee, J.; Martinez, E.
2015-12-01
USGS Earthquake web applications provide access to earthquake information from USGS and other Advanced National Seismic System (ANSS) contributors. One of the primary goals of these applications is to provide a consistent experience for accessing both near-real time information as soon as it is available and historic information after it is thoroughly reviewed. Millions of people use these applications every month including people who feel an earthquake, emergency responders looking for the latest information about a recent event, and scientists researching historic earthquakes and their effects. Information from multiple catalogs and contributors is combined by the ANSS Comprehensive Catalog into one composite catalog, identifying the most preferred information from any source for each event. A web service and near-real time feeds provide access to all contributed data, and are used by a number of users and software packages. The Latest Earthquakes application displays summaries of many events, either near-real time feeds or custom searches, and the Event Page application shows detailed information for each event. Because all data is accessed through the web service, it can also be downloaded by users. The applications are maintained as open source projects on github, and use mobile-first and responsive-web-design approaches to work well on both mobile devices and desktop computers. http://earthquake.usgs.gov/earthquakes/map/
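Since all data are exposed through the web service, they can be pulled programmatically. A small sketch, assuming the publicly documented GeoJSON summary feed URL (which is not quoted in the abstract), that lists the most recent events:

```python
import json
import urllib.request

# GeoJSON summary feed of the past day's earthquakes; this URL comes from the
# public USGS feed documentation and is assumed here, not quoted in the abstract.
FEED = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_day.geojson"

with urllib.request.urlopen(FEED, timeout=30) as resp:
    catalog = json.load(resp)

# Each feature carries the preferred magnitude, location string, and event id.
for feature in catalog["features"][:10]:
    props = feature["properties"]
    print(props["mag"], props["place"], feature["id"])
```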
Systematic plan of building Web geographic information system based on ActiveX control
NASA Astrophysics Data System (ADS)
Zhang, Xia; Li, Deren; Zhu, Xinyan; Chen, Nengcheng
2003-03-01
A systematic plan for building a Web Geographic Information System (WebGIS) using ActiveX technology is proposed in this paper. In the proposed plan, ActiveX control technology is adopted for building the client-side application, and two different schemas are introduced to implement communication between controls in the user's browser and the middle application server. One is based on the Distributed Component Object Model (DCOM), the other on sockets. In the former schema, the middle service application is developed as a DCOM object that communicates with the ActiveX control through Object Remote Procedure Call (ORPC) and accesses data in the GIS Data Server through Open Database Connectivity (ODBC). In the latter, the middle service application is developed in the Java language; it communicates with the ActiveX control through sockets based on TCP/IP and accesses data in the GIS Data Server through Java Database Connectivity (JDBC). The first schema is usually developed in C/C++ and is difficult to develop and deploy. The second is relatively easy to develop, but its data-transfer performance depends on Web bandwidth. A sample application was developed using the latter schema, and its performance proved better, to some degree, than that of some other WebGIS applications.
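The socket-based schema reduces to a plain TCP request/response between the browser-embedded control and the middle-tier service. The sketch below shows that exchange in Python rather than the ActiveX/Java stack used in the paper; the query string and port are invented for illustration.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 5050   # middle-tier service address (illustrative)
server = socket.create_server((HOST, PORT))

def middle_tier(srv):
    """Stand-in for the Java middle-tier service: accept one map query over
    TCP and return a result (the real service would query the GIS Data Server
    via JDBC before answering)."""
    conn, _ = srv.accept()
    with conn:
        query = conn.recv(1024).decode()
        conn.sendall(("RESULT for " + query).encode())

threading.Thread(target=middle_tier, args=(server,), daemon=True).start()

# Client side: what the browser-embedded control sends over its socket.
with socket.create_connection((HOST, PORT)) as client:
    client.sendall(b"GET_LAYER roads BBOX=114.1,30.4,114.5,30.7")
    print(client.recv(1024).decode())

server.close()
```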
Web Mining: Machine Learning for Web Applications.
ERIC Educational Resources Information Center
Chen, Hsinchun; Chau, Michael
2004-01-01
Presents an overview of machine learning research and reviews methods used for evaluating machine learning systems. Ways that machine-learning algorithms were used in traditional information retrieval systems in the "pre-Web" era are described, and the field of Web mining and how machine learning has been used in different Web mining…
The Use of Web Search Engines in Information Science Research.
ERIC Educational Resources Information Center
Bar-Ilan, Judit
2004-01-01
Reviews the literature on the use of Web search engines in information science research, including: ways users interact with Web search engines; social aspects of searching; structure and dynamic nature of the Web; link analysis; other bibliometric applications; characterizing information on the Web; search engine evaluation and improvement; and…
CONFU: Configuration Fuzzing Testing Framework for Software Vulnerability Detection
Dai, Huning; Murphy, Christian; Kaiser, Gail
2010-01-01
Many software security vulnerabilities only reveal themselves under certain conditions, i.e., particular configurations and inputs together with a certain runtime environment. One approach to detecting these vulnerabilities is fuzz testing. However, typical fuzz testing makes no guarantees regarding the syntactic and semantic validity of the input, or of how much of the input space will be explored. To address these problems, we present a new testing methodology called Configuration Fuzzing. Configuration Fuzzing is a technique whereby the configuration of the running application is mutated at certain execution points, in order to check for vulnerabilities that only arise in certain conditions. As the application runs in the deployment environment, this testing technique continuously fuzzes the configuration and checks “security invariants” that, if violated, indicate a vulnerability. We discuss the approach and introduce a prototype framework called ConFu (CONfiguration FUzzing testing framework) for implementation. We also present the results of case studies that demonstrate the approach’s feasibility and evaluate its performance. PMID:21037923
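A minimal sketch of the configuration-fuzzing idea, not the ConFu implementation itself: mutate one configuration option, simulate a run, and check a security invariant. The configuration keys, values, and invariant below are invented.

```python
import random

# Toy configuration plus a "security invariant": temporary storage must not be
# group/world-writable and guest access must stay off. Keys, values, and the
# invariant are invented; ConFu hooks execution points inside the real program.
BASE_CONFIG = {"tmp_dir_mode": 0o700, "allow_guest": False, "max_upload_mb": 10}

def mutate(config, rng):
    """Flip or perturb one randomly chosen configuration option."""
    fuzzed = dict(config)
    key = rng.choice(sorted(fuzzed))
    if isinstance(fuzzed[key], bool):          # bool before int: bool is an int subclass
        fuzzed[key] = not fuzzed[key]
    elif isinstance(fuzzed[key], int):
        fuzzed[key] = rng.choice([0, 0o777, fuzzed[key] * 2])
    return fuzzed

def security_invariant(config):
    return (config["tmp_dir_mode"] & 0o022) == 0 and not config["allow_guest"]

rng = random.Random(1)
for trial in range(20):
    candidate = mutate(BASE_CONFIG, rng)
    if not security_invariant(candidate):      # a violated invariant flags a potential bug
        print(f"trial {trial}: potential vulnerability with config {candidate}")
```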
Rot, Gregor; Parikh, Anup; Curk, Tomaz; Kuspa, Adam; Shaulsky, Gad; Zupan, Blaz
2009-01-01
Background Bioinformatics often leverages recent advancements in computer science to support biologists in their scientific discovery process. Such efforts include the development of easy-to-use web interfaces to biomedical databases. Recent advancements in interactive web technologies require us to rethink the standard submit-and-wait paradigm, and craft bioinformatics web applications that share analytical and interactive power with their desktop relatives, while retaining simplicity and availability. Results We have developed dictyExpress, a web application that features a graphical, highly interactive explorative interface to our database that consists of more than 1000 Dictyostelium discoideum gene expression experiments. In dictyExpress, the user can select experiments and genes, perform gene clustering, view gene expression profiles across time, view gene co-expression networks, perform analyses of Gene Ontology term enrichment, and simultaneously display expression profiles for a selected gene in various experiments. Most importantly, these tasks are achieved through web applications whose components are seamlessly interlinked and immediately respond to events triggered by the user, thus providing a powerful explorative data analysis environment. Conclusion dictyExpress is a precursor for a new generation of web-based bioinformatics applications with simple but powerful interactive interfaces that resemble those of the modern desktop. While dictyExpress serves mainly the Dictyostelium research community, it is relatively easy to adapt it to other datasets. We propose that the design ideas behind dictyExpress will influence the development of similar applications for other model organisms. PMID:19706156
Technical Considerations in Remote LIMS Access via the World Wide Web
Schlabach, David M.
2005-01-01
The increased dependency on the World Wide Web by both laboratories and their customers has led LIMS developers to take advantage of thin-client web applications that provide both remote data entry and manipulation, along with remote reporting functionality. Use of an LIMS through a web browser allows a person to interact with a distant application, providing both remote administration and real-time analytical result delivery from virtually anywhere in the world. While there are many benefits of web-based LIMS applications, some concern must be given to these new methods of system architecture before justifying them as a suitable replacement for their traditional client-server systems. Developers and consumers alike must consider the security aspects of introducing a wide area network capable system into a production environment, as well as the concerns of data integrity and usability. PMID:18924736
The informatics superhighway: prototyping on the World Wide Web.
Cimino, J J; Socratous, S A; Grewal, R
1995-01-01
We have experimented with developing a prototype Surgeon's Workstation which makes use of the World Wide Web client-server architecture. Although originally intended merely as a means for obtaining user feedback for use in designing a "real" system, the application has been adopted for use by our Department of Surgery. As they begin to use the application, they have suggested changes and we have responded. This paper illustrates some of the advantages we have found for prototyping with Web-based applications, including security aspects.
Adopting and adapting a commercial view of web services for the Navy
NASA Astrophysics Data System (ADS)
Warner, Elizabeth; Ladner, Roy; Katikaneni, Uday; Petry, Fred
2005-05-01
Web Services are being adopted as the enabling technology to provide net-centric capabilities for many Department of Defense operations. The Navy Enterprise Portal, for example, is Web Services-based, and the Department of the Navy is promulgating guidance for developing Web Services. Web Services, however, only constitute a baseline specification that provides the foundation on which users, under current approaches, write specialized applications in order to retrieve data over the Internet. Application development may increase dramatically as the number of different available Web Services increases. Reasons for specialized application development include XML schema versioning differences, adoption/use of diverse business rules, security access issues, and time/parameter naming constraints, among others. We are currently developing for the US Navy a system which will improve delivery of timely and relevant meteorological and oceanographic (MetOc) data to the warfighter. Our objective is to develop an Advanced MetOc Broker (AMB) that leverages Web Services technology to identify, retrieve and integrate relevant MetOc data in an automated manner. The AMB will utilize a Mediator, which will be developed by applying ontological research and schema matching techniques to MetOc forms of data. The AMB, using the Mediator, will support a new, advanced approach to the use of Web Services; namely, the automated identification, retrieval and integration of MetOc data. Systems based on this approach will then not require extensive end-user application development for each Web Service from which data can be retrieved. Users anywhere on the globe will be able to receive timely environmental data that fits their particular needs.
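The Mediator's job is essentially to map heterogeneous parameter names and schema variants onto a common MetOc vocabulary so that one client can query many Web Services. A toy sketch of such a mapping (the provider terms and synonym lists are invented):

```python
# Toy mediator: resolve provider-specific parameter names onto a common MetOc
# vocabulary so one query can be issued against several Web Services. The
# provider aliases below are invented for illustration.
SYNONYMS = {
    "sea_surface_temperature": {"sst", "seaSurfTemp", "SST_degC"},
    "significant_wave_height": {"swh", "sigWaveHt", "Hs"},
}

def to_canonical(provider_param: str) -> str:
    """Map a provider-specific parameter name to the canonical MetOc term."""
    for canonical, aliases in SYNONYMS.items():
        if provider_param == canonical or provider_param in aliases:
            return canonical
    raise KeyError(f"unmapped parameter: {provider_param}")

print(to_canonical("sigWaveHt"))   # -> significant_wave_height
```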
Rehabilitation of vulnerable groups in emergencies and disasters: A systematic review
Sheikhbardsiri, Hojjat; Yarmohammadian, Mohammad H.; Rezaei, Fatemeh; Maracy, Mohammad Reza
2017-01-01
BACKGROUND: Natural and man-made disasters, especially those occurring in large scales not only result in human mortality, but also cause physical, psychological, and social disabilities. Providing effective rehabilitation services in time can decrease the frequency of such disabilities. The aim of the current study was to perform a systematic review related to rehabilitation of vulnerable groups in emergencies and disasters. METHODS: The systematic review was conducted according to the preferred reporting items for systematic reviews and meta-analyses (PRISMA) guidelines. The key words “recovery”, “rehabilitation”, “reconstruction”, “transformation”, “transition”, “emergency”, “disaster”, “crisis”, “hazard”, “catastrophe”, “tragedy”, “mass casualty incident”, “women”, “female”, “children”, “pediatric”, “disable”, “handicap”, “elder”, “old” and “vulnerable” were used in combination with Boolean operators OR and AND. ISI Web of Science, PubMed, Scopus, Science Direct, Ovid, ProQuest, Wiley, Google Scholar were searched. RESULTS: In this study a total of 11 928 articles were considered and 25 articles were selected for final review of rehabilitation of vulnerable groups based on the objective of this study. Twenty-five studies including six qualitative, sixteen cross-sectional and three randomized controlled trials were reviewed for rehabilitation of vulnerable groups in emergencies and disasters. Out of the selected papers, 23 were studied based on rehabilitation after natural disasters and the remaining were man-made disasters. Most types of rehabilitation were physical, social, psychological and economic. CONCLUSION: The review of the papers showed different programs of physical, physiological, economic and social rehabilitations for vulnerable groups after emergencies and disasters. It may help health field managers better implement standard rehabilitation activities for vulnerable groups. PMID:29123602
Mathiesen, Anne Sophie; Thomsen, Thordis; Jensen, Tonny; Schiøtz, Charlotte; Langberg, Henning; Egerod, Ingrid
2017-09-01
Digital interventions for improving diabetes management in Type 2 diabetes mellitus (T2DM) are used universally. Digital interventions are defined as any intervention accessed and taking input from people with T2DM in the form of a web-based or mobile phone-based app to improve diabetes self-management. However, the current confidence in digital interventions threatens to augment social inequalities in health, also known as the "digital divide". To counteract dissemination of the digital divide, we aimed to assess the potential of a tailored digital intervention for improving diabetes management in vulnerable people with T2DM. A qualitative design using semi-structured in-depth interviews to explore the perspectives of 12 vulnerable people with T2DM. Interviews were analyzed using inductive content analysis. Vulnerability was defined by the presence of one or more comorbidities, one or more lifestyle risk factors, poor diabetes management, low educational level and low health literacy. The main themes identified were: "Dealing with diabetes distress" characterized by psychological avoidance mechanisms; "Suffering informational confusion" dealing with inconsistent information; "Experiencing digital alienation" dealing with loss of freedom when technology invades the private sphere; and "Missing the human touch" preferring human interaction over digital contact. Vulnerable people with T2DM are unprepared for digital interventions for disease management. Experiencing diabetes distress may be an intermediate mechanism leading to nonadherence to digital interventions and the preference for human interaction in vulnerable people with T2DM. Future interventions could include a designated caregiver and an allocated buddy to provide support and assist uptake of digital interventions for diabetes management.
DoD Application Store: Enabling C2 Agility?
2014-06-01
The envisioned DoD Marketplace within the Ozone Widget Framework will include automated delivery of software patches, web applications, widgets and mobile application packages. DoD has started to make inroads within this environment with several Programs of Record (PoR) embracing widgets and other mobile applications.
Web servicing the biological office.
Szugat, Martin; Güttler, Daniel; Fundel, Katrin; Sohler, Florian; Zimmer, Ralf
2005-09-01
Biologists routinely use Microsoft Office applications for standard analysis tasks. Despite ubiquitous internet resources, information needed for everyday work is often not directly and seamlessly available. Here we describe a very simple and easily extendable mechanism using Web Services to enrich standard MS Office applications with internet resources. We demonstrate its capabilities by providing a Web-based thesaurus for biological objects, which maps names to database identifiers and vice versa via an appropriate synonym list. The client application ProTag makes these features available in MS Office applications using Smart Tags and Add-Ins. http://services.bio.ifi.lmu.de/prothesaurus/
NASA Technical Reports Server (NTRS)
Laakso, J. H.; Straayer, J. W.
1973-01-01
Three large scale advanced composite shear web components were tested and analyzed to evaluate application of the design concept to a space shuttle orbiter thrust structure. The shear web design concept consisted of a titanium-clad + or - 45 deg boron/epoxy web laminate stiffened with vertical boron/epoxy reinforced aluminum stiffeners. The design concept was evaluated to be efficient and practical for the application that was studied. Because of the effects of buckling deflections, a requirement is identified for shear buckling resistant design to maximize the efficiency of highly-loaded advanced composite shear webs. An approximate analysis of prebuckling deflections is presented and computer-aided design results, which consider prebuckling deformations, indicate that the design concept offers a theoretical weight saving of 31 percent relative to all metal construction. Recommendations are made for design concept options and analytical methods that are appropriate for production hardware.
Integrating DXplain into a clinical information system using the World Wide Web.
Elhanan, G; Socratous, S A; Cimino, J J
1996-01-01
The World Wide Web (WWW) offers a cross-platform environment and standard protocols that enable integration of various applications available on the Internet. The authors use the Web to facilitate interaction between their Web-based Clinical Information System and a decision-support system, DXplain, at the Massachusetts General Hospital, using local architecture and Common Gateway Interface programs. The current application translates patients' laboratory test results into DXplain's terms to generate diagnostic hypotheses. Two different access methods are utilized for this model: Hypertext Transfer Protocol (HTTP) and TCP/IP function calls. While clinical aspects cannot be evaluated as yet, the model demonstrates the potential of Web-based applications for interaction and integration, and how local architecture, with a controlled vocabulary server, can further facilitate such integration. The model also demonstrates some of the limitations of current WWW technology and identifies issues such as control over Web resources and their utilization, as well as liability, as possible obstacles to further integration.
Atmospheric Science Data Center
2013-03-21
Web links to relevant CERES information, including CERES references, the Instrument Working Group home page, and the Aerosol Retrieval web page (Center for Satellite Applications and Research).
ERIC Educational Resources Information Center
Huang, Wen-Hao David; Hood, Denice Ward; Yoo, Sun Joo
2014-01-01
Web 2.0 applications have been widely applied for teaching and learning in US higher education in recent years. Their potential impact on learning motivation and learner performance, however, has not attracted substantial research efforts. To better understand how Web 2.0 applications might impact learners' motivation in higher education…
Mobile Web 2.0 in the Workplace: A Case Study of Employees' Informal Learning
ERIC Educational Resources Information Center
Gu, Jia; Churchill, Daniel; Lu, Jie
2014-01-01
Employees' informal learning in the workplace warrants more attention, and such learning could benefit from the latest mobile technologies such as Web 2.0 applications, which have increasingly been utilized and have the potential to enhance learning outcomes. This multiple-case study examined the impact of mobile Web 2.0 applications on…
The Design and Application of a Web-Based Self- And Peer-Assessment System
ERIC Educational Resources Information Center
Sung, Yao-Ting; Chang, Kuo-En; Chiou, Shen-Kuan; Hou, Huei-Tse
2005-01-01
This study describes the web-based self- and peer-assessments system, or the Web-SPA, which has been shown to provide teachers with a flexible interface with which to arrange various self- and peer-assessment procedures. Secondly, this study examines the effects of the application of the progressively focused self- and peer-assessment (PFSPA)…
Brain-controlled applications using dynamic P300 speller matrices.
Halder, Sebastian; Pinegger, Andreas; Käthner, Ivo; Wriessnegger, Selina C; Faller, Josef; Pires Antunes, João B; Müller-Putz, Gernot R; Kübler, Andrea
2015-01-01
Access to the world wide web and multimedia content is an important aspect of life. We present a web browser and a multimedia user interface adapted for control with a brain-computer interface (BCI) which can be used by severely motor impaired persons. The web browser dynamically determines the most efficient P300 BCI matrix size to select the links on the current website. This enables control of the web browser with fewer commands and smaller matrices. The multimedia player was based on an existing software. Both applications were evaluated with a sample of ten healthy participants and three end-users. All participants used a visual P300 BCI with face-stimuli for control. The healthy participants completed the multimedia player task with 90% accuracy and the web browsing task with 85% accuracy. The end-users completed the tasks with 62% and 58% accuracy. All healthy participants and two out of three end-users reported that they felt to be in control of the system. In this study we presented a multimedia application and an efficient web browser implemented for control with a BCI. Both applications provide access to important areas of modern information retrieval and entertainment. Copyright © 2014 Elsevier B.V. All rights reserved.
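Dynamically sizing the speller matrix amounts to choosing the smallest grid that covers the links on the current page, which keeps the number of flash groups per selection low. One plausible sizing rule is sketched below; the browser's actual selection logic is not specified in the abstract.

```python
import math

def matrix_size(n_links: int) -> tuple:
    """Smallest near-square P300 matrix with at least n_links cells (one
    plausible sizing rule; the browser's exact criterion may differ)."""
    cols = math.ceil(math.sqrt(n_links))
    rows = math.ceil(n_links / cols)
    return rows, cols

for n in (4, 9, 20, 36):
    rows, cols = matrix_size(n)
    # rows + cols = number of flash groups in one stimulation sequence
    print(f"{n:>2} links -> {rows}x{cols} matrix, {rows + cols} flash groups")
```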
Web 2.0 in healthcare: state-of-the-art in the German health insurance landscape.
Kuehne, Mirko; Blinn, Nadine; Rosenkranz, Christoph; Nuettgens, Markus
2011-01-01
The Internet is increasingly used as a source for information and knowledge. Even in the field of healthcare, information is widely available. Patients and their relatives increasingly use the Internet to search for healthcare information and applications. "Health 2.0", the increasing use of Web 2.0 technologies and tools in electronic healthcare, promises new ways of interaction, communication, and participation for healthcare. In order to explore how Web 2.0 applications are generally adopted and implemented by health information providers, we analysed the websites of all German health insurance companies regarding their provision of Web 2.0 applications. As health insurers play a highly relevant role in the German healthcare system, we conducted an exploratory survey in order to provide answers about the adoption and implementation of Web 2.0 technologies. Hence, the websites of all 198 private and public health insurers were analysed. The results show a widespread diffusion of Web 2.0 applications but also large differences in how they are implemented by the respective insurers. Our findings therefore provide a foundation for further research on the factors that drive adoption.
Saint: a lightweight integration environment for model annotation.
Lister, Allyson L; Pocock, Matthew; Taschuk, Morgan; Wipat, Anil
2009-11-15
Saint is a web application which provides a lightweight annotation integration environment for quantitative biological models. The system enables modellers to rapidly mark up models with biological information derived from a range of data sources. Saint is freely available for use on the web at http://www.cisban.ac.uk/saint. The web application is implemented in Google Web Toolkit and Tomcat, with all major browsers supported. The Java source code is freely available for download at http://saint-annotate.sourceforge.net. The Saint web server requires an installation of libSBML and has been tested on Linux (32-bit Ubuntu 8.10 and 9.04).
Pragmatic service development and customisation with the CEDA OGC Web Services framework
NASA Astrophysics Data System (ADS)
Pascoe, Stephen; Stephens, Ag; Lowe, Dominic
2010-05-01
The CEDA OGC Web Services framework (COWS) emphasises rapid service development by providing a lightweight layer of OGC web service logic on top of Pylons, a mature web application framework for the Python language. This approach gives developers a flexible web service development environment without compromising access to the full range of web application tools and patterns: Model-View-Controller paradigm, XML templating, Object-Relational-Mapper integration and authentication/authorization. We have found this approach useful for exploring evolving standards and implementing protocol extensions to meet the requirements of operational deployments. This paper outlines how COWS is being used to implement customised WMS, WCS, WFS and WPS services in a variety of web applications from experimental prototypes to load-balanced cluster deployments serving 10-100 simultaneous users. In particular we will cover 1) The use of Climate Science Modeling Language (CSML) in complex-feature aware WMS, WCS and WFS services, 2) Extending WMS to support applications with features specific to earth system science and 3) A cluster-enabled Web Processing Service (WPS) supporting asynchronous data processing. The COWS WPS underpins all backend services in the UK Climate Projections User Interface where users can extract, plot and further process outputs from a multi-dimensional probabilistic climate model dataset. The COWS WPS supports cluster job execution, result caching, execution time estimation and user management. The COWS WMS and WCS components drive the project-specific NCEO and QESDI portals developed by the British Atmospheric Data Centre. These portals use CSML as a backend description format and implement features such as multiple WMS layer dimensions and climatology axes that are beyond the scope of general purpose GIS tools and yet vital for atmospheric science applications.
NASA Astrophysics Data System (ADS)
Aufdenkampe, A. K.; Mayorga, E.; Tarboton, D. G.; Sazib, N. S.; Horsburgh, J. S.; Cheetham, R.
2016-12-01
The Model My Watershed Web app (http://wikiwatershed.org/model/) was designed to enable citizens, conservation practitioners, municipal decision-makers, educators, and students to interactively select any area of interest anywhere in the continental USA to: (1) analyze real land use and soil data for that area; (2) model stormwater runoff and water-quality outcomes; and (3) compare how different conservation or development scenarios could modify runoff and water quality. The BiG CZ Data Portal is a web application for scientists for intuitive, high-performance map-based discovery, visualization, access and publication of diverse earth and environmental science data via a map-based interface that simultaneously performs geospatial analysis of selected GIS and satellite raster data for a selected area of interest. The two web applications share a common codebase (https://github.com/WikiWatershed and https://github.com/big-cz), high performance geospatial analysis engine (http://geotrellis.io/ and https://github.com/geotrellis) and deployment on the Amazon Web Services (AWS) cloud cyberinfrastructure. Users can use "on-the-fly" rapid watershed delineation over the national elevation model to select their watershed or catchment of interest. The two web applications also share the goal of enabling the scientists, resource managers and students alike to share data, analyses and model results. We will present these functioning web applications and their potential to substantially lower the bar for studying and understanding our water resources. We will also present work in progress, including a prototype system for enabling citizen-scientists to register open-source sensor stations (http://envirodiy.org/mayfly/) to stream data into these systems, so that they can be reshared using Water One Flow web services.
Rotondi, Armando J; Eack, Shaun M; Hanusa, Barbara H; Spring, Michael B; Haas, Gretchen L
2015-03-01
E-health applications are becoming integral components of general medical care delivery models and emerging for mental health care. Few exist for treatment of those with severe mental illness (SMI). In part, this is due to a lack of models to design such technologies for persons with cognitive impairments and lower technology experience. This study evaluated the effectiveness of an e-health design model for persons with SMI termed the Flat Explicit Design Model (FEDM). Persons with schizophrenia (n = 38) performed tasks to evaluate the effectiveness of 5 Web site designs: 4 were prominent public Web sites, and 1 was designed according to the FEDM. Linear mixed-effects regression models were used to examine differences in usability between the Web sites. Omnibus tests of between-site differences were conducted, followed by post hoc pairwise comparisons of means to examine specific Web site differences when omnibus tests reached statistical significance. The Web site designed using the FEDM required less time to find information, had a higher success rate, and was rated easier to use and less frustrating than the other Web sites. The home page design of one of the other Web sites provided the best indication to users about a Web site's contents. The results are consistent with and were used to expand the FEDM. The FEDM provides evidence-based guidelines to design e-health applications for person with SMI, including: minimize an application's layers or hierarchy, use explicit text, employ navigational memory aids, group hyperlinks in 1 area, and minimize the number of disparate subjects an application addresses. © The Author 2013. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.
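The analysis described, linear mixed-effects models of per-site outcomes with repeated measures per participant, can be reproduced in outline with statsmodels. The column names and values below are assumptions for illustration, not the study's data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative repeated-measures data; 'participant', 'site', and 'seconds'
# are assumed column names, not the study's actual variables.
df = pd.DataFrame({
    "participant": sum([[p] * 3 for p in range(1, 6)], []),
    "site":        ["FEDM", "A", "B"] * 5,
    "seconds":     [35, 80, 95, 42, 70, 110, 30, 90, 85, 50, 75, 100, 38, 88, 92],
})

# A random intercept per participant captures the within-subject design; the
# FEDM site is the reference level, so coefficients estimate how much longer
# tasks take on the other designs.
model = smf.mixedlm("seconds ~ C(site, Treatment('FEDM'))",
                    df, groups=df["participant"]).fit()
print(model.summary())
```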
77 FR 47867 - Agency Information Collection Activities: Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-10
... phenology information to Nature's Notebook through a browser-based web application or via mobile applications for iPhone and Android operating systems, meeting GPEA requirements. The web application interface... techniques or other forms of information technology. Please note that the comments submitted in response to...
Workflow and web application for annotating NCBI BioProject transcriptome data.
Vera Alvarez, Roberto; Medeiros Vidal, Newton; Garzón-Martínez, Gina A; Barrero, Luz S; Landsman, David; Mariño-Ramírez, Leonardo
2017-01-01
The volume of transcriptome data is growing exponentially due to rapid improvement of experimental technologies. In response, large central resources such as those of the National Center for Biotechnology Information (NCBI) are continually adapting their computational infrastructure to accommodate this large influx of data. New and specialized databases, such as Transcriptome Shotgun Assembly Sequence Database (TSA) and Sequence Read Archive (SRA), have been created to aid the development and expansion of centralized repositories. Although the central resource databases are under continual development, they do not include automatic pipelines to increase annotation of newly deposited data. Therefore, third-party applications are required to achieve that aim. Here, we present an automatic workflow and web application for the annotation of transcriptome data. The workflow creates secondary data such as sequencing reads and BLAST alignments, which are available through the web application. They are based on freely available bioinformatics tools and scripts developed in-house. The interactive web application provides a search engine and several browser utilities. Graphical views of transcript alignments are available through SeqViewer, an embedded tool developed by NCBI for viewing biological sequence data. The web application is tightly integrated with other NCBI web applications and tools to extend the functionality of data processing and interconnectivity. We present a case study for the species Physalis peruviana with data generated from BioProject ID 67621. URL: http://www.ncbi.nlm.nih.gov/projects/physalis/. Published by Oxford University Press 2017. This work is written by US Government employees and is in the public domain in the US.
Randomized evaluation of a web based interview process for urology resident selection.
Shah, Satyan K; Arora, Sanjeev; Skipper, Betty; Kalishman, Summers; Timm, T Craig; Smith, Anthony Y
2012-04-01
We determined whether a web based interview process for resident selection could effectively replace the traditional on-site interview. For the 2010 to 2011 match cycle, applicants to the University of New Mexico urology residency program were randomized to participate in a web based interview process via Skype or a traditional on-site interview process. Both methods included interviews with the faculty, a tour of facilities and the opportunity to ask current residents any questions. To maintain fairness the applicants were then reinterviewed via the opposite process several weeks later. We assessed comparative effectiveness, cost, convenience and satisfaction using anonymous surveys largely scored on a 5-point Likert scale. Of 39 total participants (33 applicants and 6 faculty) 95% completed the surveys. The web based interview was less costly to applicants (mean $171 vs $364, p=0.05) and required less time away from school (10% missing 1 or more days vs 30%, p=0.04) compared to traditional on-site interview. However, applicants perceived the web based interview process as less effective than traditional on-site interview, with a mean 6-item summative effectiveness score of 21.3 vs 25.6 (p=0.003). Applicants and faculty favored continuing the web based interview process in the future as an adjunct to on-site interviews. Residency interviews can be successfully conducted via the Internet. The web based interview process reduced costs and improved convenience. The findings of this study support the use of videoconferencing as an adjunct to traditional interview methods rather than as a replacement. Copyright © 2012 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
WebNet 96 Conference Proceedings (San Francisco, California, October 15-19, 1996).
ERIC Educational Resources Information Center
Maurer, Hermann, Ed.
This proceedings contains 80 full papers, 12 posters/demonstrations, 108 short papers, one panel, and one tutorial, all focusing on World Wide Web applications. Topics include: designing hypertext navigation tools; Web site design; distance education via the Web; instructional design; the world-wide market and censorship on the Web; customer…
Chesapeake Bay Low Freshwater Inflow Study. Biota Assessment. Phase I. Appendices.
1980-08-01
Most are restricted to more saline environments by competition, and not by effects of reduced salinity. Many of these species are found from lo… Noted points include detrital food webs and nutrient cycles in higher-salinity areas, importance to erosion control, and potential vulnerability to low-flow effects. After setting, salinity per se has little effect on Rangia. Other sensitivities: in Chesapeake Bay, R. cuneata is near the northern limit of its range.
FragFit: a web-application for interactive modeling of protein segments into cryo-EM density maps.
Tiemann, Johanna K S; Rose, Alexander S; Ismer, Jochen; Darvish, Mitra D; Hilal, Tarek; Spahn, Christian M T; Hildebrand, Peter W
2018-05-21
Cryo-electron microscopy (cryo-EM) is a standard method to determine the three-dimensional structures of molecular complexes. However, easy to use tools for modeling of protein segments into cryo-EM maps are sparse. Here, we present the FragFit web-application, a web server for interactive modeling of segments of up to 35 amino acids length into cryo-EM density maps. The fragments are provided by a regularly updated database containing at the moment about 1 billion entries extracted from PDB structures and can be readily integrated into a protein structure. Fragments are selected based on geometric criteria, sequence similarity and fit into a given cryo-EM density map. Web-based molecular visualization with the NGL Viewer allows interactive selection of fragments. The FragFit web-application, accessible at http://proteinformatics.de/FragFit, is free and open to all users, without any login requirements.
Combining demographic and genetic factors to assess population vulnerability in stream species
Landguth, Erin L.; Muhlfeld, Clint C.; Jones, Leslie W.; Waples, Robin S.; Whited, Diane; Lowe, Winsor H.; Lucotch, John; Neville, Helen; Luikart, Gordon
2014-01-01
Accelerating climate change and other cumulative stressors create an urgent need to understand the influence of environmental variation and landscape features on the connectivity and vulnerability of freshwater species. Here, we introduce a novel modeling framework for aquatic systems that integrates spatially explicit, individual-based, demographic and genetic (demogenetic) assessments with environmental variables. To show its potential utility, we simulated a hypothetical network of 19 migratory riverine populations (e.g., salmonids) using a riverscape connectivity and demogenetic model (CDFISH). We assessed how stream resistance to movement (a function of water temperature, fluvial distance, and physical barriers) might influence demogenetic connectivity, and hence, population vulnerability. We present demographic metrics (abundance, immigration, and change in abundance) and genetic metrics (diversity, differentiation, and change in differentiation), and combine them into a single vulnerability index for identifying populations at risk of extirpation. We considered four realistic scenarios that illustrate the relative sensitivity of these metrics for early detection of reduced connectivity: (1) maximum resistance due to high water temperatures throughout the network, (2) minimum resistance due to low water temperatures throughout the network, (3) increased resistance at a tributary junction caused by a partial barrier, and (4) complete isolation of a tributary, leaving resident individuals only. We then applied this demogenetic framework using empirical data for a bull trout (Salvelinus confluentus) metapopulation in the upper Flathead River system, Canada and USA, to assess how current and predicted future stream warming may influence population vulnerability. Results suggest that warmer water temperatures and associated barriers to movement (e.g., low flows, dewatering) are predicted to fragment suitable habitat for migratory salmonids, resulting in the loss of genetic diversity and reduced numbers in certain vulnerable populations. This demogenetic simulation framework, which is illustrated in a web-based interactive mapping prototype, should be useful for evaluating population vulnerability in a wide variety of dendritic and fragmented riverscapes, helping to guide conservation and management efforts for freshwater species.
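As a schematic of how demographic and genetic metrics might be folded into one vulnerability index (the metrics, values, and equal weights below are illustrative and are not the CDFISH formulation):

```python
# Schematic index: rescale each metric to [0, 1] across populations and
# average them, oriented so that higher values mean more vulnerable.
def rescale(values):
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

populations    = ["trib_A", "trib_B", "mainstem"]
abundance      = [120, 45, 900]      # lower abundance         -> more vulnerable
immigration    = [3, 0, 25]          # lower immigration       -> more vulnerable
heterozygosity = [0.62, 0.48, 0.71]  # lower genetic diversity -> more vulnerable

index = [sum(1.0 - s for s in scores) / 3.0
         for scores in zip(rescale(abundance),
                           rescale(immigration),
                           rescale(heterozygosity))]

for name, v in sorted(zip(populations, index), key=lambda pair: -pair[1]):
    print(f"{name}: vulnerability index {v:.2f}")
```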
Online Social Media Applications for Constructivism and Observational Learning
ERIC Educational Resources Information Center
Mbati, Lydia
2013-01-01
Web 2.0 technologies have a range of possibilities for fostering constructivist learning and observational learning. This is due to the available applications which allow for synchronous and asynchronous interaction and the sharing of knowledge between users. Web 2.0 tools include online social media applications which have potential pedagogical…
A web access script language to support clinical application development.
O'Kane, K C; McColligan, E E
1998-02-01
This paper describes the development of a script language to support the implementation of decentralized, clinical information applications on the World Wide Web (Web). The goal of this work is to facilitate construction of low overhead, fully functional clinical information systems that can be accessed anywhere by low cost Web browsers to search, retrieve and analyze stored patient data. The Web provides a model of network access to data bases on a global scale. Although it was originally conceived as a means to exchange scientific documents, Web browsers and servers currently support access to a wide variety of audio, video, graphical and text based data to a rapidly growing community. Access to these services is via inexpensive client software browsers that connect to servers by means of the open architecture of the Internet. In this paper, the design and implementation of a script language that supports the development of low cost, Web-based, distributed clinical information systems for both Inter- and Intra-Net use is presented. The language is based on the Mumps language and, consequently, supports many legacy applications with few modifications. Several enhancements, however, have been made to support modern programming practices and the Web interface. The interpreter for the language also supports standalone program execution on Unix, MS-Windows, OS/2 and other operating systems.
Toward Exposing Timing-Based Probing Attacks in Web Applications
Mao, Jian; Chen, Yue; Shi, Futian; Jia, Yaoqi; Liang, Zhenkai
2017-01-01
Web applications have become the foundation of many types of systems, ranging from cloud services to Internet of Things (IoT) systems. Due to the large amount of sensitive data processed by web applications, user privacy emerges as a major concern in web security. Existing protection mechanisms in modern browsers, e.g., the same origin policy, prevent the users’ browsing information on one website from being directly accessed by another website. However, web applications executed in the same browser share the same runtime environment. Such shared states provide side channels for malicious websites to indirectly figure out the information of other origins. Timing is a classic side channel and the root cause of many recent attacks, which rely on the variations in the time taken by the systems to process different inputs. In this paper, we propose an approach to expose the timing-based probing attacks in web applications. It monitors the browser behaviors and identifies anomalous timing behaviors to detect browser probing attacks. We have prototyped our system in the Google Chrome browser and evaluated the effectiveness of our approach by using known probing techniques. We have applied our approach on a large number of top Alexa sites and reported the suspicious behavior patterns with corresponding analysis results. Our theoretical analysis illustrates that the effectiveness of the timing-based probing attacks is dramatically limited by our approach. PMID:28245610
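A toy illustration of the monitoring idea: record browser events per origin and flag origins whose pattern of fine-grained timer reads around cross-origin loads looks like probing. The event log format and threshold are invented; the actual system instruments the Chrome browser internally.

```python
from collections import Counter

# Invented per-origin event log that a monitoring layer might record; the real
# system instruments browser behaviour inside Google Chrome.
events = [
    ("https://benign.example", "timer_read"),
    ("https://evil.example", "timer_read"), ("https://evil.example", "cross_origin_fetch"),
    ("https://evil.example", "timer_read"), ("https://evil.example", "cross_origin_fetch"),
    ("https://evil.example", "timer_read"), ("https://evil.example", "cross_origin_fetch"),
]

THRESHOLD = 3   # arbitrary: timer-read/fetch pairs before an origin is flagged

pairs = Counter()
last_event = {}
for origin, event in events:
    # A cross-origin load immediately preceded by a timer read looks like a probe.
    if event == "cross_origin_fetch" and last_event.get(origin) == "timer_read":
        pairs[origin] += 1
    last_event[origin] = event

for origin, count in pairs.items():
    if count >= THRESHOLD:
        print(f"suspicious timing-probe pattern from {origin} ({count} probes)")
```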
ERIC Educational Resources Information Center
Wang, Kening; Mulvenon, Sean W.; Stegman, Charles; Anderson, Travis
2008-01-01
Google Maps API (Application Programming Interface), released in late June 2005 by Google, is an amazing technology that allows users to embed Google Maps in their own Web pages with JavaScript. Google Maps API has accelerated the development of new Google Maps based applications. This article reports a Web-based interactive mapping system…
Reference Architecture for MNE 5 Technical System
2007-05-30
Core services comprise a core set of applications including directories, a web portal and collaboration applications; message handling (XML, JMS, content-level classifications); metadata filtering and control of who can initiate services; web browsing; collaboration and messaging; and audit logging covering persons and machines, data-level objects, web services and messages.
ERIC Educational Resources Information Center
Garcia-Barriocanal, Elena; Sicilia, Miguel-Angel; Sanchez-Alonso, Salvador; Lytras, Miltiadis
2011-01-01
Web 2.0 technologies can be considered a loosely defined set of Web application styles that foster a kind of media consumer more engaged, and usually active in creating and maintaining Internet contents. Thus, Web 2.0 applications have resulted in increased user participation and massive user-generated (or user-published) open multimedia content,…
Information Assurance: Detection & Response to Web Spam Attacks
2010-08-28
As online social media applications such as blogs, social bookmarking (folksonomies), and wikis continue to gain popularity, concerns about the rapid proliferation of Web spam have grown in recent years. These applications…
A Quantitative Study of Factors Related to Adult E-Learner's Adoption of Web 2.0 Technology
ERIC Educational Resources Information Center
Bledsoe, Johnny Mark
2012-01-01
The content created by digital natives via collaborative Web 2.0 applications provides a rich source of unique knowledge and social capital for their virtual communities of interest. The problem addressed in this study was the limited understanding of older digital immigrants who use Web 2.0 applications to access, distribute, or enhance these…
ERIC Educational Resources Information Center
Chou, Pao-Nan; Chang, Chi-Cheng
2011-01-01
This study examines the effects of reflection category and reflection quality on learning outcomes during Web-based portfolio assessment process. Experimental subjects consist of forty-five eight-grade students in a "Computer Application" course. Through the Web-based portfolio assessment system, these students write reflection, and join…
MAGMA: analysis of two-channel microarrays made easy.
Rehrauer, Hubert; Zoller, Stefan; Schlapbach, Ralph
2007-07-01
The web application MAGMA provides a simple and intuitive interface to identify differentially expressed genes from two-channel microarray data. While the underlying algorithms are not superior to those of similar web applications, MAGMA is particularly user friendly and can be used without prior training. The user interface guides the novice user through the most typical microarray analysis workflow consisting of data upload, annotation, normalization and statistical analysis. It automatically generates R-scripts that document MAGMA's entire data processing steps, thereby allowing the user to regenerate all results in his local R installation. The implementation of MAGMA follows the model-view-controller design pattern that strictly separates the R-based statistical data processing, the web-representation and the application logic. This modular design makes the application flexible and easily extendible by experts in one of the fields: statistical microarray analysis, web design or software development. State-of-the-art Java Server Faces technology was used to generate the web interface and to perform user input processing. MAGMA's object-oriented modular framework makes it easily extendible and applicable to other fields and demonstrates that modern Java technology is also suitable for rather small and concise academic projects. MAGMA is freely available at www.magma-fgcz.uzh.ch.
Robopedia: Leveraging Sensorpedia for Web-Enabled Robot Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Resseguie, David R
There is a growing interest in building Internet-scale sensor networks that integrate sensors from around the world into a single unified system. In contrast, robotics application development has primarily focused on building specialized systems. These specialized systems take scalability and reliability into consideration, but generally neglect exploring the key components required to build a large scale system. Integrating robotic applications with Internet-scale sensor networks will unify specialized robotics applications and provide answers to large scale implementation concerns. We focus on utilizing Internet-scale sensor network technology to construct a framework for unifying robotic systems. Our framework web-enables a surveillance robot's sensor observations and provides a web interface to the robot's actuators. This lets robots seamlessly integrate into web applications. In addition, the framework eliminates most prerequisite robotics knowledge, allowing for the creation of general web-based robotics applications. The framework also provides mechanisms to create applications that can interface with any robot. Frameworks such as this one are key to solving large scale mobile robotics implementation problems. We provide an overview of previous Internet-scale sensor networks, Sensorpedia (an ad-hoc Internet-scale sensor network), our framework for integrating robots with Sensorpedia, two applications which illustrate our framework's ability to support general web-based robotic control, and experimental results that illustrate our framework's scalability, feasibility, and resource requirements.
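A minimal sketch of what web-enabling a robot's observations and actuators can look like, written with Flask; the routes, payloads, and in-memory robot state are hypothetical stand-ins for the Sensorpedia integration described above.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical in-memory stand-ins for the robot's sensors and actuators.
latest_observation = {"sensor": "pan_tilt_camera", "bearing_deg": 90}
command_log = []

@app.get("/observations")
def observations():
    """Expose the robot's latest sensor observation as a web resource."""
    return jsonify(latest_observation)

@app.post("/actuators/drive")
def drive():
    """Accept a drive command posted by a web application."""
    cmd = request.get_json(force=True)     # e.g. {"left": 0.5, "right": 0.5}
    command_log.append(cmd)
    return jsonify({"accepted": cmd})

if __name__ == "__main__":
    app.run(port=8080)
```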
ChemCalc: a building block for tomorrow's chemical infrastructure.
Patiny, Luc; Borel, Alain
2013-05-24
Web services, as an aspect of cloud computing, are becoming an important part of the general IT infrastructure, and scientific computing is no exception to this trend. We propose a simple approach to develop chemical Web services, through which servers could expose the essential data manipulation functionality that students and researchers need for chemical calculations. These services return their results as JSON (JavaScript Object Notation) objects, which facilitates their use for Web applications. The ChemCalc project http://www.chemcalc.org demonstrates this approach: we present three Web services related with mass spectrometry, namely isotopic distribution simulation, peptide fragmentation simulation, and molecular formula determination. We also developed a complete Web application based on these three Web services, taking advantage of modern HTML5 and JavaScript libraries (ChemDoodle and jQuery).
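As an illustration of the JSON-returning service pattern described above, a minimal Python sketch of a client call follows. The endpoint path ("/chemcalc/mf") and the parameter name ("mf") are assumptions for illustration only; consult the ChemCalc documentation for the actual interface.

```python
# Minimal sketch of consuming a JSON-returning chemical web service such as
# those described for ChemCalc. The endpoint path and parameter name are
# assumptions; only the base site http://www.chemcalc.org is given in the text.
import requests

BASE_URL = "http://www.chemcalc.org/chemcalc/mf"  # assumed endpoint

def molecular_formula_info(formula: str) -> dict:
    """Query the (assumed) molecular-formula service and return parsed JSON."""
    response = requests.get(BASE_URL, params={"mf": formula}, timeout=30)
    response.raise_for_status()
    return response.json()  # JSON keeps the result easy to reuse in web apps

if __name__ == "__main__":
    info = molecular_formula_info("C8H10N4O2")  # caffeine
    for key, value in sorted(info.items()):
        print(key, ":", value)
```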
Whetzel, Patricia L; Noy, Natalya F; Shah, Nigam H; Alexander, Paul R; Nyulas, Csongor; Tudorache, Tania; Musen, Mark A
2011-07-01
The National Center for Biomedical Ontology (NCBO) is one of the National Centers for Biomedical Computing funded under the NIH Roadmap Initiative. Contributing to the national computing infrastructure, NCBO has developed BioPortal, a web portal that provides access to a library of biomedical ontologies and terminologies (http://bioportal.bioontology.org) via the NCBO Web services. BioPortal enables community participation in the evaluation and evolution of ontology content by providing features to add mappings between terms, to add comments linked to specific ontology terms and to provide ontology reviews. The NCBO Web services (http://www.bioontology.org/wiki/index.php/NCBO_REST_services) enable this functionality and provide a uniform mechanism to access ontologies from a variety of knowledge representation formats, such as Web Ontology Language (OWL) and Open Biological and Biomedical Ontologies (OBO) format. The Web services provide multi-layered access to the ontology content, from getting all terms in an ontology to retrieving metadata about a term. Users can easily incorporate the NCBO Web services into software applications to generate semantically aware applications and to facilitate structured data collection.
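A hedged sketch of how an application might incorporate an NCBO-style REST service follows. The host name, resource path, response layout, and the "apikey" parameter are assumptions for illustration; the NCBO wiki page cited in the abstract documents the real interface.

```python
# Illustrative sketch of programmatic access to an NCBO-style ontology REST
# service. Endpoint, parameters, and response keys are assumptions.
import requests

def search_ontology_terms(query: str, api_key: str) -> list:
    """Search for ontology terms matching `query` and return the JSON hits."""
    url = "https://data.bioontology.org/search"  # assumed endpoint
    resp = requests.get(url, params={"q": query, "apikey": api_key}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("collection", [])  # assumed response layout

# Example (requires a valid API key):
# for term in search_ontology_terms("melanoma", "MY-API-KEY")[:5]:
#     print(term.get("prefLabel"), term.get("@id"))
```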
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Mathew; Bowen, Brian; Coles, Dwight
The Middleware Automated Deployment Utilities consist of these three components. MAD: a utility designed to automate the deployment of Java applications to multiple Java application servers; the product contains a front-end web utility and back-end deployment scripts. MAR: a web front end to maintain and update the components inside the database. MWR-Encrypt: a web utility to convert a text string to an encrypted string that is used by the Oracle WebLogic application server. The encryption is done using the built-in functions of the Oracle WebLogic product and is mainly used to create an encrypted version of a database password.
Stocker, Gernot; Rieder, Dietmar; Trajanoski, Zlatko
2004-03-22
ClusterControl is a web interface to simplify distributing and monitoring bioinformatics applications on Linux cluster systems. We have developed a modular concept that enables integration of command-line-oriented programs into the application framework of ClusterControl. The system facilitates integration of different applications accessed through one interface and executed on a distributed cluster system. The package is based on freely available technologies like Apache as web server, PHP as server-side scripting language and OpenPBS as queuing system, and is available free of charge for academic and non-profit institutions. http://genome.tugraz.at/Software/ClusterControl
Regional Geology Web Map Application Development: Javascript v2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Russell, Glenn
This is a milestone report for the FY2017 continuation of the Spent Fuel, Storage, and Waste Technology (SFSWT) program (formerly the Used Fuel Disposal (UFD) program) development of the Regional Geology Web Mapping Application by the Idaho National Laboratory Geospatial Science and Engineering group. This application was developed for general public use and is an interactive web-based application built in Javascript to visualize, reference, and analyze US geological features pertinent to the SFSWT program. This tool is a version upgrade from Adobe FLEX technology. It is designed to facilitate informed decision making regarding the geology of the continental US relevant to the SFSWT program.
NASA Astrophysics Data System (ADS)
Cheng, D. L. C.; Quinn, J. D.; Larour, E. Y.; Halkides, D. J.
2017-12-01
The Virtual Earth System Laboratory (VESL) is a Web application, under continued development at the Jet Propulsion Laboratory and UC Irvine, for the visualization of Earth System data and process simulations. As with any project of its size, we have encountered both successes and challenges during the course of development. Our principal point of success is the fact that VESL users can interact seamlessly with our earth science simulations within their own Web browser. Some of the challenges we have faced include retrofitting the VESL Web application to respond to touch gestures, reducing page load time (especially as the application has grown), and accounting for the differences between the various Web browsers and computing platforms.
Web 2.0 and Marketing Education: Explanations and Experiential Applications
ERIC Educational Resources Information Center
Granitz, Neil; Koernig, Stephen K.
2011-01-01
Although both experiential learning and Web 2.0 tools focus on creativity, sharing, and collaboration, sparse research has been published integrating a Web 2.0 paradigm with experiential learning in marketing. In this article, Web 2.0 concepts are explained. Web 2.0 is then positioned as a philosophy that can advance experiential learning through…
Web 2.0 and Critical Information Literacy
ERIC Educational Resources Information Center
Dunaway, Michelle
2011-01-01
The impact of Web 2.0 upon culture, education, and knowledge is obfuscated by the pervasiveness of Web 2.0 applications and technologies. Web 2.0 is commonly conceptualized in terms of the tools that it makes possible, such as Facebook, Twitter, and Wikipedia. In the context of information literacy instruction, Web 2.0 is frequently conceptualized…
A Semantic Sensor Web for Environmental Decision Support Applications
Gray, Alasdair J. G.; Sadler, Jason; Kit, Oles; Kyzirakos, Kostis; Karpathiotakis, Manos; Calbimonte, Jean-Paul; Page, Kevin; García-Castro, Raúl; Frazer, Alex; Galpin, Ixent; Fernandes, Alvaro A. A.; Paton, Norman W.; Corcho, Oscar; Koubarakis, Manolis; De Roure, David; Martinez, Kirk; Gómez-Pérez, Asunción
2011-01-01
Sensing devices are increasingly being deployed to monitor the physical world around us. One class of application for which sensor data is pertinent is environmental decision support systems, e.g., flood emergency response. For these applications, the sensor readings need to be put in context by integrating them with other sources of data about the surrounding environment. Traditional systems for predicting and detecting floods rely on methods that need significant human resources. In this paper we describe a semantic sensor web architecture for integrating multiple heterogeneous datasets, including live and historic sensor data, databases, and map layers. The architecture provides mechanisms for discovering datasets, defining integrated views over them, continuously receiving data in real-time, and visualising on screen and interacting with the data. Our approach makes extensive use of web service standards for querying and accessing data, and semantic technologies to discover and integrate datasets. We demonstrate the use of our semantic sensor web architecture in the context of a flood response planning web application that uses data from sensor networks monitoring the sea-state around the coast of England. PMID:22164110
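Since the architecture above relies on semantic technologies to discover and integrate datasets, a minimal sketch of posing a SPARQL query for sensor observations follows. The endpoint URL is hypothetical and the SSN-style prefixes and properties are assumptions about the modelling, not the project's actual vocabulary.

```python
# Minimal sketch of querying sensor observations from a SPARQL endpoint, in the
# spirit of the semantic sensor web architecture described above. The endpoint
# URL and the ontology prefixes/properties are assumptions for illustration.
import requests

ENDPOINT = "http://example.org/sparql"  # hypothetical endpoint

QUERY = """
PREFIX ssn: <http://purl.oclc.org/NET/ssnx/ssn#>
SELECT ?sensor ?observation
WHERE {
  ?observation a ssn:Observation ;
               ssn:observedBy ?sensor .
}
LIMIT 10
"""

def run_query(endpoint: str, query: str) -> dict:
    resp = requests.get(
        endpoint,
        params={"query": query},
        headers={"Accept": "application/sparql-results+json"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

# results = run_query(ENDPOINT, QUERY)
# for row in results["results"]["bindings"]:
#     print(row["sensor"]["value"], row["observation"]["value"])
```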
Alava, Juan José; Ross, Peter S; Lachmuth, Cara; Ford, John K B; Hickie, Brendan E; Gobas, Frank A P C
2012-11-20
The development of an area-based polychlorinated biphenyl (PCB) food-web bioaccumulation model enabled a critical evaluation of the efficacy of sediment quality criteria and prey tissue residue guidelines in protecting fish-eating resident killer whales of British Columbia and adjacent waters. Model-predicted and observed PCB concentrations in resident killer whales and Chinook salmon were in good agreement, supporting the model's application for risk assessment and criteria development. Model application shows that PCB concentrations in the sediments from the resident killer whale's Critical Habitats and entire foraging range lead to PCB concentrations in most killer whales that exceed PCB toxicity threshold concentrations reported for marine mammals. Results further indicate that current PCB sediment quality and prey tissue residue criteria for fish-eating wildlife are not protective of killer whales and are not appropriate for assessing risks of PCB-contaminated sediments to high trophic level biota. We present a novel methodology for deriving sediment quality criteria and tissue residue guidelines that protect biota of high trophic levels under various PCB management scenarios. PCB concentrations in sediments and in prey that are deemed protective of resident killer whale health are much lower than current criteria values, underscoring the extreme vulnerability of high trophic level marine mammals to persistent and bioaccumulative contaminants.
NASA Astrophysics Data System (ADS)
Ganguly, S.; Kumar, U.; Nemani, R. R.; Kalia, S.; Michaelis, A.
2016-12-01
In this work, we use a Fully Constrained Least Squares Subpixel Learning Algorithm to unmix global WELD (Web Enabled Landsat Data) to obtain fractions or abundances of substrate (S), vegetation (V) and dark object (D) classes. Because of the sheer scale of the data and compute needs, we leveraged the NASA Earth Exchange (NEX) high performance computing architecture to optimize and scale our algorithm for large-scale processing. Subsequently, the S-V-D abundance maps were characterized into 4 classes, namely forest, farmland, water and urban areas (with NPP-VIIRS, National Polar-orbiting Partnership Visible Infrared Imaging Radiometer Suite, nighttime lights data) over California, USA using a Random Forest classifier. Validation of these land cover maps with NLCD (National Land Cover Database) 2011 products and NAFD (North American Forest Dynamics) static forest cover maps showed that an overall classification accuracy of over 91% was achieved, which is a 6% improvement in unmixing-based classification relative to per-pixel based classification. As such, abundance maps continue to offer a useful alternative to classification maps derived from high-spatial-resolution data for forest inventory analysis, multi-class mapping for eco-climatic models and applications, fast multi-temporal trend analysis and for societal and policy-relevant applications needed at the watershed scale.
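The following minimal sketch shows fully constrained least squares unmixing for a single pixel, solving min ||E a - x|| subject to a >= 0 and sum(a) = 1 via the common trick of appending a heavily weighted sum-to-one row before a non-negative least squares solve. The endmember numbers are toy values, not WELD substrate/vegetation/dark spectra, and this is not the authors' optimized NEX implementation.

```python
# Sketch of fully constrained least squares (FCLS) spectral unmixing for one
# pixel. The sum-to-one constraint is enforced approximately by a heavily
# weighted row of ones; abundances are renormalized afterwards.
import numpy as np
from scipy.optimize import nnls

def fcls_unmix(endmembers: np.ndarray, pixel: np.ndarray, weight: float = 1e5):
    """endmembers: (bands, n_endmembers); pixel: (bands,). Returns abundances."""
    bands, n_end = endmembers.shape
    E = np.vstack([endmembers, weight * np.ones((1, n_end))])
    x = np.concatenate([pixel, [weight]])
    abundances, _residual = nnls(E, x)   # non-negative least squares
    total = abundances.sum()
    return abundances / total if total > 0 else abundances

if __name__ == "__main__":
    # Toy 4-band endmember matrix: substrate (S), vegetation (V), dark (D).
    E = np.array([[0.30, 0.05, 0.02],
                  [0.35, 0.08, 0.02],
                  [0.40, 0.45, 0.03],
                  [0.45, 0.30, 0.04]])
    pixel = 0.6 * E[:, 0] + 0.3 * E[:, 1] + 0.1 * E[:, 2]
    print(np.round(fcls_unmix(E, pixel), 3))  # approximately [0.6, 0.3, 0.1]
```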
5 CFR 1655.10 - Loan application process.
Code of Federal Regulations, 2010 CFR
2010-01-01
... request on the TSP Web site: (1) FERS participants or members of the uniformed services requesting a... described in paragraph (b) of this section may use the TSP Web site to submit a loan application and obtain...
Wiklund Axelsson, S; Nyberg, L; Näslund, A; Melander Wikman, A
2013-01-01
This study investigates the anticipated psychosocial impact of present web-based e-health services and future mobile health applications among older Swedes. Random samples of Swedish citizens aged 55 years and older were given a survey containing two different e-health scenarios, which respondents rated according to their anticipated psychosocial impact by means of the PIADS instrument. Results consistently demonstrated the positive anticipation of psychosocial impacts for both scenarios. The future mobile health applications scored more positively than the present web-based e-health services. An increase in age correlated with lower impact scores. These findings indicate that, from a psychosocial perspective, web-based e-health services and mobile health applications are likely to positively impact quality of life. This knowledge can be helpful when tailoring and implementing e-health services that are directed to older people.
WebViz:A Web-based Collaborative Interactive Visualization System for large-Scale Data Sets
NASA Astrophysics Data System (ADS)
Yuen, D. A.; McArthur, E.; Weiss, R. M.; Zhou, J.; Yao, B.
2010-12-01
WebViz is a web-based application designed to conduct collaborative, interactive visualizations of large data sets for multiple users, allowing researchers situated all over the world to utilize the visualization services offered by the University of Minnesota’s Laboratory for Computational Sciences and Engineering (LCSE). This ongoing project has been built upon over the last 3 1/2 years. The motivation behind WebViz lies primarily with the need to parse through an increasing amount of data produced by the scientific community as a result of larger and faster multicore and massively parallel computers coming to the market, including the use of general purpose GPU computing. WebViz allows these large data sets to be visualized online by anyone with an account. The application allows users to save time and resources by visualizing data ‘on the fly’, wherever he or she may be located. By leveraging AJAX via the Google Web Toolkit (http://code.google.com/webtoolkit/), we are able to provide users with a remote web portal to LCSE's (http://www.lcse.umn.edu) large-scale interactive visualization system already in place at the University of Minnesota. LCSE’s custom hierarchical volume rendering software provides high resolution visualizations on the order of 15 million pixels and has been employed for visualizing data primarily from simulations in astrophysics to geophysical fluid dynamics. In the current version of WebViz, we have implemented a highly extensible back-end framework built around HTTP "server push" technology. The web application is accessible via a variety of devices including netbooks, iPhones, and other web- and JavaScript-enabled cell phones. Features in the current version include the ability for users to (1) securely log in, (2) launch multiple visualizations, (3) conduct collaborative visualization sessions, (4) delegate control aspects of a visualization to others, and (5) engage in collaborative chats with other users within the user interface of the web application. These features are all in addition to a full range of essential visualization functions including 3-D camera and object orientation, position manipulation, time-stepping control, and custom color/alpha mapping.
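A small sketch of the HTTP "server push" pattern mentioned above follows, implemented here with Server-Sent Events and only the Python standard library. It illustrates the general technique only; it is not the GWT/Java-based back end described for WebViz, and the endpoint and event payloads are invented.

```python
# Illustrative sketch of HTTP "server push" using Server-Sent Events (SSE).
# A browser can consume it with: new EventSource("http://localhost:8000/events")
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

class PushHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/events":
            self.send_error(404)
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/event-stream")
        self.send_header("Cache-Control", "no-cache")
        self.end_headers()
        # Push a new "frame ready" event to the client once per second.
        for frame in range(5):
            payload = f"data: frame {frame} rendered\n\n"
            self.wfile.write(payload.encode("utf-8"))
            self.wfile.flush()
            time.sleep(1)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), PushHandler).serve_forever()
```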
New Generation Sensor Web Enablement
Bröring, Arne; Echterhoff, Johannes; Jirka, Simon; Simonis, Ingo; Everding, Thomas; Stasch, Christoph; Liang, Steve; Lemmens, Rob
2011-01-01
Many sensor networks have been deployed to monitor Earth’s environment, and more will follow in the future. Environmental sensors have improved continuously by becoming smaller, cheaper, and more intelligent. Due to the large number of sensor manufacturers and differing accompanying protocols, integrating diverse sensors into observation systems is not straightforward. A coherent infrastructure is needed to treat sensors in an interoperable, platform-independent and uniform way. The concept of the Sensor Web reflects such a kind of infrastructure for sharing, finding, and accessing sensors and their data across different applications. It hides the heterogeneous sensor hardware and communication protocols from the applications built on top of it. The Sensor Web Enablement initiative of the Open Geospatial Consortium standardizes web service interfaces and data encodings which can be used as building blocks for a Sensor Web. This article illustrates and analyzes the recent developments of the new generation of the Sensor Web Enablement specification framework. Further, we relate the Sensor Web to other emerging concepts such as the Web of Things and point out challenges and resulting future work topics for research on Sensor Web Enablement. PMID:22163760
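One of the standardized interfaces in the Sensor Web Enablement framework is the Sensor Observation Service (SOS). The sketch below shows a key-value-pair GetObservation request in Python; the endpoint URL and the offering/observedProperty identifiers are placeholders, so a real service's capabilities document would need to be consulted first.

```python
# Sketch of a KVP GetObservation request against an OGC Sensor Observation
# Service (SOS 2.0). Endpoint and identifiers are placeholders.
import requests

SOS_URL = "http://example.org/sos"  # hypothetical SOS endpoint

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "WATER_LEVEL_OFFERING",                      # placeholder
    "observedProperty": "urn:ogc:def:property:water_level",  # placeholder
    "responseFormat": "http://www.opengis.net/om/2.0",
}

response = requests.get(SOS_URL, params=params, timeout=30)
response.raise_for_status()
print(response.text[:500])  # O&M-encoded observations (XML)
```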
Multiple-Feature Extracting Modules Based Leak Mining System Design
Cho, Ying-Chiang; Pan, Jen-Yi
2013-01-01
Over the years, human dependence on the Internet has increased dramatically. A large amount of information is placed on the Internet and retrieved from it daily, which makes web security in terms of online information a major concern. In recent years, the most problematic issues in web security have been e-mail address leakage and SQL injection attacks. There are many possible causes of information leakage, such as inadequate precautions during the programming process, which lead to the leakage of e-mail addresses entered online or insufficient protection of database information, a loophole that enables malicious users to steal online content. In this paper, we implement a crawler mining system that is equipped with SQL injection vulnerability detection, by means of an algorithm developed for the web crawler. In addition, we analyze portal sites of the governments of various countries or regions in order to investigate the information leaking status of each site. Subsequently, we analyze the database structure and content of each site, using the data collected. Thus, we make use of practical verification in order to focus on information security and privacy through black-box testing. PMID:24453892
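To make the black-box testing idea concrete, a simplified sketch of the kind of check a crawler-based leak-mining system might perform is shown below: append a quote-based payload to a URL parameter and look for database error signatures in the response. This is a crude heuristic for illustration only, the target URL is hypothetical, and such probes should only ever be run against systems you are authorized to test.

```python
# Simplified black-box SQL-injection heuristic of the kind a crawler-based
# leak-mining system might apply to discovered URLs and parameters.
import requests

ERROR_SIGNATURES = [
    "you have an error in your sql syntax",
    "unclosed quotation mark",
    "sqlstate",
    "ora-01756",
]

def looks_injectable(url: str, param: str) -> bool:
    probe = requests.get(url, params={param: "1'"}, timeout=15)
    body = probe.text.lower()
    return any(signature in body for signature in ERROR_SIGNATURES)

if __name__ == "__main__":
    target = "http://testsite.example/item"   # hypothetical crawl result
    print("possible SQL injection:", looks_injectable(target, "id"))
```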
Towards a Web-Enabled Geovisualization and Analytics Platform for the Energy and Water Nexus
NASA Astrophysics Data System (ADS)
Sanyal, J.; Chandola, V.; Sorokine, A.; Allen, M.; Berres, A.; Pang, H.; Karthik, R.; Nugent, P.; McManamay, R.; Stewart, R.; Bhaduri, B. L.
2017-12-01
Interactive data analytics are playing an increasingly vital role in the generation of new, critical insights regarding the complex dynamics of the energy/water nexus (EWN) and its interactions with climate variability and change. Integration of impacts, adaptation, and vulnerability (IAV) science with emerging, and increasingly critical, data science capabilities offers a promising potential to meet the needs of the EWN community. To enable the exploration of pertinent research questions, a web-based geospatial visualization platform is being built that integrates a data analysis toolbox with advanced data fusion and data visualization capabilities to create a knowledge discovery framework for the EWN. The system, when fully built out, will offer several geospatial visualization capabilities, including statistical visual analytics, clustering, principal-component analysis, and dynamic time warping; it will also support uncertainty visualization, the exploration of data provenance, and machine-learning-driven discovery, rendering diverse types of geospatial data and facilitating interactive analysis. Key components in the system architecture include NASA's WebWorldWind, the Globus toolkit, and PostgreSQL, as well as other custom-built software modules.
Network Science Research Laboratory (NSRL) Telemetry Warehouse
2016-06-01
Functionality and architecture of the NSRL Telemetry Warehouse are described, including the web interface, data structure, and security aspects. Components covered include the experiment controller, telemetry sensors, and custom data processing nodes, as well as measurement handling. Researchers can export telemetry in comma-separated value (CSV) format from the web interface or via custom applications developed using the client application.
NMRPro: an integrated web component for interactive processing and visualization of NMR spectra.
Mohamed, Ahmed; Nguyen, Canh Hao; Mamitsuka, Hiroshi
2016-07-01
The popularity of using NMR spectroscopy in metabolomics and natural products has driven the development of an array of NMR spectral analysis tools and databases. In particular, web applications have become widely used because they are platform-independent and easy to extend through reusable web components. Currently available web applications provide the analysis of NMR spectra. However, they still lack the necessary processing and interactive visualization functionalities. To overcome these limitations, we present NMRPro, a web component that can be easily incorporated into current web applications, enabling easy-to-use online interactive processing and visualization. NMRPro integrates server-side processing with client-side interactive visualization through three parts: a Python package to efficiently process large NMR datasets on the server side, a Django app managing server-client interaction, and SpecdrawJS for client-side interactive visualization. Demo and installation instructions are available at http://mamitsukalab.org/tools/nmrpro/. Contact: mohamed@kuicr.kyoto-u.ac.jp. Supplementary data are available at Bioinformatics online.
7 CFR 1783.8 - What are the acceptable methods for submitting applications?
Code of Federal Regulations, 2014 CFR
2014-01-01
... Web site (Grants.gov) at http://www.grants.gov. Applicants should refer to instructions found on the Grants.gov Web site for procedures for registering and using this facility. Applicants who have not previously registered on Grants.gov should allow a sufficient number of business days to complete the process...
7 CFR 1783.8 - What are the acceptable methods for submitting applications?
Code of Federal Regulations, 2011 CFR
2011-01-01
... Web site (Grants.gov) at http://www.grants.gov. Applicants should refer to instructions found on the Grants.gov Web site for procedures for registering and using this facility. Applicants who have not previously registered on Grants.gov should allow a sufficient number of business days to complete the process...
7 CFR 1783.8 - What are the acceptable methods for submitting applications?
Code of Federal Regulations, 2010 CFR
2010-01-01
... Web site (Grants.gov) at http://www.grants.gov. Applicants should refer to instructions found on the Grants.gov Web site for procedures for registering and using this facility. Applicants who have not previously registered on Grants.gov should allow a sufficient number of business days to complete the process...
Design Options for Multimodal Web Applications
NASA Astrophysics Data System (ADS)
Stanciulescu, Adrian; Vanderdonckt, Jean
The capabilities of multimodal applications running on the web are well delineated since they are mainly constrained by what their underlying standard markup language offers, as opposed to hand-made multimodal applications. As the experience in developing such multimodal web applications grows, the need arises to identify and define the major design options of such applications to pave the way to a structured development life cycle. This paper provides a design space of independent design options for multimodal web applications based on three types of modalities: graphical, vocal, and tactile, as well as their combinations. On the one hand, these design options may provide designers with some explicit guidance on what to decide or not for their future user interface, while exploring various design alternatives. On the other hand, these design options have been implemented as graph transformations performed on a user interface model represented as a graph. Thanks to a transformation engine, designers can play with the different values of each design option, preview the results of the transformation, and obtain the corresponding code on demand.
Raj, Stacey P; Antonini, Tanya N; Oberjohn, Karen S; Cassedy, Amy; Makoroff, Kathi L; Wade, Shari L
2015-01-01
To examine changes in parent depression, psychological distress, parenting stress, and self-efficacy among participants in a randomized trial of a Web-based parent training program for pediatric traumatic brain injury (TBI). Primary caregivers of 37 children aged 3 to 9 years who sustained a moderate/complicated mild to severe TBI were randomly assigned to the intervention or control group, and both groups were equipped with home Internet access. The online parent training program was designed to increase positive parenting skills and improve caregiver stress management. It consisted of 10 core sessions and up to 4 supplemental sessions. Each session included self-guided Web content, followed by a videoconference call with a therapist to discuss content and practice parenting skills with live feedback. Families in the control group received links to TBI Web resources. Parent income moderated treatment effects on parent functioning. Specifically, lower-income parents in the parenting skills group reported significant reductions in psychological distress compared with lower-income parents in the control group. No differences were found among higher-income parents for depression, parenting stress, or caregiver efficacy. Parent training interventions post-TBI may be particularly valuable for lower-income parents who are vulnerable to both environmental and injury-related stresses.
Web Services Provide Access to SCEC Scientific Research Application Software
NASA Astrophysics Data System (ADS)
Gupta, N.; Gupta, V.; Okaya, D.; Kamb, L.; Maechling, P.
2003-12-01
Web services offer scientific communities a new paradigm for sharing research codes and communicating results. While there are formal technical definitions of what constitutes a web service, for a user community such as the Southern California Earthquake Center (SCEC), we may conceptually consider a web service to be functionality provided on-demand by an application which is run on a remote computer located elsewhere on the Internet. The value of a web service is that it can (1) run a scientific code without the user needing to install and learn the intricacies of running the code; (2) provide the technical framework which allows a user's computer to talk to the remote computer which performs the service; (3) provide the computational resources to run the code; and (4) bundle several analysis steps and provide the end results in digital or (post-processed) graphical form. Within an NSF-sponsored ITR project coordinated by SCEC, we are constructing web services using architectural protocols and programming languages (e.g., Java). However, because the SCEC community has a rich pool of scientific research software (written in traditional languages such as C and FORTRAN), we also emphasize making existing scientific codes available by constructing web service frameworks which wrap around and directly run these codes. In doing so we attempt to broaden community usage of these codes. Web service wrapping of a scientific code can be done using a "web servlet" construction or by using a SOAP/WSDL-based framework. This latter approach is widely adopted in IT circles although it is subject to rapid evolution. Our wrapping framework attempts to "honor" the original codes with as little modification as is possible. For versatility we identify three methods of user access: (A) a web-based GUI (written in HTML and/or Java applets); (B) a Linux/OSX/UNIX command line "initiator" utility (shell-scriptable); and (C) direct access from within any Java application (and with the correct API interface from within C++ and/or C/Fortran). This poster presentation will provide descriptions of the following selected web services and their origin as scientific application codes: 3D community velocity models for Southern California, geocoordinate conversions (latitude/longitude to UTM), execution of GMT graphical scripts, data format conversions (Gocad to Matlab format), and implementation of Seismic Hazard Analysis application programs that calculate hazard curve and hazard map data sets.
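The "wrap an existing command-line scientific code behind an HTTP endpoint" idea described above can be sketched in a few lines of Python using only the standard library. The executable name ("velocity_model") and its argument convention are placeholders, and a real deployment such as the SCEC services would add input validation, authentication, and job queuing.

```python
# Minimal sketch of wrapping a legacy command-line scientific code as a web
# service. GET /?lat=..&lon=.. runs the code and returns its output as JSON.
import json
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

class WrapHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        lat = query.get("lat", ["34.0"])[0]
        lon = query.get("lon", ["-118.0"])[0]
        # Run the legacy code as a subprocess and capture its stdout.
        result = subprocess.run(
            ["velocity_model", lat, lon],  # placeholder executable
            capture_output=True, text=True, timeout=60,
        )
        payload = json.dumps({"stdout": result.stdout,
                              "returncode": result.returncode})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(payload.encode("utf-8"))

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), WrapHandler).serve_forever()
```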
Launch of Village Blue Web Application Shares Water Monitoring Data with Baltimore Community
EPA and the U.S. Geological Survey (USGS) have launched their mobile-friendly web application for Village Blue, a project that provides real-time water quality monitoring data to the Baltimore, Maryland community.
Augmenting Research, Education, and Outreach with Client-Side Web Programming.
Abriata, Luciano A; Rodrigues, João P G L M; Salathé, Marcel; Patiny, Luc
2018-05-01
The evolution of computing and web technologies over the past decade has enabled the development of fully fledged scientific applications that run directly on web browsers. Powered by JavaScript, the lingua franca of web programming, these 'web apps' are starting to revolutionize and democratize scientific research, education, and outreach.
The Knowledge of Web 2.0 by Library and Information Science Academics
ERIC Educational Resources Information Center
Al-Daihani, Sultan
2009-01-01
This research paper reports the results of a Web-based survey designed to explore the attitude of Library and Information Science (LIS) academics to Web 2.0. It investigates their familiarity with Web 2.0 concepts, tools and services and applications as these relate to LIS education, and the barriers to their use. A Web-based questionnaire was…
Enable Web-Based Tracking and Guiding by Integrating Location-Awareness with the World Wide Web
ERIC Educational Resources Information Center
Zhou, Rui
2008-01-01
Purpose: The aim of this research is to enable web-based tracking and guiding by integrating location-awareness with the Worldwide Web so that the users can use various location-based applications without installing extra software. Design/methodology/approach: The concept of web-based tracking and guiding is introduced and the relevant issues are…
An optimized web-based approach for collaborative stereoscopic medical visualization
Kaspar, Mathias; Parsad, Nigel M; Silverstein, Jonathan C
2013-01-01
Objective Medical visualization tools have traditionally been constrained to tethered imaging workstations or proprietary client viewers, typically part of hospital radiology systems. To improve accessibility to real-time, remote, interactive, stereoscopic visualization and to enable collaboration among multiple viewing locations, we developed an open source approach requiring only a standard web browser with no added client-side software. Materials and Methods Our collaborative, web-based, stereoscopic, visualization system, CoWebViz, has been used successfully for the past 2 years at the University of Chicago to teach immersive virtual anatomy classes. It is a server application that streams server-side visualization applications to client front-ends, comprised solely of a standard web browser with no added software. Results We describe optimization considerations, usability, and performance results, which make CoWebViz practical for broad clinical use. We clarify technical advances including: enhanced threaded architecture, optimized visualization distribution algorithms, a wide range of supported stereoscopic presentation technologies, and the salient theoretical and empirical network parameters that affect our web-based visualization approach. Discussion The implementations demonstrate usability and performance benefits of a simple web-based approach for complex clinical visualization scenarios. Using this approach overcomes technical challenges that require third-party web browser plug-ins, resulting in the most lightweight client. Conclusions Compared to special software and hardware deployments, unmodified web browsers enhance remote user accessibility to interactive medical visualization. Whereas local hardware and software deployments may provide better interactivity than remote applications, our implementation demonstrates that a simplified, stable, client approach using standard web browsers is sufficient for high quality three-dimensional, stereoscopic, collaborative and interactive visualization. PMID:23048008
The Web-based Electronic Data Review (WebEDR) application performs automated data evaluation on ERLN electronic data deliverables (EDDs). It uses tests derived from the National Functional Guidelines, combined with method-defined limits, to evaluate data.
ReSTful OSGi Web Applications Tutorial
NASA Technical Reports Server (NTRS)
Shams, Khawaja; Norris, Jeff
2008-01-01
This slide presentation accompanies a tutorial on ReSTful (Representational State Transfer) web applications built with the Open Services Gateway Initiative (OSGi). ReST uses the HTTP protocol to enable developers to offer services to a diverse variety of clients: from shell scripts to sophisticated Java application suites. The tutorial also covers the use of Eclipse for rapid development, the Eclipse debugger, the test application, and the ease of export to production servers.
StreamStats in Georgia: a water-resources web application
Gotvald, Anthony J.; Musser, Jonathan W.
2015-07-31
StreamStats is being implemented on a State-by-State basis to allow for customization of the data development and underlying datasets to address each State's specific needs, issues, and objectives. The USGS, in cooperation with the Georgia Environmental Protection Division and Georgia Department of Transportation, has implemented StreamStats for Georgia. The Georgia StreamStats Web site is available through the national StreamStats Web-page portal at http://streamstats.usgs.gov. Links are provided on this Web page for individual State applications, instructions for using StreamStats, definitions of basin characteristics and streamflow statistics, and other supporting information.
Vulnerabilities in Bytecode Removed by Analysis, Nuanced Confinement and Diversification (VIBRANCE)
2015-06-01
The VIBRANCE tool starts with a vulnerable Java application and automatically hardens it against SQL injection, OS command injection, and file path traversal attacks. The report also describes the system's components, including the Java front end and the Java byte code parser.
ERIC Educational Resources Information Center
Thomas, David A.; Li, Qing
2008-01-01
The World Wide Web is evolving in response to users who demand faster and more efficient access to information, portability, and reusability of digital objects between Web-based and computer-based applications and powerful communication, publication, collaboration, and teaching and learning tools. This article reviews current uses of Web-based…
A Web API and Web Application Development for Dissemination of Air Quality Information
NASA Astrophysics Data System (ADS)
Şahin, K.; Işıkdağ, U.
2017-11-01
Various studies have been carried out since 2005 under the leadership of the Ministry of Environment and Urbanism of Turkey in order to observe the quality of air in Turkey, to develop new policies, and to develop a sustainable air quality management strategy. For this reason, a national air quality monitoring network has been developed that provides air quality indices. Through this network, the quality of the air has been continuously monitored, and an important information system has been constructed in order to take precautions against dangerous situations. The biggest handicap of the network is that its proprietary structure makes instant and time-series data acquisition and processing difficult. Currently, there is no service offered by the current air quality monitoring system for exchanging information with third-party applications. Within the context of this work, a web service has been developed to enable location-based querying of current and past air quality data in Turkey. This web service is built with up-to-date and widely preferred technologies; in other words, an architecture was chosen with which applications can easily integrate. In the second phase of the study, a web-based application was developed to test the web service; this testing application can perform location-based acquisition of air quality data. This makes it possible to easily carry out operations, such as screening and examination of an area in a given time frame, which cannot be done with the national monitoring network.
WebChem Viewer: a tool for the easy dissemination of chemical and structural data sets
2014-01-01
Background Sharing sets of chemical data (e.g., chemical properties, docking scores, etc.) among collaborators with diverse skill sets is a common task in computer-aided drug design and medicinal chemistry. The ability to associate this data with images of the relevant molecular structures greatly facilitates scientific communication. There is a need for a simple, free, open-source program that can automatically export aggregated reports of entire chemical data sets to files viewable on any computer, regardless of the operating system and without requiring the installation of additional software. Results We here present a program called WebChem Viewer that automatically generates these types of highly portable reports. Furthermore, in designing WebChem Viewer we have also created a useful online web application for remotely generating molecular structures from SMILES strings. We encourage the direct use of this online application as well as its incorporation into other software packages. Conclusions With these features, WebChem Viewer enables interdisciplinary collaborations that require the sharing and visualization of small molecule structures and associated sets of heterogeneous chemical data. The program is released under the FreeBSD license and can be downloaded from http://nbcr.ucsd.edu/WebChemViewer. The associated web application (called “Smiley2png 1.0”) can be accessed through freely available web services provided by the National Biomedical Computation Resource at http://nbcr.ucsd.edu. PMID:24886360
Framework for Supporting Web-Based Collaborative Applications
NASA Astrophysics Data System (ADS)
Dai, Wei
The article proposes an intelligent framework for supporting Web-based applications. The framework focuses on innovative use of existing resources and technologies in the form of services and leverages the theoretical foundations of services science and research from services computing. The main focus of the framework is to deliver benefits to users with various roles, such as service requesters, service providers, and business owners, to maximize their productivity when engaging with each other via the Web. The article opens with research motivations and questions, analyses the existing state of research in the field, and describes the approach to implementing the proposed framework. Finally, an e-health application is discussed to evaluate the effectiveness of the framework, where participants such as general practitioners (GPs), patients, and health-care workers collaborate via the Web.
Integrating UIMA annotators in a web-based text processing framework.
Chen, Xiang; Arnold, Corey W
2013-01-01
The Unstructured Information Management Architecture (UIMA) [1] framework is a growing platform for natural language processing (NLP) applications. However, such applications may be difficult for non-technical users to deploy. This project presents a web-based framework that wraps UIMA-based annotator systems into a graphical user interface for researchers and clinicians, and a web service for developers. An annotator that extracts data elements from lung cancer radiology reports is presented to illustrate the use of the system. Annotation results from the web system can be exported to multiple formats for users to utilize in other aspects of their research and workflow. This project demonstrates the benefits of a lay-user interface for complex NLP applications. Efforts such as this can lead to increased interest and support for NLP work in the clinical domain.
Large area sheet task: Advanced dendritic web growth development
NASA Technical Reports Server (NTRS)
Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hopkins, R. H.; Meier, D.; Schruben, J.
1981-01-01
The growth of silicon dendritic web for photovoltaic applications was investigated. The application of a thermal model for calculating buckling stresses as a function of temperature profile in the web is discussed. Lid and shield concepts were evaluated to provide the data base for enhancing growth velocity. An experimental web growth machine which embodies in one unit the mechanical and electronic features developed in previous work was developed. In addition, evaluation of a melt level control system was begun, along with preliminary tests of an elongated crucible design. The economic analysis was also updated to incorporate some minor cost changes. The initial applications of the thermal model to a specific configuration gave results consistent with experimental observation in terms of the initiation of buckling vs. width for a given crystal thickness.
Earth-Base: A Free And Open Source, RESTful Earth Sciences Platform
NASA Astrophysics Data System (ADS)
Kishor, P.; Heim, N. A.; Peters, S. E.; McClennen, M.
2012-12-01
This presentation describes the motivation, concept, and architecture behind Earth-Base, a web-based, RESTful data-management, analysis and visualization platform for earth sciences data. Traditionally, web applications have been built by directly accessing data from a database using a scripting language. While such applications are great at bringing results to a wide audience, they are limited in scope to the imagination and capabilities of the application developer. Earth-Base decouples the data store from the web application by introducing an intermediate "data application" tier. The data application's job is to query the data store using self-documented, RESTful URIs, and send the results back formatted as JavaScript Object Notation (JSON). Decoupling the data store from the application allows virtually limitless flexibility in developing applications, both web-based for human consumption and programmatic for machine consumption. It also allows outside developers to use the data in their own applications, potentially creating applications that the original data creator and app developer may not have even thought of. Standardized specifications for URI-based querying and JSON-formatted results make querying and developing applications easy. URI-based querying also allows utilizing distributed datasets easily. Companion mechanisms for querying data snapshots (aka time travel), usage tracking and license management, and verification of semantic equivalence of data are also described. The latter promotes the "What You Expect Is What You Get" (WYEIWYG) principle, which can aid in data citation and verification.
Construction of road network vulnerability evaluation index based on general travel cost
NASA Astrophysics Data System (ADS)
Leng, Jun-qiang; Zhai, Jing; Li, Qian-wen; Zhao, Lin
2018-03-01
With the development of China's economy and the continuous improvement of her urban road networks, the vulnerability of the urban road network has attracted increasing attention. Based on general travel cost, this work constructs a vulnerability evaluation index for the urban road network, and evaluates the vulnerability of the urban road network from the perspective of users' generalised travel cost. Firstly, the generalised travel cost model is constructed based on vehicle cost, travel time, and traveller comfort. Then, the network efficiency index is selected as the evaluation index of vulnerability: the network efficiency index is composed of the traffic volume and the generalised travel cost, which are obtained from the equilibrium state of the network. In addition, the research analyses the influence of decreases in traffic capacity, road section attribute values, and the location of road sections on vulnerability. Finally, the vulnerability index is used to analyse a local road network in Harbin and verify its applicability.
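The abstract does not give the exact formula, but one plausible form of an efficiency index built from equilibrium traffic volumes and generalised travel costs (in the spirit of the Nagurney-Qiang measure) is sketched below; the formula, the toy data, and the vulnerability ratio are assumptions for illustration only, not the authors' definition.

```python
# Hedged sketch of a network efficiency index computed from equilibrium traffic
# volumes and generalised travel costs, and a vulnerability measure expressed
# as the relative efficiency drop after a disruption. The form is assumed.

def network_efficiency(volumes: dict, costs: dict) -> float:
    """volumes/costs: {(origin, destination): value} at network equilibrium."""
    pairs = [od for od in volumes if costs.get(od, 0) > 0]
    if not pairs:
        return 0.0
    return sum(volumes[od] / costs[od] for od in pairs) / len(pairs)

def vulnerability(eff_intact: float, eff_disrupted: float) -> float:
    """Relative efficiency loss caused by the disruption."""
    return (eff_intact - eff_disrupted) / eff_intact

# Toy example with made-up equilibrium data for two OD pairs:
base = network_efficiency({("A", "B"): 900, ("A", "C"): 600},
                          {("A", "B"): 25.0, ("A", "C"): 40.0})
degraded = network_efficiency({("A", "B"): 700, ("A", "C"): 500},
                              {("A", "B"): 35.0, ("A", "C"): 55.0})
print(round(vulnerability(base, degraded), 3))
```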
Fisher, Jason C; Kuenzler, Keith A; Tomita, Sandra S; Sinha, Prashant; Shah, Paresh; Ginsburg, Howard B
2017-01-01
Documenting surgical complications is limited by multiple barriers and is not fostered in the electronic health record. Tracking complications is essential for quality improvement (QI) and required for board certification. Current registry platforms do not facilitate meaningful complication reporting. We developed a novel web application that improves accuracy and reduces barriers to documenting complications. We deployed a custom web application that allows pediatric surgeons to maintain case logs. The program includes a module for entering complication data in real time. Reminders to enter outcome data occur at key postoperative intervals to optimize recall of events. Between October 1, 2014, and March 31, 2015, frequencies of surgical complications captured by the existing hospital reporting system were compared with data aggregated by our application. 780 cases were captured by the web application, compared with 276 cases registered by the hospital system. We observed an increase in the capture of major complications when compared to the hospital dataset (14 events vs. 4 events). This web application improved real-time reporting of surgical complications, exceeding the accuracy of administrative datasets. Custom informatics solutions may help reduce barriers to self-reporting of adverse events and improve the data that presently inform pediatric surgical QI. Diagnostic study/Retrospective study. Level III - case control study.
Ontology Reuse in Geoscience Semantic Applications
NASA Astrophysics Data System (ADS)
Mayernik, M. S.; Gross, M. B.; Daniels, M. D.; Rowan, L. R.; Stott, D.; Maull, K. E.; Khan, H.; Corson-Rikert, J.
2015-12-01
The tension between local ontology development and wider ontology connections is fundamental to the Semantic Web. It is often unclear, however, what the key decision points should be for new Semantic Web applications in deciding when to reuse existing ontologies and when to develop original ontologies. In addition, with the growth of Semantic Web ontologies and applications, new Semantic Web applications can struggle to efficiently and effectively identify and select ontologies to reuse. This presentation will describe the ontology comparison, selection, and consolidation effort within the EarthCollab project. UCAR, Cornell University, and UNAVCO are collaborating on the EarthCollab project to use Semantic Web technologies to enable the discovery of the research output from a diverse array of projects. The EarthCollab project is using the VIVO Semantic Web software suite to increase discoverability of research information and data related to the following two geoscience-based communities: (1) the Bering Sea Project, an interdisciplinary field program whose data archive is hosted by NCAR's Earth Observing Laboratory (EOL), and (2) diverse research projects informed by geodesy through the UNAVCO geodetic facility and consortium. This presentation will outline EarthCollab use cases and provide an overview of key ontologies being used, including the VIVO-Integrated Semantic Framework (VIVO-ISF), Global Change Information System (GCIS), and Data Catalog (DCAT) ontologies. We will discuss issues related to bringing these ontologies together to provide a robust ontological structure to support the EarthCollab use cases. It is rare that a single pre-existing ontology meets all of a new application's needs. New projects need to stitch ontologies together in ways that fit into the broader Semantic Web ecosystem.
A Novel Web Application to Analyze and Visualize Extreme Heat Events
NASA Astrophysics Data System (ADS)
Li, G.; Jones, H.; Trtanj, J.
2016-12-01
Extreme heat is the leading cause of weather-related deaths in the United States annually and is expected to increase with our warming climate. However, most of these deaths are preventable with proper tools and services to inform the public about heat waves. In this project, we have investigated the key indicators of a heat wave, the vulnerable populations, and the data visualization strategies through which those populations most effectively absorb heat wave data. A map-based web app has been created that allows users to search and visualize historical heat waves in the United States incorporating these strategies. This app utilizes daily maximum temperature data from the NOAA Global Historical Climatology Network, which contains about 2.7 million data points from over 7,000 stations per year. The point data are spatially aggregated into county-level data using county geometry from the US Census Bureau and stored in a Postgres database with PostGIS spatial capability. GeoServer, a powerful map server, is used to serve the image and data layers (WMS and WFS). The JavaScript-based web-mapping platform Leaflet is used to display the temperature layers. A number of functions have been implemented for search and display. Users can search for extreme heat events by county or by date. The "by date" option allows a user to select a date and a Tmax threshold, which then highlights all of the areas on the map that meet those date and temperature parameters. The "by county" option allows the user to select a county on the map, which then retrieves a list of heat wave dates and daily Tmax measurements. This visualization is clean, user-friendly, and novel: while these time, space, and temperature measurements can be found by querying meteorological datasets, no existing tool packages this information together in an easily accessible and non-technical manner, especially at a time when climate change urges a better understanding of heat waves.
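A sketch of the county-level aggregation step described above is shown below: join station points to county polygons with PostGIS and take each county's maximum Tmax per day. The table and column names (ghcn_daily, counties, geom, tmax, obs_date, fips) are assumptions for illustration, not the project's actual schema.

```python
# Sketch of aggregating daily station Tmax to the county level with PostGIS,
# using a point-in-polygon join. Schema names are assumed.
import psycopg2

SQL = """
SELECT c.fips,
       g.obs_date,
       MAX(g.tmax) AS county_tmax
FROM   ghcn_daily AS g
JOIN   counties   AS c
       ON ST_Within(g.geom, c.geom)   -- point-in-polygon join
WHERE  g.obs_date BETWEEN %s AND %s
GROUP  BY c.fips, g.obs_date;
"""

def county_tmax(conn_str: str, start: str, end: str):
    with psycopg2.connect(conn_str) as conn, conn.cursor() as cur:
        cur.execute(SQL, (start, end))
        return cur.fetchall()

# rows = county_tmax("dbname=heat user=app", "1995-07-10", "1995-07-20")
```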
VegScape: U.S. Crop Condition Monitoring Service
NASA Astrophysics Data System (ADS)
mueller, R.; Yang, Z.; Di, L.
2013-12-01
Since 1995, the US Department of Agriculture (USDA)/National Agricultural Statistics Service (NASS) has provided qualitative biweekly vegetation condition indices to USDA policymakers and the public on a weekly basis during the growing season. Vegetation indices have proven useful for assessing crop condition and identifying the areal extent of floods, drought, major weather anomalies, and vulnerabilities of early/late season crops. With growing emphasis on more extreme weather events and food security issues rising to the forefront of national interest, a new vegetation condition monitoring system was developed. The new vegetation condition portal, named VegScape, was initiated at the start of the 2013 growing season. VegScape delivers interactive vegetation indices based on web mapping services. Users can use an interactive map to explore, query and disseminate current crop conditions. Vegetation indices like the Normalized Difference Vegetation Index (NDVI), the Vegetation Condition Index (VCI), and mean, median, and ratio comparisons to prior years can be constructed for analytical purposes and on-demand crop statistics. The NASA MODIS instrument, with 250-meter (15-acre) resolution and thirteen years of data history, provides improved spatial and temporal resolutions and delivers detailed, timely (i.e., daily) crop-specific condition information and dynamics. VegScape thus provides supplemental information to support NASS' weekly crop reports. VegScape delivers an agricultural cultivated crop mask and the most recent Cropland Data Layer (CDL) product to exploit the agricultural domain and visualize prior years' planted crops. Additionally, the data can be directly exported to Google Earth for web mashups or delivered via web mapping services for use in other applications. VegScape supports the ethos of data democracy by providing free and open access to digital geospatial data layers using open geospatial standards, thereby supporting transparent and collaborative government initiatives. NASS developed VegScape in cooperation with the Center for Spatial Information Science and Systems, George Mason University, Fairfax, VA.
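The two indices named above have standard definitions: NDVI = (NIR - Red) / (NIR + Red), and VCI rescales the current NDVI against its multi-year minimum and maximum for the same location and period. A small numpy sketch follows; the reflectance arrays are toy values, not MODIS data.

```python
# Standard NDVI and VCI computations on toy reflectance arrays.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    return (nir - red) / (nir + red)

def vci(ndvi_now: np.ndarray, ndvi_min: np.ndarray, ndvi_max: np.ndarray) -> np.ndarray:
    """Vegetation Condition Index in percent (0 = historic worst, 100 = best)."""
    return 100.0 * (ndvi_now - ndvi_min) / (ndvi_max - ndvi_min)

nir = np.array([0.45, 0.50, 0.30])
red = np.array([0.10, 0.08, 0.20])
current = ndvi(nir, red)
print(np.round(current, 2))
print(np.round(vci(current,
                   ndvi_min=np.array([0.10, 0.20, 0.05]),
                   ndvi_max=np.array([0.80, 0.70, 0.60])), 1))
```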
Jung, Tae-Sung; Yeo, Hock Chuan; Reddy, Satty G; Cho, Wan-Sup; Lee, Dong-Yup
2009-11-01
WEbcoli is a WEb application for in silico designing, analyzing and engineering Escherichia coli metabolism. It is devised and implemented using advanced web technologies, thereby leading to enhanced usability and dynamic web accessibility. As a main feature, the WEbcoli system provides a user-friendly rich web interface, allowing users to virtually design and synthesize mutant strains derived from the genome-scale wild-type E. coli model and to customize pathways of interest through a graph editor. In addition, constraints-based flux analysis can be conducted for quantifying metabolic fluxes and characterizing the physiological and metabolic states under various genetic and/or environmental conditions. WEbcoli is freely accessible at http://webcoli.org. Contact: cheld@nus.edu.sg.
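Constraints-based flux analysis maximizes an objective flux subject to steady-state mass balance (S v = 0) and bounds on each flux. The self-contained toy example below solves such a problem with scipy's linear programming routine; the 2-metabolite, 4-reaction network is invented for illustration and is unrelated to the genome-scale E. coli model used by WEbcoli.

```python
# Toy flux balance analysis: maximize the objective flux v3 subject to
# S v = 0 and per-reaction flux bounds.
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix S (rows: metabolites A, B; columns: reactions R1..R4).
# R1: -> A,  R2: A -> B,  R3: B -> biomass (objective),  R4: A -> (secretion)
S = np.array([
    [1, -1,  0, -1],   # metabolite A
    [0,  1, -1,  0],   # metabolite B
])
bounds = [(0, 10), (0, 10), (0, 10), (0, 10)]   # flux bounds for R1..R4
c = np.array([0, 0, -1, 0])                     # maximize v3 == minimize -v3

result = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal biomass flux:", result.x[2])     # expected: 10 (bound-limited)
print("flux distribution:", np.round(result.x, 2))
```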
NASA Astrophysics Data System (ADS)
Manuaba, I. B. P.; Rudiastini, E.
2018-01-01
Assessment of lecturers is a tool used to measure lecturer performance. Lecturer assessment variables can be measured across three aspects: teaching activities, research, and community service. The broad scope of measuring lecturer performance requires a special framework, so that the system can be developed in a sustainable manner. The aim of this research is to create an API web service data tool so that the lecturer assessment system can be developed in various frameworks. The system was developed as a web service in the PHP programming language, with output as JSON data. The conclusion of this research is that the API web service data application can be developed for several platforms, such as web and mobile applications.
Scholz-Starke, Björn; Burkhardt, Ulrich; Lesch, Stephan; Rick, Sebastian; Russell, David; Roß-Nickoll, Martina; Ottermanns, Richard
2017-01-01
The Edaphostat web application allows interactive and dynamic analyses of soil organism data stored in the Edaphobase data warehouse. It is part of the Edaphobase web application and can be accessed by any modern browser. The tool combines data from different sources (publications, field studies and museum collections) and allows species preferences along various environmental gradients (i.e. C/N ratio and pH) and classification systems (habitat type and soil type) to be analyzed. Database URL: Edaphostat is part of the Edaphobase Web Application available at https://portal.edaphobase.org PMID:29220469
Trust evaluation in health information on the World Wide Web.
Moturu, Sai T; Liu, Huan; Johnson, William G
2008-01-01
The impact of health information on the web is mounting and with the Health 2.0 revolution around the corner, online health promotion and management is becoming a reality. User-generated content is at the core of this revolution and brings to the fore the essential question of trust evaluation, a pertinent problem for health applications in particular. Evolving Web 2.0 health applications provide abundant opportunities for research. We identify these applications, discuss the challenges for trust assessment, characterize conceivable variables, list potential techniques for analysis, and provide a vision for future research.
Semantic-Web Technology: Applications at NASA
NASA Technical Reports Server (NTRS)
Ashish, Naveen
2004-01-01
We provide a description of work at the National Aeronautics and Space Administration (NASA) on building systems based on semantic-web concepts and technologies. NASA has been one of the early adopters of semantic-web technologies for practical applications. Indeed, there are several ongoing endeavors on building semantics-based systems for use in diverse NASA domains ranging from collaborative scientific activity to accident and mishap investigation to enterprise search to scientific information gathering and integration to aviation safety decision support. We provide a brief overview of many applications and ongoing work with the goal of informing the external community of these NASA endeavors.
Development of Web-Based Learning Application for Generation Z
ERIC Educational Resources Information Center
Hariadi, Bambang; Dewiyani Sunarto, M. J.; Sudarmaningtyas, Pantjawati
2016-01-01
This study aimed to develop a web-based learning application as a form of learning revolution. The form of learning revolution includes the provision of unlimited teaching materials, real time class organization, and is not limited by time or place. The implementation of this application is in the form of hybrid learning by using Google Apps for…
Journalism Students, Web 2.0 and the Digital Divide
ERIC Educational Resources Information Center
Green, Mary Elizabeth
2009-01-01
The purpose of this study was to find out if students were utilizing Web 2.0 applications. Since the applications in question are often employed by the media industry, the study aspired to find out if students majoring in mass communication and journalism utilized the applications more often than other students. The "digital divide" is a term used…
The Challenge of Handling Big Data Sets in the Sensor Web
NASA Astrophysics Data System (ADS)
Autermann, Christian; Stasch, Christoph; Jirka, Simon
2016-04-01
More and more Sensor Web components are deployed in different domains such as hydrology, oceanography or air quality in order to make observation data accessible via the Web. However, besides variability of data formats and protocols in environmental applications, the fast growing volume of data with high temporal and spatial resolution is imposing new challenges for Sensor Web technologies when sharing observation data and metadata about sensors. Variability, volume and velocity are the core issues that are addressed by Big Data concepts and technologies. Most solutions in the geospatial sector focus on remote sensing and raster data, whereas big in-situ observation data sets relying on vector features require novel approaches. Hence, in order to deal with big data sets in infrastructures for observational data, the following questions need to be answered: 1. How can big heterogeneous spatio-temporal datasets be organized, managed, and provided to Sensor Web applications? 2. How can views on big data sets and derived information products be made accessible in the Sensor Web? 3. How can big observation data sets be processed efficiently? We illustrate these challenges with examples from the marine domain and outline how we address these challenges. We therefore show how big data approaches from mainstream IT can be re-used and applied to Sensor Web application scenarios.
Kushniruk, A W; Patel, C; Patel, V L; Cimino, J J
2001-04-01
The World Wide Web provides an unprecedented opportunity for widespread access to health-care applications by both patients and providers. The development of new methods for assessing the effectiveness and usability of these systems is becoming a critical issue. This paper describes the distance evaluation (i.e. 'televaluation') of emerging Web-based information technologies. In health informatics evaluation, there is a need for application of new ideas and methods from the fields of cognitive science and usability engineering. A framework is presented for conducting evaluations of health-care information technologies that integrates a number of methods, ranging from deployment of on-line questionnaires (and Web-based forms) to remote video-based usability testing of user interactions with clinical information systems. Examples illustrating application of these techniques are presented for the assessment of a patient clinical information system (PatCIS), as well as an evaluation of use of Web-based clinical guidelines. Issues in designing, prototyping and iteratively refining evaluation components are discussed, along with description of a 'virtual' usability laboratory.
Developing Distributed Collaboration Systems at NASA: A Report from the Field
NASA Technical Reports Server (NTRS)
Becerra-Fernandez, Irma; Stewart, Helen; Knight, Chris; Norvig, Peter (Technical Monitor)
2001-01-01
Web-based collaborative systems have assumed a pivotal role in the information systems development arena. While business-to-customer (B-to-C) and business-to-business (B-to-B) electronic commerce systems, search engines, and chat sites are the focus of attention, web-based systems span the gamut of information systems that were traditionally confined to internal organizational client-server networks. For example, the Domino Application Server allows Lotus Notes (trademarked) users to build collaborative intranet applications, and mySAP.com (trademarked) enables web portals and e-commerce applications for SAP users. This paper presents the experiences in the development of one such system: Postdoc, a government off-the-shelf web-based collaborative environment. Issues related to the design of web-based collaborative information systems, including lessons learned from the development and deployment of the system as well as measured performance, are presented in this paper. Finally, the limitations of the implementation approach and future plans are also presented.
Web-based three-dimensional geo-referenced visualization
NASA Astrophysics Data System (ADS)
Lin, Hui; Gong, Jianhua; Wang, Freeman
1999-12-01
This paper addresses several approaches to implementing web-based, three-dimensional (3-D), geo-referenced visualization. The discussion focuses on the relationship between multi-dimensional data sets and applications, as well as the thick/thin client and heavy/light server structure. Two models of data sets are addressed in this paper. One is the use of traditional 3-D data format such as 3-D Studio Max, Open Inventor 2.0, Vis5D and OBJ. The other is modelled by a web-based language such as VRML. Also, traditional languages such as C and C++, as well as web-based programming tools such as Java, Java3D and ActiveX, can be used for developing applications. The strengths and weaknesses of each approach are elaborated. Four practical solutions for using VRML and Java, Java and Java3D, VRML and ActiveX and Java wrapper classes (Java and C/C++), to develop applications are presented for web-based, real-time interactive and explorative visualization.
Dynamic Space for Rent: Using Commercial Web Hosting to Develop a Web 2.0 Intranet
ERIC Educational Resources Information Center
Hodgins, Dave
2010-01-01
The explosion of Web 2.0 into libraries has left many smaller academic libraries (and other libraries with limited computing resources or support) to work in the cloud using free Web applications. The use of commercial Web hosting is an innovative approach to the problem of inadequate local resources. While the idea of insourcing IT will seem…
NASA Astrophysics Data System (ADS)
Mogaji, K. A.
2017-04-01
Producing a bias-free vulnerability assessment map is needed for planning groundwater quality protection schemes. This study developed a GIS-based AHPDST vulnerability index model for producing a groundwater vulnerability map of a hard rock terrain in Nigeria by exploiting the potential of the analytic hierarchy process (AHP) and Dempster-Shafer theory (DST) data mining models. The borehole and geophysical data acquired in the study area were processed to derive five groundwater vulnerability conditioning factors (GVCFs), namely recharge rate, aquifer transmissivity, hydraulic conductivity, transverse resistance and longitudinal conductance. The GVCF thematic maps were analyzed multi-criterially using the AHP and DST models to determine the normalized weight (W) parameters for the GVCFs and the mass function factor (MFF) parameters for the class boundaries of the GVCF thematic maps, respectively. Applying the weighted linear average technique, the determined W and MFF parameters were synthesized to develop a groundwater vulnerability potential index (GVPI)-based AHPDST model algorithm. The developed model was applied to establish four GVPI mass/belief function indices. The estimates based on the applied GVPI belief function indices were processed in a GIS environment to create prospective groundwater vulnerability potential index maps. The most representative of the resulting vulnerability maps (the GVPIBel map) was used to produce the groundwater vulnerability potential zones (GVPZ) map for the area. The produced GVPZ map shows that 48 and 52% of the areal extent are covered by low/moderate and high vulnerability zones, respectively. The success and prediction rates of the produced GVPZ map, determined using the relative operating characteristic technique, are 82.3 and 77.7%, respectively. The results reveal that the developed GVPI-based AHPDST model algorithm is capable of producing an efficient groundwater vulnerability potential zone prediction map and of characterizing the uncertainty of the predicted zones via the DST mechanism. The GVPZ map produced in this study can be used by decision makers to formulate appropriate groundwater management strategies, and the approach may well be adopted in other hard rock regions of the world, especially in economically poor nations.
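To make the weighted linear average step above more tangible, the sketch below combines AHP-style normalized weights for the five conditioning factors with DST-style mass function factors for one grid cell; all numbers, names and the exact aggregation are illustrative assumptions, not values or code from the study.

```python
# Illustrative sketch of a GVPI-style weighted linear combination.
# The weights (W) and mass function factors (MFFs) below are hypothetical.

# AHP-normalized weights for the five conditioning factors (they sum to 1).
WEIGHTS = {
    "recharge_rate": 0.30,
    "aquifer_transmissivity": 0.25,
    "hydraulic_conductivity": 0.20,
    "transverse_resistance": 0.15,
    "longitudinal_conductance": 0.10,
}

def gvpi(mff_per_factor):
    """Weighted linear average of the mass function factors for one grid cell."""
    return sum(WEIGHTS[f] * mff_per_factor[f] for f in WEIGHTS)

# Example cell: belief-type mass values assigned to each factor's class boundary.
cell = {
    "recharge_rate": 0.8,
    "aquifer_transmissivity": 0.6,
    "hydraulic_conductivity": 0.7,
    "transverse_resistance": 0.4,
    "longitudinal_conductance": 0.5,
}
print(f"GVPI (belief) = {gvpi(cell):.3f}")  # higher values -> more vulnerable cell
```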
The Hico Image Processing System: A Web-Accessible Hyperspectral Remote Sensing Toolbox
NASA Astrophysics Data System (ADS)
Harris, A. T., III; Goodman, J.; Justice, B.
2014-12-01
As the quantity of Earth-observation data increases, the use-case for hosting analytical tools in geospatial data centers becomes increasingly attractive. To address this need, HySpeed Computing and Exelis VIS have developed the HICO Image Processing System, a prototype cloud computing system that provides online, on-demand, scalable remote sensing image processing capabilities. The system provides a mechanism for delivering sophisticated image processing analytics and data visualization tools into the hands of a global user community, who will only need a browser and internet connection to perform analysis. Functionality of the HICO Image Processing System is demonstrated using imagery from the Hyperspectral Imager for the Coastal Ocean (HICO), an imaging spectrometer located on the International Space Station (ISS) that is optimized for acquisition of aquatic targets. Example applications include a collection of coastal remote sensing algorithms that are directed at deriving critical information on water and habitat characteristics of our vulnerable coastal environment. The project leverages the ENVI Services Engine as the framework for all image processing tasks, and can readily accommodate the rapid integration of new algorithms, datasets and processing tools.
NASA Astrophysics Data System (ADS)
Pulsani, B. R.
2017-11-01
Tank Information System is a web application which provides comprehensive information about the minor irrigation tanks of Telangana State. As part of the program, a web mapping application using Flex and ArcGIS Server was developed to make the data available to the public. Over time, as Flex became outdated, a migration of the client interface to the latest JavaScript-based technologies was carried out. Initially, the Flex-based application was migrated to the ArcGIS JavaScript API using the Dojo Toolkit. Both client applications used published services from ArcGIS Server. To check the migration pattern from proprietary to open source, the JavaScript-based ArcGIS application was later migrated to OpenLayers and the Dojo Toolkit, using published services from GeoServer. The migration pattern observed in the study especially emphasizes the use of the Dojo Toolkit and a PostgreSQL database for ArcGIS Server so that migration to open source can be performed effortlessly. The current application provides a case study which could assist organizations in migrating their proprietary ArcGIS web applications to open source. Furthermore, the study reveals the cost benefits of adopting open source over commercial software.
A resource-oriented architecture for a Geospatial Web
NASA Astrophysics Data System (ADS)
Mazzetti, Paolo; Nativi, Stefano
2010-05-01
In this presentation we discuss some architectural issues on the design of an architecture for a Geospatial Web, that is an information system for sharing geospatial resources according to the Web paradigm. The success of the Web in building a multi-purpose information space, has raised questions about the possibility of adopting the same approach for systems dedicated to the sharing of more specific resources, such as the geospatial information, that is information characterized by spatial/temporal reference. To this aim an investigation on the nature of the Web and on the validity of its paradigm for geospatial resources is required. The Web was born in the early 90's to provide "a shared information space through which people and machines could communicate" [Berners-Lee 1996]. It was originally built around a small set of specifications (e.g. URI, HTTP, HTML, etc.); however, in the last two decades several other technologies and specifications have been introduced in order to extend its capabilities. Most of them (e.g. the SOAP family) actually aimed to transform the Web in a generic Distributed Computing Infrastructure. While these efforts were definitely successful enabling the adoption of service-oriented approaches for machine-to-machine interactions supporting complex business processes (e.g. for e-Government and e-Business applications), they do not fit in the original concept of the Web. In the year 2000, R. T. Fielding, one of the designers of the original Web specifications, proposes a new architectural style for distributed systems, called REST (Representational State Transfer), aiming to capture the fundamental characteristics of the Web as it was originally conceived [Fielding 2000]. In this view, the nature of the Web lies not so much in the technologies, as in the way they are used. Maintaining the Web architecture conform to the REST style would then assure the scalability, extensibility and low entry barrier of the original Web. On the contrary, systems using the same Web technologies and specifications but according to a different architectural style, despite their usefulness, should not be considered part of the Web. If the REST style captures the significant Web characteristics, then, in order to build a Geospatial Web it is necessary that its architecture satisfies all the REST constraints. One of them is of particular importance: the adoption of a Uniform Interface. It prescribes that all the geospatial resources must be accessed through the same interface; moreover according to the REST style this interface must satisfy four further constraints: a) identification of resources; b) manipulation of resources through representations; c) self-descriptive messages; and, d) hypermedia as the engine of application state. In the Web, the uniform interface provides basic operations which are meaningful for generic resources. They typically implement the CRUD pattern (Create-Retrieve-Update-Delete) which demonstrated to be flexible and powerful in several general-purpose contexts (e.g. filesystem management, SQL for database management systems, etc.). Restricting the scope to a subset of resources it would be possible to identify other generic actions which are meaningful for all of them. For example for geospatial resources, subsetting, resampling, interpolation and coordinate reference systems transformations functionalities are candidate functionalities for a uniform interface. 
However an investigation is needed to clarify the semantics of those actions for different resources, and consequently if they can really ascend the role of generic interface operation. Concerning the point a), (identification of resources), it is required that every resource addressable in the Geospatial Web has its own identifier (e.g. a URI). This allows to implement citation and re-use of resources, simply providing the URI. OPeNDAP and KVP encodings of OGC data access services specifications might provide a basis for it. Concerning point b) (manipulation of resources through representations), the Geospatial Web poses several issues. In fact, while the Web mainly handles semi-structured information, in the Geospatial Web the information is typically structured with several possible data models (e.g. point series, gridded coverages, trajectories, etc.) and encodings. A possibility would be to simplify the interchange formats, choosing to support a subset of data models and format(s). This is what actually the Web designers did choosing to define a common format for hypermedia (HTML), although the underlying protocol would be generic. Concerning point c), self-descriptive messages, the exchanged messages should describe themselves and their content. This would not be actually a major issue considering the effort put in recent years on geospatial metadata models and specifications. The point d), hypermedia as the engine of application state, is actually where the Geospatial Web would mainly differ from existing geospatial information sharing systems. In fact the existing systems typically adopt a service-oriented architecture, where applications are built as a single service or as a workflow of services. On the other hand, in the Geospatial Web, applications should be built following the path between interconnected resources. The link between resources should be made explicit as hyperlinks. The adoption of Semantic Web solutions would allow to define not only the existence of a link between two resources, but also the nature of the link. The implementation of a Geospatial Web would allow to build an information system with the same characteristics of the Web sharing its points-of-strength and weaknesses. The main advantages would be the following: • The user would interact with the Geospatial Web according to the well-known Web navigation paradigm. This would lower the barrier to the access to geospatial applications for non-specialists (e.g. the success of Google Maps and other Web mapping applications); • Successful Web and Web 2.0 applications - search engines, feeds, social network - could be integrated/replicated in the Geospatial Web; The main drawbacks would be the following: • The Uniform Interface simplifies the overall system architecture (e.g. no service registry, and service descriptors required), but moves the complexity to the data representation. Moreover since the interface must stay generic, it results really simple and therefore complex interactions would require several transfers. • In the geospatial domain one of the most valuable resources are processes (e.g. environmental models). How they can be modeled as resources accessed through the common interface is an open issue. Taking into account advantages and drawback it seems that a Geospatial Web would be useful, but its use would be limited to specific use-cases not covering all the possible applications. 
The Geospatial Web architecture could be partly based on existing specifications, while other aspects need investigation. References [Berners-Lee 1996] T. Berners-Lee, "WWW: Past, present, and future". IEEE Computer, 29(10), Oct. 1996, pp. 69-77. [Fielding 2000] Fielding, R. T. 2000. Architectural styles and the design of network-based software architectures. PhD Dissertation. Dept. of Information and Computer Science, University of California, Irvine
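To make the uniform-interface discussion more concrete, the following toy sketch (written in Python with Flask, under assumed resource names) exposes a single geospatial resource through a URI, exchanges representations for retrieval and update, offers subsetting as a candidate generic operation, and returns hypermedia links. It is only an illustration of the REST constraints listed above, not an architecture proposed by the authors.

```python
# Toy sketch of a REST-style uniform interface for geospatial resources.
# URIs, parameters and the in-memory "coverage" are assumptions for illustration.
from flask import Flask, jsonify, request

app = Flask(__name__)

# One hypothetical gridded coverage identified by a URI such as /coverages/sst.
COVERAGES = {"sst": {"title": "Sea surface temperature", "bbox": [-180, -90, 180, 90]}}

@app.route("/coverages/<cid>", methods=["GET"])
def retrieve(cid):
    cov = COVERAGES.get(cid)
    if cov is None:
        return jsonify({"error": "unknown resource"}), 404
    # Candidate generic operation: spatial subsetting via a query parameter.
    bbox = request.args.get("bbox")  # e.g. ?bbox=-10,30,5,45
    rep = dict(cov, subset=bbox) if bbox else dict(cov)
    # Hypermedia: link related resources so clients can follow the application state.
    rep["links"] = [{"rel": "metadata", "href": f"/coverages/{cid}/metadata"}]
    return jsonify(rep)

@app.route("/coverages/<cid>", methods=["PUT", "DELETE"])
def modify(cid):
    if request.method == "DELETE":
        COVERAGES.pop(cid, None)
        return "", 204
    COVERAGES[cid] = request.get_json(force=True)  # update via a representation
    return jsonify(COVERAGES[cid]), 200

if __name__ == "__main__":
    app.run(debug=True)
```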
The Naïve nurse: revisiting vulnerability for nursing
2012-01-01
Background: Nurses in the Western world have given considerable attention to the concept of vulnerability in recent decades. However, nurses have tended to view vulnerability from an individualistic perspective, and have rarely taken into account structural or collective dimensions of the concept. As the need grows for health workers to engage in the global health agenda, nurses must broaden earlier work on vulnerability: conventional conceptualizations and practical applications of the notion of vulnerability warrant extension to include more collective conceptualizations, thereby enabling a more complete understanding of vulnerability in nursing discourse. Discussion: The purpose of this paper is to examine nursing contributions to the concept of vulnerability and consider how a broader perspective that includes socio-political dimensions may assist nurses to reach beyond the immediate milieu of the patient into the dominant social, political, and economic structures that produce and sustain vulnerability. Summary: By broadening nurses' conceptualization of vulnerability, nurses can obtain the consciousness needed to move beyond a peripheral role of nursing that has been dominantly situated within institutional settings and contribute in the larger arena of social, economic, political and global affairs. PMID:22520841
Dynamically Allocated Virtual Clustering Management System Users Guide
2016-11-01
This report provides usage instructions for the DAVC (Dynamically Allocated Virtual Clustering) version 2.0 web application. The report is separated into the following sections, which detail
DOE Office of Scientific and Technical Information (OSTI.GOV)
Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa
2016-05-01
The Contingency Contractor Optimization Tool - Prototype (CCOT-P) requires several third-party software packages. These are documented below for each of the CCOT-P elements: client, web server, database server, solver, web application and polling application.
Automatically Preparing Safe SQL Queries
NASA Astrophysics Data System (ADS)
Bisht, Prithvi; Sistla, A. Prasad; Venkatakrishnan, V. N.
We present the first sound program source transformation approach for automatically transforming the code of a legacy web application to employ PREPARE statements in place of unsafe SQL queries. Our approach therefore opens the way for eradicating the SQL injection threat vector from legacy web applications.
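The abstract does not reproduce the transformation itself; purely as background on the threat it removes, the snippet below contrasts an injectable string-concatenated query with the parameterized, prepared-statement style that such a transformation targets, using Python's sqlite3 module as a stand-in database layer.

```python
# Background illustration only (not the paper's transformation): the difference
# between an unsafe concatenated query and a parameterized/prepared query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

user_input = "alice' OR '1'='1"  # classic injection payload

# UNSAFE: attacker-controlled input is spliced into the SQL text itself.
unsafe_sql = "SELECT role FROM users WHERE name = '" + user_input + "'"
print("unsafe:", conn.execute(unsafe_sql).fetchall())  # returns every row

# SAFE: the query text is fixed; the input is bound as a parameter.
safe_sql = "SELECT role FROM users WHERE name = ?"
print("safe:", conn.execute(safe_sql, (user_input,)).fetchall())  # returns nothing
```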
75 FR 32692 - Schools and Libraries Universal Service Support Mechanism
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-09
..., wireless Internet access applications, and web hosting. We propose to revise the Commission's rules to.../anti-spam software, scheduling services, wireless Internet access applications, and web hosting should... schools and libraries may receive discounts for eligible telecommunications services, Internet access, and...
Changing Paradigms Managed Learning Environments and Web 2.0
ERIC Educational Resources Information Center
Craig, Emory M.
2007-01-01
Purpose: The purpose of this paper is to understand how emerging technologies and Web 2.0 services are transforming the structure of the web and their potential impact on managed learning environments (MLS) and learning content management systems (LCMS). Design/methodology/approach: Innovative Web 2.0 applications are reviewed in the paper to…
ERIC Educational Resources Information Center
Berger, Pam
2010-01-01
Web 2.0 applications are changing how educators interact both with each other and with their students. Educators can use these new Web tools daily to create, share, socialize, and collaborate with students, colleagues, and newly developed network contacts. School librarians are finding that Web 2.0 tools are bringing them more ways to embrace and…
ERIC Educational Resources Information Center
Chang, Chi-Cheng; Wu, Bing-Hong
2012-01-01
This study explored the reliability and validity of teacher assessment under a Web-based portfolio assessment environment (or Web-based teacher portfolio assessment). Participants were 72 eleventh graders taking the "Computer Application" course. The students perform portfolio creation, inspection, self- and peer-assessment using the Web-based…
20 CFR 656.17 - Basic labor certification process.
Code of Federal Regulations, 2010 CFR
2010-04-01
... participant in the job fair. (B) Employer's Web site. The use of the employer's Web site as a recruitment... involved in the application. (C) Job search Web site other than the employer's. The use of a job search Web...) The Department of Labor may issue or require the use of certain identifying information, including...
Infusing Classrooms with Web 2.0 Technologies
ERIC Educational Resources Information Center
Velasco, Richard Carlos L.
2018-01-01
The evolution of digital technologies over the past couple of decades has contributed to a paradigm shift in education where the internet and web-based applications have become ubiquitous in primary and secondary classrooms (Glassman & Kang, 2011). With this shift came the digital phenomena known today as Web 2.0 technologies. Web 2.0…
AstrodyToolsWeb an e-Science project in Astrodynamics and Celestial Mechanics fields
NASA Astrophysics Data System (ADS)
López, R.; San-Juan, J. F.
2013-05-01
Astrodynamics Web Tools, AstrodyToolsWeb (http://tastrody.unirioja.es), is an ongoing collaborative Web Tools computing infrastructure project which has been specially designed to support scientific computation. AstrodyToolsWeb provides project collaborators with all the technical and human facilities in order to wrap, manage, and use specialized noncommercial software tools in Astrodynamics and Celestial Mechanics fields, with the aim of optimizing the use of resources, both human and material. However, this project is open to collaboration from the whole scientific community in order to create a library of useful tools and their corresponding theoretical backgrounds. AstrodyToolsWeb offers a user-friendly web interface in order to choose applications, introduce data, and select appropriate constraints in an intuitive and easy way for the user. After that, the application is executed in real time, whenever possible; then the critical information about program behavior (errors and logs) and output, including the postprocessing and interpretation of its results (graphical representation of data, statistical analysis or whatever manipulation therein), are shown via the same web interface or can be downloaded to the user's computer.
Blaz, Jacquelyn W; Pearce, Patricia F
2009-01-01
The world is becoming increasingly web-based. Health care institutions are utilizing the web for personal health records, surveillance, communication, and education; health care researchers are finding value in using the web for research subject recruitment, data collection, and follow-up. Programming languages, such as Java, require knowledge and experience usually found only in software engineers and consultants. The purpose of this paper is to demonstrate Ruby on Rails as a feasible alternative for programming questionnaires for use on the web. Ruby on Rails was specifically designed for the development, deployment, and maintenance of database-backed web applications. It is flexible, customizable, and easy to learn. With relatively little initial training, a novice programmer can create a robust web application in a small amount of time, without the need for a software consultant. The translation of the Children's Computerized Physical Activity Reporter (C-CPAR) from a local installation in Microsoft Access to a web-based format utilizing Ruby on Rails is given as an example.
Mfold web server for nucleic acid folding and hybridization prediction.
Zuker, Michael
2003-07-01
The abbreviated name, 'mfold web server', describes a number of closely related software applications available on the World Wide Web (WWW) for the prediction of the secondary structure of single stranded nucleic acids. The objective of this web server is to provide easy access to RNA and DNA folding and hybridization software to the scientific community at large. By making use of universally available web GUIs (Graphical User Interfaces), the server circumvents the problem of portability of this software. Detailed output, in the form of structure plots with or without reliability information, single strand frequency plots and 'energy dot plots', are available for the folding of single sequences. A variety of 'bulk' servers give less information, but in a shorter time and for up to hundreds of sequences at once. The portal for the mfold web server is http://www.bioinfo.rpi.edu/applications/mfold. This URL will be referred to as 'MFOLDROOT'.
Web Navigation Sequences Automation in Modern Websites
NASA Astrophysics Data System (ADS)
Montoto, Paula; Pan, Alberto; Raposo, Juan; Bellas, Fernando; López, Javier
Most of today's web sources are designed to be used by humans, but they do not provide suitable interfaces for software programs. That is why a growing interest has arisen in so-called web automation applications, which are widely used for different purposes such as B2B integration, automated testing of web applications, or technology and business watch. Previous proposals assume models for generating and reproducing navigation sequences that are not able to correctly deal with new websites using technologies such as AJAX: on one hand, existing systems only allow recording simple navigation actions and, on the other hand, they are unable to detect the end of the effects caused by a user action. In this paper, we propose a set of new techniques to record and execute web navigation sequences able to deal with all the complexity present in AJAX-based websites. We also present an exhaustive evaluation of the proposed techniques that shows very promising results.
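As an illustration of the waiting problem described above (and not of the authors' own technique), the sketch below replays one recorded action on a hypothetical AJAX page with Selenium in Python and uses an explicit wait to detect that the asynchronously loaded content has appeared, rather than assuming the action's effects end immediately.

```python
# Illustrative sketch only: replaying a navigation action on an AJAX page and
# waiting for its asynchronous effect to finish. URL and identifiers are hypothetical.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
try:
    driver.get("https://example.org/search")              # hypothetical AJAX page
    driver.find_element(By.ID, "query").send_keys("sensor data")
    driver.find_element(By.ID, "search-button").click()   # the recorded user action

    # The click triggers an XMLHttpRequest; instead of a fixed sleep, wait until
    # the element produced by that request is present (i.e. the effect has ended).
    WebDriverWait(driver, timeout=10).until(
        EC.presence_of_element_located((By.CSS_SELECTOR, "#results .item"))
    )
    print(driver.find_element(By.CSS_SELECTOR, "#results .item").text)
finally:
    driver.quit()
```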
NASA Astrophysics Data System (ADS)
Du, Xiaofeng; Song, William; Munro, Malcolm
Web Services, as a new distributed system technology, have been widely adopted by industry in areas such as enterprise application integration (EAI), business process management (BPM), and virtual organisation (VO). However, the lack of semantics in the current Web Service standards has been a major barrier to service discovery and composition. In this chapter, we propose an enhanced context-based semantic service description framework (CbSSDF+) that tackles the problem and improves the flexibility of service discovery and the correctness of generated composite services. We also provide an agile transformation method to demonstrate how the various formats of Web Service descriptions on the Web can be managed and renovated step by step into CbSSDF+-based service descriptions without a large amount of engineering work. At the end of the chapter, we evaluate the applicability of the transformation method and the effectiveness of CbSSDF+ through a series of experiments.
webpic: A flexible web application for collecting distance and count measurements from images
2018-01-01
Despite increasing ability to store and analyze large amounts of data for organismal and ecological studies, the process of collecting distance and count measurements from images has largely remained time consuming and error-prone, particularly for tasks for which automation is difficult or impossible. Improving the efficiency of these tasks, which allows for more high quality data to be collected in a shorter amount of time, is therefore a high priority. The open-source web application, webpic, implements common web languages and widely available libraries and productivity apps to streamline the process of collecting distance and count measurements from images. In this paper, I introduce the framework of webpic and demonstrate one readily available feature of this application, linear measurements, using fossil leaf specimens. This application fills the gap between workflows accomplishable by individuals through existing software and those accomplishable by large, unmoderated crowds. It demonstrates that flexible web languages can be used to streamline time-intensive research tasks without the use of specialized equipment or proprietary software and highlights the potential for web resources to facilitate data collection in research tasks and outreach activities with improved efficiency. PMID:29608592
Aanensen, David M; Huntley, Derek M; Feil, Edward J; al-Own, Fada'a; Spratt, Brian G
2009-09-16
Epidemiologists and ecologists often collect data in the field and, on returning to their laboratory, enter their data into a database for further analysis. The recent introduction of mobile phones that utilise the open source Android operating system, and which include (among other features) both GPS and Google Maps, provide new opportunities for developing mobile phone applications, which in conjunction with web applications, allow two-way communication between field workers and their project databases. Here we describe a generic framework, consisting of mobile phone software, EpiCollect, and a web application located within www.spatialepidemiology.net. Data collected by multiple field workers can be submitted by phone, together with GPS data, to a common web database and can be displayed and analysed, along with previously collected data, using Google Maps (or Google Earth). Similarly, data from the web database can be requested and displayed on the mobile phone, again using Google Maps. Data filtering options allow the display of data submitted by the individual field workers or, for example, those data within certain values of a measured variable or a time period. Data collection frameworks utilising mobile phones with data submission to and from central databases are widely applicable and can give a field worker similar display and analysis tools on their mobile phone that they would have if viewing the data in their laboratory via the web. We demonstrate their utility for epidemiological data collection and display, and briefly discuss their application in ecological and community data collection. Furthermore, such frameworks offer great potential for recruiting 'citizen scientists' to contribute data easily to central databases through their mobile phone.
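To give a flavour of the data flow described above, the sketch below submits a GPS-tagged field record to a central web database over HTTP from Python; the endpoint URL and payload fields are hypothetical and do not reflect EpiCollect's actual interface.

```python
# Hypothetical sketch of submitting a GPS-tagged field observation to a central
# web database; the endpoint and payload fields are assumptions, not EpiCollect's API.
import requests

record = {
    "project": "malaria_survey",
    "field_worker": "fw-017",
    "latitude": -1.2921,          # coordinates as reported by the phone's GPS
    "longitude": 36.8219,
    "timestamp": "2009-09-16T10:32:00Z",
    "cases_observed": 3,
}

resp = requests.post("https://example.org/api/records", json=record, timeout=10)
resp.raise_for_status()
print("stored record id:", resp.json().get("id"))

# The same client could later pull records back for display on a map, e.g.:
# requests.get("https://example.org/api/records", params={"project": "malaria_survey"})
```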
AMBIT RESTful web services: an implementation of the OpenTox application programming interface.
Jeliazkova, Nina; Jeliazkov, Vedrin
2011-05-16
The AMBIT web services package is one of the several existing independent implementations of the OpenTox Application Programming Interface and is built according to the principles of the Representational State Transfer (REST) architecture. The Open Source Predictive Toxicology Framework, developed by the partners in the EC FP7 OpenTox project, aims at providing unified access to toxicity data and predictive models, as well as validation procedures. This is achieved by i) an information model, based on a common OWL-DL ontology; ii) links to related ontologies; and iii) data and algorithms, available through a standardized REST web services interface, where every compound, data set or predictive method has a unique web address, used to retrieve its Resource Description Framework (RDF) representation, or initiate the associated calculations. The AMBIT web services package has been developed as an extension of AMBIT modules, adding the ability to create (Quantitative) Structure-Activity Relationship (QSAR) models and providing an OpenTox API compliant interface. The representation of data and processing resources in the W3C Resource Description Framework facilitates integrating the resources as Linked Data. By uploading datasets with chemical structures and an arbitrary set of properties, they become automatically available online in several formats. The services provide unified interfaces to several descriptor calculation, machine learning and similarity searching algorithms, as well as to applicability domain and toxicity prediction models. All Toxtree modules for predicting the toxicological hazard of chemical compounds are also integrated within this package. The complexity and diversity of the processing is reduced to the simple paradigm "read data from a web address, perform processing, write to a web address". The online service makes it possible to easily run predictions without installing any software, as well as to share online datasets and models. The downloadable web application allows researchers to set up an arbitrary number of service instances for specific purposes and at suitable locations. These services could be used as a distributed framework for processing of resource-intensive tasks and data sharing, or in a fully independent way, according to specific needs. The advantage of exposing the functionality via the OpenTox API is seamless interoperability, not only within a single web application but also in a network of distributed services. Last, but not least, the services provide a basis for building web mashups and end-user applications with friendly GUIs, as well as embedding the functionalities in existing workflow systems.
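As a loose illustration of the "read data from a web address, perform processing, write to a web address" paradigm, the sketch below retrieves a resource representation and uploads a dataset using plain HTTP from Python; the base URI, paths and response fields are placeholders rather than the real AMBIT/OpenTox endpoints.

```python
# Loose illustration of the REST paradigm described above; URIs and fields are
# placeholders rather than real AMBIT/OpenTox endpoints.
import requests

BASE = "https://example.org/ambit"   # hypothetical service root

# Read: every compound/dataset/model has its own web address; ask for RDF.
compound_uri = f"{BASE}/compound/42"
rdf = requests.get(compound_uri, headers={"Accept": "application/rdf+xml"}, timeout=30)
rdf.raise_for_status()
print(rdf.text[:200])

# Write: upload a dataset of chemical structures; a REST service would make it
# available online and return the new resource's URI.
with open("structures.sdf", "rb") as fh:
    created = requests.post(f"{BASE}/dataset", files={"file": fh}, timeout=60)
created.raise_for_status()
print("new dataset at:", created.headers.get("Location"))
```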
Migrating Department of Defense (DoD) Web Service Based Applications to Mobile Computing Platforms
2012-03-01
World Wide Web Consortium (W3C) Geolocation API to identify the device's location and then center the map on the device. Finally, we modify the entry...
Vascular applications of contrast-enhanced ultrasound imaging.
Mehta, Kunal S; Lee, Jake J; Taha, Ashraf G; Avgerinos, Efthymios; Chaer, Rabih A
2017-07-01
Contrast-enhanced ultrasound (CEUS) imaging is a powerful noninvasive modality offering numerous potential diagnostic and therapeutic applications in vascular medicine. CEUS imaging uses microbubble contrast agents composed of an encapsulating shell surrounding a gaseous core. These microbubbles act as nearly perfect intravascular reflectors of ultrasound energy and may be used to enhance the overall contrast and quality of ultrasound images. The purpose of this narrative review is to survey the current literature regarding CEUS imaging and discuss its diagnostic and therapeutic roles in current vascular and selected nonvascular applications. The PubMed, MEDLINE, and Embase databases were searched until July 2016 using the PubMed and Ovid Web-based search engines. The search terms used included contrast-enhanced, microbubble, ultrasound, carotid, aneurysm, and arterial. The diagnostic and therapeutic utility of CEUS imaging has grown exponentially, particularly in the realms of extracranial carotid arterial disease, aortic disease, and peripheral arterial disease. Studies have demonstrated that CEUS imaging is diagnostically superior to conventional ultrasound imaging in identifying vessel irregularities and measuring neovascularization to assess plaque vulnerability and end-muscle perfusion. Groups have begun to use microbubbles as agents in therapeutic applications for targeted drug and gene therapy delivery as well as for the enhancement of sonothrombolysis. The emerging technology of microbubbles and CEUS imaging holds considerable promise for cardiovascular medicine and cancer therapy given its diagnostic and therapeutic utility. Overall, with proper training and credentialing of technicians, the clinical implications are innumerable as microbubble technology is rapidly bursting onto the scene of cardiovascular medicine. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
Hinds, Richard M; Klifto, Christopher S; Naik, Amish A; Sapienza, Anthony; Capo, John T
2016-08-01
The Internet is a common resource for applicants of hand surgery fellowships; however, the quality and accessibility of fellowship online information are unknown. The objectives of this study were to evaluate the accessibility of hand surgery fellowship Web sites and to assess the quality of information provided via program Web sites. Hand fellowship Web site accessibility was evaluated by reviewing the American Society for Surgery of the Hand (ASSH) fellowship directory on November 16, 2014, and the National Resident Matching Program (NRMP) fellowship directory on February 12, 2015, and by performing an independent Google search on November 25, 2014. Accessible Web sites were then assessed for quality of the presented information. A total of 81 programs were identified, with the ASSH directory featuring direct links to 32% of program Web sites and the NRMP directory directly linking to 0%. A Google search yielded direct links to 86% of program Web sites. The quality of presented information varied greatly among the 72 accessible Web sites. Program description (100%), fellowship application requirements (97%), program contact email address (85%), and research requirements (75%) were the most commonly presented components of fellowship information. Hand fellowship program Web sites can be accessed from the ASSH directory and, to a lesser extent, the NRMP directory. However, a Google search is the most reliable method to access online fellowship information. Of assessable programs, all featured a program description, though the quality of the remaining information was variable. Hand surgery fellowship applicants may face some difficulties when attempting to gather program information online. Future efforts should focus on improving the accessibility and content quality of hand surgery fellowship program Web sites.
US Geoscience Information Network, Web Services for Geoscience Information Discovery and Access
NASA Astrophysics Data System (ADS)
Richard, S.; Allison, L.; Clark, R.; Coleman, C.; Chen, G.
2012-04-01
The US Geoscience Information Network has developed metadata profiles for interoperable catalog services based on ISO19139 and OGC CSW 2.0.2. Currently, data services are being deployed for the US Dept. of Energy-funded National Geothermal Data System. These services utilize OGC Web Map Services, Web Feature Services, and THREDDS-served NetCDF for gridded datasets. Services and underlying datasets (along with a wide variety of other information and non-information resources) are registered in the catalog system. Metadata for registration is produced by various workflows, including harvesting from OGC capabilities documents, Drupal-based web applications, and transformation from tabular compilations. Catalog search is implemented using the ESRI Geoportal open-source server. We are pursuing various client applications to demonstrate discovery and utilization of the data services. Currently operational applications include catalog search and data acquisition from map services in an ESRI ArcMap extension, and a catalog browse and search application built on OpenLayers and Django. We are developing use cases and requirements for other applications to utilize geothermal data services for resource exploration and evaluation.
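Catalog discovery against an OGC CSW 2.0.2 endpoint such as the one described can be scripted; the sketch below uses the OWSLib library in Python to search for records matching a keyword. The endpoint URL is a placeholder, and the fields available on each returned record depend on the server.

```python
# Sketch of keyword discovery against an OGC CSW 2.0.2 catalogue with OWSLib.
# The endpoint URL is a placeholder; adjust it to the catalogue being queried.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("https://example.org/geoportal/csw")   # hypothetical endpoint
query = PropertyIsLike("csw:AnyText", "%geothermal%")            # full-text style filter
csw.getrecords2(constraints=[query], maxrecords=10, esn="summary")

for rec_id, rec in csw.records.items():
    # Each record carries ISO19139-derived metadata such as a title and abstract.
    print(rec_id, "-", rec.title)
```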
NASA Astrophysics Data System (ADS)
Papathoma-Köhle, Maria
2016-08-01
The assessment of the physical vulnerability of elements at risk as part of the risk analysis is an essential aspect for the development of strategies and structural measures for risk reduction. Understanding, analysing and, if possible, quantifying physical vulnerability is a prerequisite for designing strategies and adopting tools for its reduction. The most common methods for assessing physical vulnerability are vulnerability matrices, vulnerability curves and vulnerability indicators; however, in most of the cases, these methods are used in a conflicting way rather than in combination. The article focuses on two of these methods: vulnerability curves and vulnerability indicators. Vulnerability curves express physical vulnerability as a function of the intensity of the process and the degree of loss, considering, in individual cases only, some structural characteristics of the affected buildings. However, a considerable amount of studies argue that vulnerability assessment should focus on the identification of these variables that influence the vulnerability of an element at risk (vulnerability indicators). In this study, an indicator-based methodology (IBM) for mountain hazards including debris flow (Kappes et al., 2012) is applied to a case study for debris flows in South Tyrol, where in the past a vulnerability curve has been developed. The relatively "new" indicator-based method is being scrutinised and recommendations for its improvement are outlined. The comparison of the two methodological approaches and their results is challenging since both methodological approaches deal with vulnerability in a different way. However, it is still possible to highlight their weaknesses and strengths, show clearly that both methodologies are necessary for the assessment of physical vulnerability and provide a preliminary "holistic methodological framework" for physical vulnerability assessment showing how the two approaches may be used in combination in the future.
DEMONSTRATING APPLICATIONS OF REGIONAL VULNERABILITY ASSESSMENT
This task is designed to respond to 2 Congressional earmarks of $1,000,000 to the Canaan Valley Institute (CVI) to work in close coordination with the Regional Vulnerability Assessment (ReVA) initiative to develop research and educational tools using integrative technologies to p...
Wagner, Chad R.; Tighe, Kirsten C.; Terziotti, Silvia
2009-01-01
StreamStats is a Web-based Geographic Information System (GIS) application that was developed by the U.S. Geological Survey (USGS) in cooperation with Environmental Systems Research Institute, Inc. (ESRI) to provide access to an assortment of analytical tools that are useful for water-resources planning and management. StreamStats allows users to easily obtain streamflow statistics, basin characteristics, and descriptive information for USGS data-collection sites and selected ungaged sites. StreamStats also allows users to identify stream reaches upstream and downstream from user-selected sites and obtain information for locations along streams where activities occur that can affect streamflow conditions. This functionality can be accessed through a map-based interface with the user's Web browser or through individual functions requested remotely through other Web applications.
Security and Dependability Solutions for Web Services and Workflows
NASA Astrophysics Data System (ADS)
Kokolakis, Spyros; Rizomiliotis, Panagiotis; Benameur, Azzedine; Sinha, Smriti Kumar
In this chapter we present an innovative approach towards the design and application of Security and Dependability (S&D) solutions for Web services and service-based workflows. Recently, several standards have been published that prescribe S&D solutions for Web services, e.g. OASIS WS-Security. However, the application of these solutions in specific contexts has proven problematic. We propose a new framework for the application of such solutions based on the SERENITY S&D Pattern concept. An S&D Pattern comprises all the necessary information for the implementation, verification, deployment, and active monitoring of an S&D Solution. Thus, system developers may rely on proven solutions that are dynamically deployed and monitored by the Serenity Runtime Framework. Finally, we further extend this approach to cover the case of executable workflows which are realised through the orchestration of Web services.
AMPA: an automated web server for prediction of protein antimicrobial regions.
Torrent, Marc; Di Tommaso, Paolo; Pulido, David; Nogués, M Victòria; Notredame, Cedric; Boix, Ester; Andreu, David
2012-01-01
AMPA is a web application for assessing the antimicrobial domains of proteins, with a focus on the design of new antimicrobial drugs. The application provides fast discovery of antimicrobial patterns in proteins that can be used to develop new peptide-based drugs against pathogens. Results are shown in a user-friendly graphical interface and can be downloaded as raw data for later examination. AMPA is freely available on the web at http://tcoffee.crg.cat/apps/ampa. The source code is also available on the web. marc.torrent@upf.edu; david.andreu@upf.edu Supplementary data are available at Bioinformatics online.
A Generic Evaluation Model for Semantic Web Services
NASA Astrophysics Data System (ADS)
Shafiq, Omair
Semantic Web Services research has gained momentum over the last few years and by now several realizations exist. They are being used in a number of industrial use-cases. Soon software developers will be expected to use this infrastructure to build their B2B applications requiring dynamic integration. However, there is still a lack of guidelines for the evaluation of tools developed to realize Semantic Web Services and applications built on top of them. In normal software engineering practice such guidelines can already be found for traditional component-based systems. Also, some efforts are being made to build performance models for service-based systems. Drawing on these related efforts in component-oriented and service-based systems, we identified the need for a generic evaluation model for Semantic Web Services applicable to any realization. The generic evaluation model will help users and customers to orient their systems and solutions towards using Semantic Web Services. In this chapter, we present the requirements for the generic evaluation model for Semantic Web Services and further discuss the initial steps that we took to sketch such a model. Finally, we discuss related activities for evaluating semantic technologies.
Schäuble, Sascha; Stavrum, Anne-Kristin; Bockwoldt, Mathias; Puntervoll, Pål; Heiland, Ines
2017-06-24
Systems Biology Markup Language (SBML) is the standard model representation and description language in systems biology. Enriching and analysing systems biology models by integrating the multitude of available data increases the predictive power of these models. This may be a daunting task, which commonly requires bioinformatic competence and scripting. We present SBMLmod, a Python-based web application and service that automates the integration of high-throughput data into SBML models. Subsequent steady state analysis is readily accessible via the web service COPASIWS. We illustrate the utility of SBMLmod by integrating gene expression data from different healthy tissues as well as from a cancer dataset into a previously published model of mammalian tryptophan metabolism. SBMLmod is a user-friendly platform for model modification and simulation. The web application is available at http://sbmlmod.uit.no , whereas the WSDL definition file for the web service is accessible via http://sbmlmod.uit.no/SBMLmod.wsdl . Furthermore, the entire package can be downloaded from https://github.com/MolecularBioinformatics/sbml-mod-ws . We envision that SBMLmod will make automated model modification and simulation available to a broader research community.
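SBMLmod itself is exposed as a web application and service, but the kind of modification it automates can be pictured with python-libsbml: scaling a kinetic parameter by a tissue-specific expression value before re-simulating the model. The file name, reaction and parameter identifiers, and scaling rule below are hypothetical, not SBMLmod's actual implementation.

```python
# Hypothetical sketch (not SBMLmod itself): scale a reaction's Vmax-type local
# parameter by a gene expression value before simulating the modified model.
import libsbml

EXPRESSION = {"R_TDO": 2.4}          # hypothetical tissue-specific expression factors

doc = libsbml.readSBMLFromFile("tryptophan_model.xml")    # hypothetical file name
model = doc.getModel()

for reaction_id, factor in EXPRESSION.items():
    reaction = model.getReaction(reaction_id)
    if reaction is None or reaction.getKineticLaw() is None:
        continue
    vmax = reaction.getKineticLaw().getParameter("Vmax")  # hypothetical parameter id
    if vmax is not None:
        vmax.setValue(vmax.getValue() * factor)           # integrate the data point

libsbml.writeSBMLToFile(doc, "tryptophan_model_modified.xml")
```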
NASA Astrophysics Data System (ADS)
Cole, M.; Bambacus, M.; Lynnes, C.; Sauer, B.; Falke, S.; Yang, W.
2007-12-01
NASA's vast array of scientific data within its Distributed Active Archive Centers (DAACs) is especially valuable both to traditional research scientists and to the emerging market of Earth Science Information Partners. For example, the air quality science and management communities are increasingly using satellite-derived observations in their analyses and decision making. The Air Quality Cluster in the Federation of Earth Science Information Partners (ESIP) uses web infrastructures of interoperability, or Service Oriented Architecture (SOA), to extend data exploration, use, and analysis, and provides a user environment for DAAC products. In an effort to continually offer these NASA data to the broadest research community audience, and reusing emerging technologies, both NASA's Goddard Earth Science (GES) and Land Process (LP) DAACs have engaged in a web services pilot project. Through these projects both GES and LP have exposed data through the Open Geospatial Consortium's (OGC) Web Services standards. Reusing several different existing applications and implementation techniques, GES and LP successfully exposed a variety of data through distributed systems to be ingested into multiple end-user systems. The results of this project will enable researchers worldwide to access some of NASA's GES & LP DAAC data through OGC protocols. This functionality encourages interdisciplinary research while increasing data use through advanced technologies. This paper will concentrate on the implementation and use of OGC Web Services, specifically Web Map and Web Coverage Services (WMS, WCS), at the GES and LP DAACs, and the value of these services within scientific applications, including integration with the DataFed air quality web infrastructure and the development of data analysis web applications.
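Consuming such OGC services from a script is straightforward; the sketch below requests a map image from a WMS endpoint using the OWSLib library in Python. The service URL and layer name are placeholders, since the abstract does not give the actual DAAC endpoints or layer identifiers.

```python
# Sketch of retrieving imagery from an OGC Web Map Service with OWSLib.
# The endpoint URL and layer name are placeholders, not actual DAAC services.
from owslib.wms import WebMapService

wms = WebMapService("https://example.org/daac/wms", version="1.1.1")  # hypothetical
img = wms.getmap(
    layers=["aerosol_optical_depth"],       # hypothetical layer identifier
    srs="EPSG:4326",
    bbox=(-125.0, 24.0, -66.0, 50.0),       # conterminous US
    size=(800, 400),
    format="image/png",
    transparent=True,
)
with open("air_quality_layer.png", "wb") as out:
    out.write(img.read())
```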
NASA Astrophysics Data System (ADS)
Thomas, N.; Galey, B.; Zhu, Z.; Sleeter, B. M.; Lehmer, E.
2015-12-01
The LandCarbon web application (http://landcarbon.org) is a collaboration between the U.S. Geological Survey and U.C. Berkeley's Geospatial Innovation Facility (GIF). The LandCarbon project is a national assessment focused on improved understanding of carbon sequestration and greenhouse gas fluxes in and out of ecosystems related to land use, using scientific capabilities from USGS and other organizations. The national assessment is conducted at a regional scale, covers all 50 states, and incorporates data from remote sensing, land change studies, aquatic and wetland data, hydrological and biogeochemical modeling, and wildfire mapping to estimate baseline and future potential carbon storage and greenhouse gas fluxes. The LandCarbon web application is a geospatial portal that allows for a sophisticated data delivery system as well as a suite of engaging tools that showcase the LandCarbon data using interactive web based maps and charts. The web application was designed to be flexible and accessible to meet the needs of a variety of users. Casual users can explore the input data and results of the assessment for a particular area of interest in an intuitive and interactive map, without the need for specialized software. Users can view and interact with maps, charts, and statistics that summarize the baseline and future potential carbon storage and fluxes for U.S. Level 2 Ecoregions for 3 IPCC emissions scenarios. The application allows users to access the primary data sources and assessment results for viewing and download, and also to learn more about the assessment's objectives, methods, and uncertainties through published reports and documentation. The LandCarbon web application is built on free and open source libraries including Django and D3. The GIF has developed the Django-Spillway package, which facilitates interactive visualization and serialization of complex geospatial raster data. The underlying LandCarbon data is available through an open application programming interface (API), which will allow other organizations to build their own custom applications and tools. New features such as finer scale aggregation and an online carbon calculator are being added to the LandCarbon web application to continue to make the site interactive, visually compelling, and useful for a wide range of users.
Web 2.0 applications in medicine: trends and topics in the literature.
Boudry, Christophe
2015-04-01
The World Wide Web has changed research habits, and these changes were further expanded when "Web 2.0" became popular in 2005. Bibliometrics is a helpful tool used for describing patterns of publication, for interpreting progression over time, and the geographical distribution of research in a given field. Few studies employing bibliometrics, however, have been carried out on the correlative nature of scientific literature and Web 2.0. The aim of this bibliometric analysis was to provide an overview of Web 2.0 implications in the biomedical literature. The objectives were to assess the growth rate of literature, key journals, authors, and country contributions, and to evaluate whether the various Web 2.0 applications were expressed within this biomedical literature, and if so, how. A specific query with keywords chosen to be representative of Web 2.0 applications was built for the PubMed database. Articles related to Web 2.0 were downloaded in Extensible Markup Language (XML) and were processed through developed hypertext preprocessor (PHP) scripts, then imported to Microsoft Excel 2010 for data processing. A total of 1347 articles were included in this study. The number of articles related to Web 2.0 has been increasing from 2002 to 2012 (average annual growth rate was 106.3% with a maximum of 333% in 2005). The United States was by far the predominant country for authors, with 514 articles (54.0%; 514/952). The second and third most productive countries were the United Kingdom and Australia, with 87 (9.1%; 87/952) and 44 articles (4.6%; 44/952), respectively. Distribution of number of articles per author showed that the core population of researchers working on Web 2.0 in the medical field could be estimated at approximately 75. In total, 614 journals were identified during this analysis. Using Bradford's law, 27 core journals were identified, among which three (Studies in Health Technology and Informatics, Journal of Medical Internet Research, and Nucleic Acids Research) produced more than 35 articles related to Web 2.0 over the period studied. A total of 274 words in the field of Web 2.0 were found after manual sorting of the 15,878 words appearing in title and abstract fields for articles. Word frequency analysis reveals "blog" as the most recurrent, followed by "wiki", "Web 2.0", "social media", "Facebook", "social networks", "blogger", "cloud computing", "Twitter", and "blogging". All categories of Web 2.0 applications were found, indicating the successful integration of Web 2.0 into the biomedical field. This study shows that the biomedical community is engaged in the use of Web 2.0 and confirms its high level of interest in these tools. Therefore, changes in the ways researchers use information seem to be far from over.
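For readers who want to reproduce the retrieval step, the following is an analogous sketch in Python (the study itself used PHP scripts): it queries PubMed through the NCBI E-utilities and saves the matching records as XML. The query term shown is illustrative only, not the study's actual Web 2.0 query.

```python
# Analogous retrieval sketch in Python: search PubMed via NCBI E-utilities and
# download the matching records as XML for downstream processing.

import requests

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

# 1. Search PubMed for matching article identifiers (PMIDs).
search = requests.get(
    f"{EUTILS}/esearch.fcgi",
    params={"db": "pubmed", "term": "web 2.0 OR wiki OR blog",  # illustrative query
            "retmax": 200, "retmode": "json"},
    timeout=30,
).json()
pmids = search["esearchresult"]["idlist"]

# 2. Fetch the full records for those PMIDs as XML.
fetch = requests.get(
    f"{EUTILS}/efetch.fcgi",
    params={"db": "pubmed", "id": ",".join(pmids), "retmode": "xml"},
    timeout=60,
)
with open("pubmed_records.xml", "wb") as f:
    f.write(fetch.content)
```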
CH5M3D: an HTML5 program for creating 3D molecular structures.
Earley, Clarke W
2013-11-18
While a number of programs and web-based applications are available for the interactive display of 3-dimensional molecular structures, few of these provide the ability to edit these structures. For this reason, we have developed a library written in JavaScript to allow for the simple creation of web-based applications that should run on any browser capable of rendering HTML5 web pages. While our primary interest in developing this application was for educational use, it may also prove useful to researchers who want a light-weight application for viewing and editing small molecular structures. Molecular compounds are drawn on the HTML5 Canvas element, with the JavaScript code making use of standard techniques to allow display of three-dimensional structures on a two-dimensional canvas. Information about the structure (bond lengths, bond angles, and dihedral angles) can be obtained using a mouse or other pointing device. Both atoms and bonds can be added or deleted, and rotation about bonds is allowed. Routines are provided to read structures either from the web server or from the user's computer, and creation of galleries of structures can be accomplished with only a few lines of code. Documentation and examples are provided to demonstrate how users can access all of the molecular information for creation of web pages with more advanced features. A light-weight (≈ 75 kb) JavaScript library has been made available that allows for the simple creation of web pages containing interactive 3-dimensional molecular structures. Although this library is designed to create web pages, a web server is not required. Installation on a web server is straightforward and does not require any server-side modules or special permissions. The ch5m3d.js library has been released under the GNU GPL version 3 open-source license and is available from http://sourceforge.net/projects/ch5m3d/.
ERIC Educational Resources Information Center
Lazarinis, Fotis
2014-01-01
iLM is a Web-based application for the representation, management and sharing of IMS LIP-conformant user profiles. The tool is developed using a service-oriented architecture with an emphasis on easy data sharing. Data elicitation from user profiles is based on the use of XQuery scripts, and sharing with other applications is achieved through…
Web-Enabled Systems for Student Access.
ERIC Educational Resources Information Center
Harris, Chad S.; Herring, Tom
1999-01-01
California State University, Fullerton is developing a suite of server-based, Web-enabled applications that distribute the functionality of its student information system software to external customers without modifying the mainframe applications or databases. The cost-effective, secure, and rapidly deployable business solution involves using the…
New Perspectives on Intelligence Collection and Processing
2016-06-01
...gained attention in recent years with applications in areas such as web advertising, classification, and decision making. In this thesis, we develop a...
Information Retrieval System for Japanese Standard Disease-Code Master Using XML Web Service
Hatano, Kenji; Ohe, Kazuhiko
2003-01-01
An information retrieval system for the Japanese Standard Disease-Code Master using an XML Web Service has been developed. An XML Web Service is a new distributed processing mechanism built on standard internet technologies. Through the seamless remote method invocation that the XML Web Service provides, users can obtain the latest disease-code master information from rich desktop applications or internet web sites that refer to this service. PMID:14728364
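A minimal sketch of how a client might invoke such an XML Web Service from Python is shown below, using the zeep SOAP library; the WSDL URL and operation name are hypothetical stand-ins for the disease-code master service described above.

```python
# Hedged sketch of calling an XML Web Service (SOAP) with the zeep library.
# The WSDL URL and operation name are hypothetical placeholders.

from zeep import Client

client = Client("http://example.jp/disease-master/service?wsdl")  # hypothetical WSDL

# Invoke a remote method exposed by the service (operation name is illustrative).
result = client.service.GetDiseaseMaster(code="H54.0")
print(result)
```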
Web-based applications for building, managing and analysing kinetic models of biological systems.
Lee, Dong-Yup; Saha, Rajib; Yusufi, Faraaz Noor Khan; Park, Wonjun; Karimi, Iftekhar A
2009-01-01
Mathematical modelling and computational analysis play an essential role in improving our capability to elucidate the functions and characteristics of complex biological systems such as metabolic, regulatory and cell signalling pathways. The modelling and concomitant simulation render it possible to predict the cellular behaviour of systems under various genetically and/or environmentally perturbed conditions. This motivates systems biologists/bioengineers/bioinformaticians to develop new tools and applications, allowing non-experts to easily conduct such modelling and analysis. However, among a multitude of systems biology tools developed to date, only a handful of projects have adopted a web-based approach to kinetic modelling. In this report, we evaluate the capabilities and characteristics of current web-based tools in systems biology and identify desirable features, limitations and bottlenecks for further improvements in terms of usability and functionality. A short discussion on software architecture issues involved in web-based applications and the approaches taken by existing tools is included for those interested in developing their own simulation applications.
Who Was that Masked Man? Biographical Sites on the Web.
ERIC Educational Resources Information Center
Byerly, Greg; Brodie, Carolyn S.
1999-01-01
Identifies some of the best general biographical sites on the Web and offers examples of some categorized biographical sites. Highlights include Web encyclopedias; presidents; women; scientists; children's literature authors and illustrators; popular culture; and classroom applications. (LRW)
The wireless Web and patient care.
Bergeron, B P
2001-01-01
Wireless computing, when integrated with the Web, is poised to revolutionize the practice and teaching of medicine. As vendors introduce wireless Web technologies in the medical community that have been used successfully in the business and consumer markets, clinicians can expect profound increases in the amount of patient data, as well as the ease with which those data are acquired, analyzed, and disseminated. The enabling technologies involved in this transformation to the wireless Web range from the new generation of wireless PDAs, eBooks, and wireless data acquisition peripherals to new wireless network protocols. The rate-limiting step in the application of this technology in medicine is not technology per se but rather how quickly clinicians and their patients come to accept and appreciate the benefits and limitations of the application of wireless Web technology.
How Japanese students characterize information from web-sites.
Iwahara, A; Yamada, M; Hatta, T; Kawakami, A; Okamoto, M
2000-12-01
How 352 Japanese university students regard web-site information was investigated in two kinds of survey. Correspondence analysis and cluster analysis of the questionnaire responses to web-site advertisements showed that students regarded the web-site as a new, alien medium distinct from current media. Students characterised web-sites as complicated, intellectual, and impermanent or not memorable. Students obtained precise information from web-sites but did not use it when making decisions to purchase goods.
Catholic hospital services for vulnerable populations: are system values sufficient determinants?
White, Kenneth R; Chou, Tiang-Hong; Dandi, Roberto
2010-01-01
Catholic hospitals and health systems comprise a substantial segment of nonprofit, mission-driven health care services, with accountability to institutional pressures of the Roman Catholic Church as well as economic pressures for solvency. Values are the way in which the organization expresses its faith-based institutional identity, and they may be used to select services that represent those values. The purpose of this study was to identify whether Catholic health systems' explicit values of justice or compassion (and derivatives of those words, known to have similar meaning) were associated with a greater number of system member hospitals' services aimed at vulnerable populations. Using information from Web sites of 41 Catholic health systems in 2007 and data describing their 452 hospitals from the American Hospital Association Annual Survey, the relationship of health system values with hospital services for vulnerable populations was examined while controlling for organizational, market, and demand variables. Although Catholic hospitals as a whole are more likely to provide services to vulnerable populations than hospitals of other ownership types, the results show that among Catholic hospitals, values of justice or compassion are not associated with more services (as defined in this study) that reflect those values. System hospitals likely to have more services that represent the values of justice and compassion are larger, have a higher Medicaid payer mix, are located in less dense urban areas, and are members of geographically dispersed systems. Hospitals select services that may represent symbolic system values, but community need and financial means are stronger determinants. To bolster community benefit and justify tax-exempt status, Catholic hospitals and systems may benefit from further defining, analyzing, and reporting the impact of access to relatively unprofitable services for previously underserved vulnerable populations.
Temporal trends in human vulnerability to excessive heat
NASA Astrophysics Data System (ADS)
Sheridan, Scott C.; Allen, Michael J.
2018-04-01
Over recent decades, studies have examined various morbidity and mortality outcomes associated with heat exposure. This review explores the collective knowledge of the temporal trends of heat on human health, with regard to the hypothesis that humans are less vulnerable to heat events presently than in the past. Using Web of Science and Scopus, the authors identified all peer-reviewed articles that contained keywords on human impact (e.g. mortality, morbidity) and meteorological component (e.g. heat, heatwave). After sorting, a total of 71 articles, both case studies and epidemiological studies, contained explicit assessments of temporal trends in human vulnerability, and thus were used in this review. Most of the studies utilized mortality data, focused on the developed world, and showed a general decrease in heat sensitivity. Factors such as the implementation of a heat warning system, increased awareness, and improved quality of life were cited as contributing factors that led to the decreased impact of heat. Despite the overall recent decreases in heat vulnerability, spatial variability was shown, and differences with respect to health outcomes were also discussed. Several papers noted increases in heat’s impact on human health, particularly when unprecedented conditions occurred. Further, many populations, from outdoor workers to rural residents, in addition to the populations in much of the developing world, have been significantly underrepresented in research to date, and temporal changes in their vulnerability should be assessed in future studies. Moreover, continued monitoring and improvement of heat intervention is needed; with projected changes in the frequency, duration, and intensity of heat events combined with shifts in demographics, heat will remain a major public health issue moving forward.
Dynamic selection mechanism for quality of service aware web services
NASA Astrophysics Data System (ADS)
D'Mello, Demian Antony; Ananthanarayana, V. S.
2010-02-01
A web service is an interface to a software component that can be accessed through standard Internet protocols. Web service technology enables application-to-application communication and interoperability. The increasing number of web service providers throughout the globe has produced numerous web services providing the same or similar functionality. This necessitates the use of tools and techniques to search for suitable services available over the Web. UDDI (universal description, discovery and integration) is the first initiative for finding suitable web services based on the requester's functional demands. However, the requester's requirements may also include non-functional aspects like quality of service (QoS). In this paper, the authors define a QoS model for QoS-aware and business-driven web service publishing and selection. The authors propose a QoS requirement format for requesters to specify their complex demands on QoS for web service selection. The authors define a tree structure called the quality constraint tree (QCT) to represent the requester's variety of requirements on QoS properties with varied preferences. The paper proposes a QoS broker based architecture for web service selection, which allows requesters to specify their QoS requirements and select the qualitatively optimal web service. A web service selection algorithm is presented, which ranks the functionally similar web services based on the degree of satisfaction of the requester's QoS requirements and preferences. The paper defines web service provider qualities to distinguish qualitatively competitive web services. The paper also presents the modelling and selection mechanism for the requester's alternative constraints defined on the QoS. The authors implement the QoS broker based system to prove the correctness of the proposed web service selection mechanism.
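To make the selection step concrete, the following minimal Python sketch ranks functionally similar services by a weighted degree of QoS satisfaction; the attribute names, weights, and normalisation are illustrative assumptions and do not reproduce the paper's quality constraint tree or alternative-constraint handling.

```python
# Minimal sketch of QoS-weighted ranking of functionally similar web services.
# Attribute names, weights, and normalisation are illustrative assumptions; the
# paper's quality constraint tree (QCT) is not modelled here.

candidates = {
    "ServiceA": {"response_time_ms": 120, "availability": 0.990, "cost": 0.05},
    "ServiceB": {"response_time_ms": 300, "availability": 0.999, "cost": 0.02},
    "ServiceC": {"response_time_ms": 90,  "availability": 0.950, "cost": 0.10},
}

# Requester preferences: weight per attribute and whether higher values are better.
weights = {"response_time_ms": 0.5, "availability": 0.3, "cost": 0.2}
higher_is_better = {"response_time_ms": False, "availability": True, "cost": False}

def normalise(attr, value):
    """Scale a raw attribute value to [0, 1], where 1 is best across candidates."""
    values = [qos[attr] for qos in candidates.values()]
    lo, hi = min(values), max(values)
    if hi == lo:
        return 1.0
    score = (value - lo) / (hi - lo)
    return score if higher_is_better[attr] else 1.0 - score

def satisfaction(qos):
    """Weighted sum of normalised attribute scores (degree of satisfaction)."""
    return sum(weights[a] * normalise(a, v) for a, v in qos.items())

ranking = sorted(candidates, key=lambda name: satisfaction(candidates[name]), reverse=True)
print(ranking)  # candidates ordered by decreasing QoS satisfaction
```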
NASA Astrophysics Data System (ADS)
Knouz, Najat; Boudhar, Abdelghani; Bachaoui, El Mostafa
2016-04-01
Fresh water is essential to all life on Earth, given its vital role in the survival of living beings and in social, economic and technological development. Groundwater, like surface water, is increasingly threatened by agricultural and industrial pollution. In this respect, assessing groundwater vulnerability to pollution is a very valuable tool for protecting the resource, managing its quality and using it in a sustainable way. The main objective of this study is to evaluate the groundwater vulnerability to pollution of the study area, Beni Amir, located in Tadla, the first irrigated perimeter of Morocco, using the DRASTIC method (depth to water, net recharge, aquifer media, soil media, topography, impact of the vadose zone and hydraulic conductivity), and to assess the impact of each parameter on the DRASTIC vulnerability index through a sensitivity analysis. This study also highlights the role of geographic information systems (GIS) in assessing vulnerability. The vulnerability index is calculated as the sum of the products of the ratings and weights assigned to each DRASTIC parameter. The results revealed four vulnerability classes: 7% of the study area has high vulnerability, 31% is moderately vulnerable, 57% has low vulnerability and 5% has very low vulnerability.
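Since the index is a weighted sum, a minimal sketch of the calculation for a single grid cell is shown below; the weights are the standard DRASTIC weights, while the ratings are purely illustrative and are not values from this study.

```python
# Minimal sketch of the DRASTIC vulnerability index: the weighted sum of the
# ratings assigned to the seven DRASTIC parameters. Standard DRASTIC weights
# are used; the ratings below are illustrative, not values from the study.

weights = {  # standard DRASTIC weights
    "depth_to_water": 5, "net_recharge": 4, "aquifer_media": 3,
    "soil_media": 2, "topography": 1, "vadose_zone_impact": 5,
    "hydraulic_conductivity": 3,
}

ratings = {  # example ratings (1-10) for a single grid cell; purely illustrative
    "depth_to_water": 7, "net_recharge": 6, "aquifer_media": 8,
    "soil_media": 5, "topography": 10, "vadose_zone_impact": 6,
    "hydraulic_conductivity": 4,
}

drastic_index = sum(weights[p] * ratings[p] for p in weights)
print(drastic_index)  # higher values indicate higher pollution vulnerability
```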
Detection and Prevention of Insider Threats in Database Driven Web Services
NASA Astrophysics Data System (ADS)
Chumash, Tzvi; Yao, Danfeng
In this paper, we take the first step to address the gap between the security needs in outsourced hosting services and the protection provided in the current practice. We consider both insider and outsider attacks in the third-party web hosting scenarios. We present SafeWS, a modular solution that is inserted between server side scripts and databases in order to prevent and detect website hijacking and unauthorized access to stored data. To achieve the required security, SafeWS utilizes a combination of lightweight cryptographic integrity and encryption tools, software engineering techniques, and security data management principles. We also describe our implementation of SafeWS and its evaluation. The performance analysis of our prototype shows the overhead introduced by security verification is small. SafeWS will allow business owners to significantly reduce the security risks and vulnerabilities of outsourcing their sensitive customer data to third-party providers.
Protecting posted genes: social networking and the limits of GINA.
Soo-Jin Lee, Sandra; Borgelt, Emily
2014-01-01
The combination of decreased genotyping costs and prolific social media use is fueling a personal genetic testing industry in which consumers purchase and interact with genetic risk information online. Consumers and their genetic risk profiles are protected in some respects by the 2008 federal Genetic Information Nondiscrimination Act (GINA), which forbids the discriminatory use of genetic information by employers and health insurers; however, practical and technical limitations undermine its enforceability, given the everyday practices of online social networking and its impact on the workplace. In the Web 2.0 era, employers in most states can legally search for information about job candidates and employees online, probing social networking sites for personal information that might bear on hiring and employment decisions. We examine GINA's protections for online sharing of genetic information as well as its limitations, and propose policy recommendations to address current gaps that leave employees' genetic information vulnerable in a Web-based world.
NASA Astrophysics Data System (ADS)
Costanzo, Antonio; Montuori, Antonio; Silva, Juan Pablo; Silvestri, Malvina; Musacchio, Massimo; Buongiorno, Maria Fabrizia; Stramondo, Salvatore
2016-08-01
In this work, a web-GIS procedure to map the risk of road blockage in urban environments through the combined use of space-borne and airborne remote sensing sensors is presented. The methodology comprises (1) the provision of a geo-database through the integration of space-borne multispectral images and airborne LiDAR data products; (2) the modeling of building vulnerability, based on the corresponding 3D geometry and construction time information; and (3) the GIS-based mapping of road closure due to seismic-related building collapses, based on the characteristic building height and the width of the road. Experimental results, gathered for the Cosenza urban area, demonstrate the benefits of both the proposed approach and the GIS-based integration of multi-platform remote sensing sensors and techniques for seismic road assessment purposes.
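A simplified sketch of the kind of per-street check such a GIS mapping might apply is given below; the debris-width model (a fixed fraction of building height) and all numeric values are assumptions for illustration, not the rule used in the paper.

```python
# Simplified sketch of a GIS-style road-blockage check from building collapse
# debris. The debris-width model (a fixed fraction of building height) and all
# input values are illustrative assumptions, not the rule used in the paper.

def debris_width(building_height_m, spread_factor=0.5):
    """Assume collapse debris extends a fraction of the building height into the street."""
    return spread_factor * building_height_m

def road_blocked(building_height_m, road_width_m, min_passable_width_m=3.0):
    """Flag the road as blocked if the remaining clear width falls below a passable threshold."""
    clear_width = road_width_m - debris_width(building_height_m)
    return clear_width < min_passable_width_m

# Example: a 12 m tall building fronting an 8 m wide road.
print(road_blocked(building_height_m=12.0, road_width_m=8.0))  # True under these assumptions
```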
EFEHR - the European Facilities for Earthquake Hazard and Risk: beyond the web-platform
NASA Astrophysics Data System (ADS)
Danciu, Laurentiu; Wiemer, Stefan; Haslinger, Florian; Kastli, Philipp; Giardini, Domenico
2017-04-01
European Facilities for Earthquake Hazard and Risk (EFEHR) represents the sustainable community resource for seismic hazard and risk in Europe. The EFEHR web platform is the main gateway to access data, models and tools, as well as to provide expertise relevant for the assessment of seismic hazard and risk. The main services (databases and web platform) are hosted at ETH Zurich and operated by the Swiss Seismological Service (Schweizerischer Erdbebendienst, SED). The EFEHR web portal (www.efehr.org) collects and displays (i) harmonized datasets necessary for hazard and risk modeling, e.g. seismic catalogues, fault compilations, site amplifications, vulnerabilities, inventories; (ii) extensive seismic hazard products, namely hazard curves, uniform hazard spectra and maps for national and regional assessments; (iii) standardized configuration files for re-computing the regional seismic hazard models; and (iv) relevant documentation of harmonized datasets, models and web services. Today, EFEHR distributes the full output of the 2013 European Seismic Hazard Model, ESHM13, as developed within the SHARE project (http://www.share-eu.org/); the latest results of the 2014 Earthquake Model of the Middle East (EMME14), derived within the EMME Project (www.emme-gem.org); the 2001 Global Seismic Hazard Assessment Project (GSHAP) results; and the 2015 updates of the Swiss Seismic Hazard. New datasets related to either seismic hazard or risk will be incorporated as they become available. We present the current status of the EFEHR platform, with a focus on the challenges, summaries of the up-to-date datasets, user experience and feedback, as well as the roadmap to future technological innovation beyond web-platform development. We also show the new services foreseen to fully integrate with the seismological core services of the European Plate Observing System (EPOS).
Aided generation of search interfaces to astronomical archives
NASA Astrophysics Data System (ADS)
Zorba, Sonia; Bignamini, Andrea; Cepparo, Francesco; Knapic, Cristina; Molinaro, Marco; Smareglia, Riccardo
2016-07-01
Astrophysical data provider organizations that host web-based interfaces to provide access to data resources have to cope with possible changes in data management that imply partial rewrites of web applications. To avoid doing this manually, it was decided to develop a dynamically configurable Java EE web application that sets itself up by reading the needed information from configuration files. Specification of what information the astronomical archive database has to expose is managed using the TAP_SCHEMA schema from the IVOA TAP recommendation, which can be edited using a graphical interface. When the configuration steps are complete, the tool builds a WAR file to allow easy deployment of the application.
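As an aside, the information exposed through TAP_SCHEMA can be inspected with a generic client; the hedged Python sketch below uses pyvo against a hypothetical TAP endpoint.

```python
# Minimal sketch of inspecting what an archive exposes through TAP_SCHEMA,
# using the pyvo client. The service URL is a hypothetical placeholder.

import pyvo

tap = pyvo.dal.TAPService("http://archive.example.org/tap")  # hypothetical endpoint

# TAP_SCHEMA.tables is mandated by the IVOA TAP recommendation, so this query
# should work against any compliant service.
results = tap.search("SELECT table_name, description FROM TAP_SCHEMA.tables")
for row in results:
    print(row["table_name"], row["description"])
```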
Patel, Shyamal; Chen, Bor-Rong; Buckley, Thomas; Rednic, Ramona; McClure, Doug; Tarsy, Daniel; Shih, Ludy; Dy, Jennifer; Welsh, Matt; Bonato, Paolo
2010-01-01
Objective long-term health monitoring can improve the clinical management of several medical conditions ranging from cardiopulmonary diseases to motor disorders. In this paper, we present our work toward the development of a home-monitoring system. The system is currently used to monitor patients with Parkinson's disease who experience severe motor fluctuations. Monitoring is achieved using wireless wearable sensors whose data are relayed to a remote clinical site via a web-based application. The work herein presented shows that wearable sensors combined with a web-based application provide reliable quantitative information that can be used for clinical decision making.
Web 2 Technologies for Net Native Language Learners: A "Social CALL"
ERIC Educational Resources Information Center
Karpati, Andrea
2009-01-01
In order to make optimal educational use of social spaces offered by thousands of international communities in the second generation web applications termed Web 2 or Social Web, ICT competences as well as social skills are needed for both teachers and learners. The paper outlines differences in competence structures of Net Natives (who came of age…
Metacognitive Skills Development: A Web-Based Approach in Higher Education
ERIC Educational Resources Information Center
Shen, Chun-Yi; Liu, Hsiu-Chuan
2011-01-01
Although there were studies that presented the applications of metacognitive skill training, the research on web-based metacognitive skills training are few. The purpose of this study is to design a web-based learning environment and further examine the effect of the web-based training. A pretest-posttest quasi-experimental design was used in this…
Therapeutic Uses of the WebCam in Child Psychiatry
ERIC Educational Resources Information Center
Chlebowski, Susan; Fremont, Wanda
2011-01-01
Objective: The authors provide examples for the use of the WebCam as a therapeutic tool in child psychiatry, discussing cases to demonstrate the application of the WebCam, which is most often used in psychiatry training programs during resident supervision and for case presentations. Method: Six cases illustrate the use of the WebCam in individual…
Streamlining Data for Cross-Platform Web Delivery
ERIC Educational Resources Information Center
Watkins, Sean; Battles, Jason; Vacek, Rachel
2013-01-01
Smartphone users expect the presentation of Web sites on their mobile browsers to look and feel like native applications. With the pressure on library Web developers to produce app-like mobile sites, there is often a rush to get a site up without considering the importance of reusing or even restructuring the data driving the Web sites. An…
ERIC Educational Resources Information Center
Dodge, Lucy
The report describes the two Web site management and design programs at San Jose College (California) and provides employment information and job market analysis for the field. The College's Web Site Administration and Web Application Solutions programs offer classes designed to give students the necessary skills in administering a Web site and in…
User Needs of Digital Service Web Portals: A Case Study
ERIC Educational Resources Information Center
Heo, Misook; Song, Jung-Sook; Seol, Moon-Won
2013-01-01
The authors examined the needs of digital information service web portal users. More specifically, the needs of Korean cultural portal users were examined as a case study. The conceptual framework of a web-based portal is that it is a complex, web-based service application with characteristics of information systems and service agents. In…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-27
... representative, already holds an NRC-issued digital ID certificate). Based upon this information, the Secretary... online, Web-based submission form. In order to serve documents through EIE, users will be required to install a Web browser plug-in from the NRC Web site. Further information on the Web-based submission form...
ERIC Educational Resources Information Center
Fast, Karl V.; Campbell, D. Grant
2001-01-01
Compares the implied ontological frameworks of the Open Archives Initiative Protocol for Metadata Harvesting and the World Wide Web Consortium's Semantic Web. Discusses current search engine technology, semantic markup, indexing principles of special libraries and online databases, and componentization and the distinction between data and…
Integrating Climate Information and Decision Processes for Regional Climate Resilience
NASA Astrophysics Data System (ADS)
Buizer, James; Goddard, Lisa; Guido, Zackry
2015-04-01
An integrated multi-disciplinary team of researchers from the University of Arizona and the International Research Institute for Climate and Society at Columbia University have joined forces with communities and institutions in the Caribbean, South Asia and West Africa to develop relevant, usable climate information and connect it to real decisions and development challenges. The overall objective of the "Integrating Climate Information and Decision Processes for Regional Climate Resilience" program is to build community resilience to negative impacts of climate variability and change. We produce and provide science-based climate tools and information to vulnerable peoples and the public, private, and civil society organizations that serve them. We face significant institutional challenges because of the geographical and cultural distance between the locale of climate tool-makers and the locale of climate tool-users and because of the complicated, often-inefficient networks that link them. To use an accepted metaphor, there is great institutional difficulty in coordinating the supply of and the demand for useful climate products that can be put to the task of building local resilience and reducing climate vulnerability. Our program is designed to reduce the information constraint and to initiate a linkage that is more demand driven, and which provides a set of priorities for further climate tool generation. A demand-driven approach to the co-production of appropriate and relevant climate tools seeks to meet the direct needs of vulnerable peoples as these needs have been canvassed empirically and as the benefits of application have been adequately evaluated. We first investigate how climate variability and climate change affect the livelihoods of vulnerable peoples. In so doing we assess the complex institutional web within which these peoples live -- the public agencies that serve them, their forms of access to necessary information, the structural constraints under which they make their decisions, and the non-public institutions of support that are available to them. We then interpret this complex reality in terms of the demand for science-based climate products and analyze the channels through which such climate support must pass, thus linking demand assessment with the scientific capacity to create appropriate decision support tools. In summary, the approach we employ is: 1) Demand-driven, beginning with a knowledge of the impacts of climate variability and change upon targeted populations, 2) Focused on vulnerability and resilience, which requires an understanding of broader networks of institutional actors who contribute to the adaptive capacity of vulnerable peoples, 3) Needs-based in that the climate needs matrix set priorities for the assessment of relevant climate products, 4) Dynamic in that the producers of climate products are involved at the point of demand assessment and can respond directly to stated needs, 5) Reflective in that the impacts of climate product interventions are subject to monitoring and evaluation throughout the process. Methods, approaches and preliminary results of our work in the Caribbean will be presented.
Design for Connecting Spatial Data Infrastructures with Sensor Web (sensdi)
NASA Astrophysics Data System (ADS)
Bhattacharya, D.; M., M.
2016-06-01
Integrating Sensor Web with Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and an SDI. The research seeks to harness the sensed environment by utilizing domain-specific sensor data to create a generalized sensor web framework. The challenges are semantic enablement for Spatial Data Infrastructures and connecting the interfaces of the SDI with the interfaces of the Sensor Web. The proposed research plan is to identify sensor data sources, set up an open-source SDI, match the APIs and functions between the Sensor Web and the SDI, and carry out case studies such as hazard and urban applications. We take up co-operative development of SDI best practices to enable a new realm of a location-enabled and semantically enriched World Wide Web - the "Geospatial Web" or "Geosemantic Web" - by setting up a one-to-one correspondence between WMS, WFS, WCS and Metadata services on the one hand and the 'Sensor Observation Service' (SOS); the 'Sensor Planning Service' (SPS); the 'Sensor Alert Service' (SAS); and a service that facilitates asynchronous message interchange between users and services, and between two OGC-SWE services, called the 'Web Notification Service' (WNS), on the other. In conclusion, integrating the SDI with the Sensor Web is of importance to geospatial studies. The integration can be done by merging the common OGC interfaces of the SDI and the Sensor Web. Multi-usability studies to validate the integration have to be undertaken as future research.
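A minimal sketch of the kind of interface pairing discussed above is given below, assuming OWSLib's WMS and SOS clients; the endpoint URLs are hypothetical, and listing the capabilities of each service is only a first step toward the one-to-one correspondence described.

```python
# Minimal sketch of pairing an SDI map service with a Sensor Web observation
# service via OWSLib. The endpoint URLs are hypothetical placeholders.

from owslib.wms import WebMapService
from owslib.sos import SensorObservationService

wms = WebMapService("https://sdi.example.org/geoserver/wms", version="1.3.0")
sos = SensorObservationService("https://sensors.example.org/sos", version="1.0.0")

# List what each side exposes, as a first step toward matching interfaces.
print(list(wms.contents))  # layer names published by the SDI
print(list(sos.contents))  # observation offerings published by the Sensor Web
```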
Goldberg, Rachel E; Short, Susan E
2016-03-01
Millions of children in Sub-Saharan Africa live with adults, often parents, who are HIV-infected or ill due to AIDS. These children experience social, emotional, and health vulnerabilities that overlap with, but are not necessarily the same as, those of orphans or other vulnerable children. Despite their distinctive vulnerabilities, research aimed at understanding the situation of these children has been limited until very recently. This review summarizes the state of knowledge based on a systematic search of PubMed and Web of Science that identified 47 empirical research articles that examined either the population prevalence of children living with HIV-infected or AIDS-sick adults, or the consequences of adult HIV infection or AIDS illness for child well-being. This review confirms that this population of children is substantial in size, and that the vulnerabilities they experience are multi-faceted, spanning physical and emotional health and schooling. Mechanisms were examined empirically in only a small number of studies, but encompass poverty, transmission of opportunistic infections, care for unwell adults, adult distress, AIDS stigma, lack of social support, maternal breastfeeding issues, and vertical HIV transmission. Some evidence is provided that infants, adolescents, children with infected or ill mothers, and children living with severely ill adults are particularly vulnerable. Future research would benefit from more attention to causal inference and further characterization of processes and circumstances related to vulnerability and resilience. It would also benefit from further study of variation in observed associations between adult HIV/AIDS and child well-being based on characteristics such as age, sex, kinship, severity of illness, TB co-infection, disclosure, and serostatus awareness. Almost one-quarter of the studies reviewed did not investigate variation based on any of these factors. More nuanced understanding of the short- and long-term effects of adult HIV on children's needs and circumstances will be important to ongoing discussions about equity in policies and interventions.
Working with South Florida County Planners to Understand and Mitigate Uncertain Climate Risks
NASA Astrophysics Data System (ADS)
Knopman, D.; Groves, D. G.; Berg, N.
2017-12-01
This talk describes a novel approach for evaluating climate change vulnerabilities and adaptations in Southeast Florida to support long-term resilience planning. The work is unique in that it combines state-of-the-art hydrologic modeling with the region's long-term land use and transportation plans to better assess the future climate vulnerability and adaptations for the region. Addressing uncertainty in future projections is handled through the use of decisionmaking under deep uncertainty methods. Study findings, including analysis of key tradeoffs, were conveyed to the region's stakeholders through an innovative web-based decision support tool. This project leverages existing groundwater models spanning Miami-Dade and Broward Counties developed by the USGS, along with projections of land use and asset valuations for Miami-Dade and Broward County planning agencies. Model simulations are executed on virtual cloud-based servers for a highly scalable and parallelized platform. Groundwater elevations and the saltwater-freshwater interface and intrusion zones from the integrated modeling framework are analyzed under a wide range of long-term climate futures, including projected sea level rise and precipitation changes. The hydrologic hazards are then combined with current and future land use and asset valuation projections to estimate assets at risk across the range of futures. Lastly, an interactive decision support tool highlights the areas with critical climate vulnerabilities; distinguishes between vulnerability due to new development, increased climate hazards, or both; and provides guidance for adaptive management and development practices and decisionmaking in Southeast Florida.
2016-04-01
the DOD will put DOD systems and data at a risk level comparable to that of their neighbors in the cloud. Just as a user browses a Web page on the... proxy servers for controlling user access to Web pages, and large-scale storage for data management. Each of these devices allows access to the... user to develop applications. Acunetix.com describes Web applications as “computer programs allowing Website visitors to submit and retrieve data
Web-Based Distributed Simulation of Aeronautical Propulsion System
NASA Technical Reports Server (NTRS)
Zheng, Desheng; Follen, Gregory J.; Pavlik, William R.; Kim, Chan M.; Liu, Xianyou; Blaser, Tammy M.; Lopez, Isaac
2001-01-01
An application was developed to allow users to run and view Numerical Propulsion System Simulation (NPSS) engine simulations from web browsers. Simulations were performed on multiple Information Power Grid (IPG) test beds. The Common Object Request Broker Architecture (CORBA) was used for brokering data exchange among machines, and IPG/Globus was used for job scheduling and remote process invocation. Web server scripting was performed with JavaServer Pages (JSP). This application has proven to be an effective and efficient way to couple heterogeneous distributed components.
ERIC Educational Resources Information Center
Technology & Learning, 2005
2005-01-01
In recent years, the widespread availability of networks and the flexibility of Web browsers have shifted the industry from a client-server model to a Web-based one. In the client-server model of computing, clients run applications locally, with the servers managing storage, printing functions, and network traffic. Because every client is…
Kortüm, K; Reznicek, L; Leicht, S; Ulbig, M; Wolf, A
2013-07-01
The importance and complexity of clinical trials are continuously increasing, especially in innovative specialties like ophthalmology. An efficient organisational structure at the clinical trial site is therefore essential. In the internet era, this can be accomplished with web-based applications. In total, three software applications (Vibe on Prem, SharePoint and an open-source alternative) were evaluated at an ophthalmological clinical trial site. The assessment criteria were reliability, ease of administration, usability, scheduling, task lists, knowledge management, operating costs and worldwide availability. Vibe on Prem, customised by the local university, met the assessment criteria best; the other applications were not as strong. By introducing a web-based application for administering and organising an ophthalmological trial site, studies can be conducted in a more efficient and reliable manner.
Strategies for expanding health insurance coverage in vulnerable populations
Jia, Liying; Yuan, Beibei; Huang, Fei; Lu, Ying; Garner, Paul; Meng, Qingyue
2014-01-01
Background Health insurance has the potential to improve access to health care and protect people from the financial risks of diseases. However, health insurance coverage is often low, particularly for people most in need of protection, including children and other vulnerable populations. Objectives To assess the effectiveness of strategies for expanding health insurance coverage in vulnerable populations. Search methods We searched the Cochrane Central Register of Controlled Trials (CENTRAL), part of The Cochrane Library (www.thecochranelibrary.com) (searched 2 November 2012), PubMed (searched 1 November 2012), EMBASE (searched 6 July 2012), Global Health (searched 6 July 2012), IBSS (searched 6 July 2012), WHO Library Database (WHOLIS) (searched 1 November 2012), IDEAS (searched 1 November 2012), ISI-Proceedings (searched 1 November 2012), OpenGrey (changed from OpenSIGLE) (searched 1 November 2012), African Index Medicus (searched 1 November 2012), BLDS (searched 1 November 2012), Econlit (searched 1 November 2012), ELDIS (searched 1 November 2012), ERIC (searched 1 November 2012), HERDIN NeON Database (searched 1 November 2012), IndMED (searched 1 November 2012), JSTOR (searched 1 November 2012), LILACS (searched 1 November 2012), NTIS (searched 1 November 2012), PAIS (searched 6 July 2012), Popline (searched 1 November 2012), ProQuest Dissertation & Theses Database (searched 1 November 2012), PsycINFO (searched 6 July 2012), SSRN (searched 1 November 2012), Thai Index Medicus (searched 1 November 2012), World Bank (searched 2 November 2012), WanFang (searched 3 November 2012), China National Knowledge Infrastructure (CHKD-CNKI) (searched 2 November 2012). In addition, we searched the reference lists of included studies and carried out a citation search for the included studies via Web of Science to find other potentially relevant studies. Selection criteria Randomised controlled trials (RCTs), non-randomised controlled trials (NRCTs), controlled before-after (CBA) studies and interrupted time series (ITS) studies that evaluated the effects of strategies on increasing health insurance coverage for vulnerable populations. We defined strategies as measures to improve the enrolment of vulnerable populations into health insurance schemes. Two categories and six specified strategies were identified as the interventions. Data collection and analysis At least two review authors independently extracted data and assessed the risk of bias. We undertook a structured synthesis. Main results We included two studies, both from the United States. People offered health insurance information and application support by community-based case managers were probably more likely to enrol their children into health insurance programmes (risk ratio (RR) 1.68, 95% confidence interval (CI) 1.44 to 1.96, moderate quality evidence) and were probably more likely to continue insuring their children (RR 2.59, 95% CI 1.95 to 3.44, moderate quality evidence). Of all the children that were insured, those in the intervention group may have been insured quicker (47.3 fewer days, 95% CI 20.6 to 74.0 fewer days, low quality evidence) and parents may have been more satisfied on average (satisfaction score average difference 1.07, 95% CI 0.72 to 1.42, low quality evidence). In the second study, applications were handed out in emergency departments at hospitals, compared to not handing out applications, and may have had an effect on enrolment (RR 1.5, 95% CI 1.03 to 2.18, low quality evidence).
Authors' conclusions Community-based case managers who provide health insurance information and application support, and who negotiate with the insurer, probably increase enrolment of children in health insurance schemes. However, the transferability of this intervention to other populations or other settings is uncertain. Handing out insurance application materials in hospital emergency departments may help increase the enrolment of children in health insurance schemes. Further studies evaluating the effectiveness of different strategies for expanding health insurance coverage in vulnerable populations are needed in different settings, with careful attention given to study design. PLAIN LANGUAGE SUMMARY Strategies for expanding health insurance coverage in vulnerable populations Researchers in The Cochrane Collaboration conducted a review of the effect of strategies to increase the number of people from vulnerable populations that are enrolled into health insurance programmes. They searched for all relevant studies and found two studies. Their findings are summarised below. What is a health insurance programme? Governments in many countries offer healthcare services at low rates or free of charge to all their citizens, often paying for these services through taxes. However, in many developing countries and some developed countries this is not the case. In these countries, many people get their healthcare expenses covered through government health insurance programmes, which are often paid for through membership fees. But certain groups of people, such as children, the elderly, women, people with low incomes, people living in rural areas, racial and ethnic minorities, immigrants, and people with chronic diseases or disabilities, are less likely to be members of these programmes even though they are more likely to have health problems. In some of these countries governments have tried to make sure that health insurance programmes cover these vulnerable groups. One way of doing this is to improve the design of the programme. For instance, governments can change the rules for who can join the programme or they can make it cheaper to join. But even if a programme is well-designed, people may still not join it. For instance, they may not know that they can become members or they may find the application process too difficult. To address these problems, governments can give people more information about the programme and who can join, or can make the application process easier. What this research says Both studies in this review took place in the USA and were aimed at uninsured children. In the first study, case managers contacted the families of uninsured Latin American children, gave them information about health insurance, helped them apply, and helped them appeal when a wrong decision was made. In the second study, insurance application forms were handed out to the families of children visiting hospital emergency departments. In both studies, these families were compared to families who were not given additional information or support. The studies showed the following: People who are offered health insurance information and application support: - are probably more likely to enrol their children into health insurance programmes (moderate quality evidence); - are probably more likely to continue insuring their children (moderate quality evidence); - may be quicker at getting insurance (low quality evidence); - may be more satisfied with the process of enrolment (low quality evidence).
People who are given insurance application forms in the emergency departments of hospitals: - may be more likely to enrol their children into health insurance programmes (low quality evidence). No unwanted effects were reported in the studies. A possible unwanted effect might be that people could experience the information and support as annoying or unhelpful. However, in the one study that measured the parents' satisfaction, people were more satisfied when given information and support. PMID:25425010
NASA Astrophysics Data System (ADS)
Álvarez Francoso, Jose; Prieto Campos, Antonio; Ojeda Zujar, Jose; Guisado-Pintado, Emilia; Pérez Alcántara, Juan Pedro
2017-04-01
Access to environmental information via web viewers using map services (OGC or proprietary services) has become more frequent, since new information sources (orthophotos, LiDAR, GPS) are highly detailed and thus generate large volumes of data that can barely be disseminated using either analogue (paper maps) or digital (PDF) formats. Moreover, governments and public institutions are concerned about the need to facilitate access to research results and to improve communication about natural hazards to citizens and stakeholders. This information, if adequately disseminated, is ultimately crucial in decision-making processes and risk management approaches, and could help to increase social awareness of environmental issues (particularly climate change impacts). To address this issue, two strategies using web viewer technology are presented for the wide dissemination and communication of the results of beach erosion calculations along the 640 km of the Andalusian coast (southern Spain). Each is oriented to different end users and is thus based on a different methodology. Erosion rates have been calculated at 50 m intervals for different periods (1956-1977-2001-2011) as part of a National Research Project based on the spatialisation and web access of coastal vulnerability indicators for the Andalusian region. The first proposal generates WMS services (following OGC standards) that are made available by GeoServer, using a geoviewer client developed with Leaflet. This viewer is designed for the general public (citizens, politicians, etc.) and combines a set of tools that give access to related documents (PDFs) and visualisation tools (Panoramio pictures, geo-localisation with GPS), displayed within a user-friendly interface. Further, the use of WMS services (implemented on GeoServer) provides a detailed semiology (arrows and proportional symbols, using alongshore coastline buffers to represent data) which not only enhances access to erosion rates but also enables multi-scale data representation. The second proposal, intended for technicians and specialists in the field, includes a geoviewer with an innovative profile (including visualisation of time ranges, application of different uncertainty levels to the data, etc.) to fulfil the needs of these users. For its development, a set of JavaScript libraries combined with OpenLayers (or Leaflet) is implemented to guarantee all the functionalities existing in the basic geoviewer. Further to this, the viewer has been improved by (i) the generation of services on request through the application of a filter in ECQL (Extended Common Query Language), using the GeoServer vendor parameter CQL_FILTER. These dynamic filters allow the final user to predefine the visualised variable, its spatial and temporal domain, a range of specific values and other attributes, thus multiplying the generation of real-time cartography; and (ii) by using the layer's WFS service, the JavaScript application exploits the alphanumeric data to generate related statistics in real time (e.g. mean rates, length of eroded coast, etc.) and interactive graphs (via the HighCharts.js library) which help in the interpretation of beach erosion rates (representing trends and bar diagrams, among others). As a result, two web-based approaches for communicating scientific results to different audiences, with a complete dataset of geo-information, services and functionalities, are implemented.
The combination of standardised environmental data with tailor-made exploitation techniques (interactive maps and real-time statistics) ensures correct access to, and interpretation of, the information.
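To illustrate the vendor-parameter filtering described above, the hedged Python sketch below requests features from a GeoServer WFS endpoint with a CQL_FILTER; the endpoint, layer name, and attribute names are hypothetical placeholders.

```python
# Hedged sketch of requesting filtered features from GeoServer using the
# CQL_FILTER vendor parameter. The endpoint, layer name, and attribute names
# are hypothetical placeholders.

import requests

WFS_URL = "https://geoserver.example.org/geoserver/wfs"  # hypothetical GeoServer

params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "coastal:erosion_rates",                # hypothetical layer
    "outputFormat": "application/json",
    "CQL_FILTER": "period='1956-2011' AND rate < -0.5",  # hypothetical attributes
}

features = requests.get(WFS_URL, params=params, timeout=60).json()["features"]

if features:
    mean_rate = sum(f["properties"]["rate"] for f in features) / len(features)
    print(f"Eroding transects: {len(features)}, mean rate: {mean_rate:.2f} m/yr")
```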
Wauchope, R Don; Estes, Tammara L; Allen, Richard; Baker, James L; Hornsby, Arthur G; Jones, Russell L; Richards, R Peter; Gustafson, David I
2002-02-01
In the intensely farmed corn-growing regions of the mid-western USA, surface waters have often been contaminated by herbicides, principally as a result of rainfall runoff occurring shortly after application of these to corn and other crops. In some vulnerable watersheds, water quality criteria for chronic human exposure through drinking water are occasionally exceeded. We selected three settings representative of vulnerable corn-region watersheds, and used the PRZM-EXAMS model with the Index Reservoir scenario to predict corn herbicide concentrations in the reservoirs as a function of herbicide properties and use pattern, site characteristics and weather in the watersheds. We compared herbicide application scenarios, including broadcast surface pre-plant atrazine and alachlor applications with a glyphosate pre-plant application, scenarios in which losses of herbicides were mitigated by incorporation or banding, and scenarios in which only glyphosate or glufosinate post-emergent herbicides were used with corn genetically modified to be resistant to them. In the absence of drift, in almost all years a single runoff event dominates the input into the reservoir. As a result, annual average pesticide concentrations are highly correlated with annual maximum daily values. The modeled concentrations were generally higher than those derived from monitoring data, even for no-drift model scenarios. Because of their lower post-emergent application rates and greater soil sorptivity, glyphosate and glufosinate loads in runoff were generally one-fifth to one-tenth those of atrazine and alachlor. These model results indicate that the replacement of pre-emergent corn herbicides with the post-emergent herbicides allowed by genetic modification of crops would dramatically reduce herbicide concentrations in vulnerable watersheds. Given the significantly lower chronic mammalian toxicity of these compounds, and their vulnerability to breakdown in the drinking water treatment process, risks to human populations through drinking water would also be reduced.
ERIC Educational Resources Information Center
Ram, Shri; Anbu K., John Paul; Kataria, Sanjay
2011-01-01
Purpose: This paper seeks to provide an insight into the implementation of some of the innovative Web 2.0 applications at Jaypee University of Information Technology with the aim of exploring the expectations of the users and their awareness and usage of such applications. Design/methodology/approach: The study was undertaken at the Learning…
The classification and assessment of vulnerability of man-land system of oasis city in arid area
NASA Astrophysics Data System (ADS)
Gao, Chao; Lei, Jun; Jin, Fengjun
2013-12-01
The oasis city system is the center of the man-land relationship in arid areas and is the most influential spatio-temporal multiple dynamic system. It is not only the largest area where artificial disturbances occur at a regional scale but also the most concentrated area of human activity in arid regions. In this study, we developed an applicable and convenient method to assess the vulnerability of the man-land system of oasis cities with a vulnerability indicator system, evaluating the sensitivity, adaptability and vulnerability of the eco-environment system, the economic system and the social system respectively. The results showed that the sensitivity and vulnerability of oasis cities in Xinjiang, China differ significantly, while their adaptability differs little. In order to find the inherent differences in the vulnerability of oasis cities, a triangle methodology was adopted to divide the Xinjiang oasis cities into five types. Adaptive development policies specific to individual cities are also proposed based on their vulnerability type and constraining factors.
The Adversarial Route Analysis Tool: A Web Application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Casson, William H. Jr.
2012-08-02
The Adversarial Route Analysis Tool is a kind of Google Maps for adversaries. It is a web-based geospatial application similar to Google Maps that helps the U.S. government plan operations by predicting where an adversary might be. It is easily accessible and maintainable, and it is simple to use without much training.
20 CFR 418.3220 - When is your application considered filed?
Code of Federal Regulations, 2010 CFR
2010-04-01
...? 418.3220 Section 418.3220 Employees' Benefits SOCIAL SECURITY ADMINISTRATION MEDICARE SUBSIDIES... the day it is submitted electronically through our Internet Web site. If a State Medicaid agency... subsidy application from our Internet Web site where the requirements set forth in § 418.3230 are met. ...