Adak, C 2013, 'Dual Layer Textual Message Cryptosystem with Randomized Sequence of Symmetric Key', vol. 4, no. 2, p. 2012.
View description>>
This paper introduces a new concept of textual message encryption and decryption through a pool of randomized symmetric keys and a dual-layer cryptosystem built on the concepts of visual cryptography and steganography. A textual message is converted into two image slides, and the images are encrypted through two different randomized sequences of symmetric key. The decryption is done in the reverse way: the encrypted images are decrypted by those two symmetric keys, and the decrypted image slides are merged together and converted back into the textual message. Here the image sharing is done through the concept of visual cryptography and the textual-message-to-image conversion is done through the concept of steganography.
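The two-slide splitting described above can be illustrated with a minimal XOR-based secret-sharing sketch (not the paper's exact image-based scheme): one share is random, the other is the message XORed with it, and neither share alone reveals anything.

```python
import secrets

def split_shares(message: bytes) -> tuple[bytes, bytes]:
    """Split a message into two random-looking shares;
    either share on its own is indistinguishable from noise."""
    share1 = secrets.token_bytes(len(message))
    share2 = bytes(a ^ b for a, b in zip(share1, message))
    return share1, share2

def merge_shares(share1: bytes, share2: bytes) -> bytes:
    """XOR the two shares back together to recover the message."""
    return bytes(a ^ b for a, b in zip(share1, share2))

s1, s2 = split_shares(b"secret text")
assert merge_shares(s1, s2) == b"secret text"
```

In the paper the shares are image slides and each is further encrypted with its own randomized symmetric key; the sketch only shows the share/merge step.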
Adak, C 2013, 'Robust Steganography Using LSB-XOR and Image Sharing', Tata McGraw-Hill, ISBN (13): 978-1-25-906393-0, 2013.
View description>>
Hiding and securing the secret digital information and data that are transmitted over the internet is of widespread and most challenging interest. This paper presents a new idea of robust steganography using a bitwise-XOR operation between the stego-key-image pixel LSB (Least Significant Bit) value and the secret message character's ASCII binary value (or the secret image pixel value). The stego-key-image is shared in dual layer using the odd-even position of each pixel to make the system robust. Due to image sharing, detection can only be done with all the image shares.
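A minimal sketch of the LSB-XOR idea above, assuming grayscale pixels as plain integers (the paper's dual-layer odd-even image sharing is omitted): each secret bit is XORed with the key image's pixel LSB before being written into the cover pixel's LSB, and extraction XORs with the key LSB again.

```python
def bits(data: bytes):
    """Yield the bits of each byte, most significant first."""
    for byte in data:
        for i in range(7, -1, -1):
            yield (byte >> i) & 1

def embed(cover: list[int], key: list[int], secret: bytes) -> list[int]:
    """Write (secret bit XOR key-pixel LSB) into each cover pixel's LSB."""
    out = list(cover)
    for idx, b in enumerate(bits(secret)):
        out[idx] = (out[idx] & ~1) | (b ^ (key[idx] & 1))
    return out

def extract(stego: list[int], key: list[int], nbytes: int) -> bytes:
    """XOR each stego LSB with the key LSB again to recover the bits."""
    out = bytearray()
    for i in range(nbytes):
        byte = 0
        for j in range(8):
            idx = i * 8 + j
            byte = (byte << 1) | ((stego[idx] & 1) ^ (key[idx] & 1))
        out.append(byte)
    return bytes(out)
```

Because only LSBs change, the stego image is visually identical to the cover, and without the key image the embedded bits look random.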
Anaissi, A, Kennedy, PJ, Goyal, M & Catchpoole, DR 2013, 'A balanced iterative random forest for gene selection from microarray data', BMC Bioinformatics, vol. 14, no. 1, pp. 1-10.
View/Download from: Publisher's site
View description>>
Background: The wealth of gene expression values being generated by high throughput microarray technologies leads to complex high dimensional datasets. Moreover, many cohorts have the problem of imbalanced classes where the number of patients belonging to each class is not the same. With this kind of dataset, biologists need to identify a small number of informative genes that can be used as biomarkers for a disease. Results: This paper introduces a Balanced Iterative Random Forest (BIRF) algorithm to select the most relevant genes for a disease from imbalanced high-throughput gene expression microarray data. Balanced iterative random forest is applied on four cancer microarray datasets: a childhood leukaemia dataset, which represents the main target of this paper, collected from The Children's Hospital at Westmead, NCI 60, a Colon dataset and a Lung cancer dataset. The results obtained by BIRF are compared to those of Support Vector Machine-Recursive Feature Elimination (SVM-RFE), Multi-class SVM-RFE (MSVM-RFE), Random Forest (RF) and Naive Bayes (NB) classifiers. The results of the BIRF approach outperform these state-of-the-art methods, especially in the case of imbalanced datasets. Experiments on the childhood leukaemia dataset show that a 7% ∼ 12% better accuracy is achieved by BIRF over MSVM-RFE with the ability to predict patients in the minor class. The informative biomarkers selected by the BIRF algorithm were validated by repeating training experiments three times to see whether they are globally informative, or just selected by chance. The results show that 64% of the top genes consistently appear in the three lists, and the top 20 genes remain near the top in the other three lists. Conclusion: The designed BIRF algorithm is an appropriate choice to select genes from imbalanced high-throughput gene expression microarray data. BIRF outperforms the state-of-the-art methods, especially the ability to handle the class-imbalanced data. Moreover, the...
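The class-balancing idea at the heart of BIRF can be sketched as a balanced bootstrap, in which each tree's training sample draws equally (with replacement) from every class so the minority class is never swamped. This is an illustrative simplification, not the published algorithm.

```python
import random

def balanced_bootstrap(X, y, rng=random.Random(0)):
    """Draw an equal number of samples (with replacement) from each class,
    so every tree in the forest sees a class-balanced training set."""
    by_class = {}
    for xi, yi in zip(X, y):
        by_class.setdefault(yi, []).append(xi)
    n = min(len(rows) for rows in by_class.values())
    sample_X, sample_y = [], []
    for label, rows in by_class.items():
        for _ in range(n):
            sample_X.append(rng.choice(rows))
            sample_y.append(label)
    return sample_X, sample_y
```

In a full BIRF-style pipeline each tree would be grown on such a sample, feature importances aggregated, and the least informative genes iteratively dropped.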
Arodudu, O, Voinov, A & van Duren, I 2013, 'Assessing bioenergy potential in rural areas – A NEG-EROEI approach', Biomass and Bioenergy, vol. 58, pp. 350-364.
View/Download from: Publisher's site
Ashraf, J, Hussain, OK & Hussain, FK 2013, 'A Framework for Measuring Ontology Usage on the Web', The Computer Journal, vol. 56, no. 9, pp. 1083-1101.
View/Download from: Publisher's site
View description>>
A decade-long conscious effort by the Semantic Web community has resulted in the formation of a decentralized knowledge platform which enables data interoperability at a syntactic and semantic level. For information interoperability, at a syntactic level, RDF provides the standard format for publishing data and RDFS gives structure to the information. For semantic-level interoperability, ontologies are used which allow information dissemination and assimilation among diverse applications and systems, where information is equally accessible and useful to humans and machines. The success of the linked open data project, recognition of explicit semantics (annotated through web ontologies) by search engines and the realized potential advantages of semantic data for publishers have resulted in tremendous growth in the use of web ontologies on the web. In order to promote the adoption of ontologies (to new users), reusability of adopted ontologies, effective and efficient utilization of ontological knowledge and evolution of the ontological model, erudite insight into the usage of ontologies is imperative. While ontology evaluation attempts to evaluate a developed ontology to assess its fitness and quality, it does not provide any insight into how ontologies are being used and what the state of prevalent knowledge patterns is. Realizing the importance of measuring and analysing ontology usage to advance the adoption, reusability and exploitation of ontologies, we present a semantic framework for measuring and analysing ontology usage on the Web on empirical grounds. Our methodological approach is discussed to highlight the detail and role of each step. A framework is presented along with the set of metrics developed to measure ontology usage from different aspects such as ontology richness, usage and incentives to provide a holistic view on the state of ontology usage. The framework is then evaluated using an important use-case scenario to identify the prevalent knowledge ...
Azadeh, A, Jiryaei Sharahi, Z, Ashjari, B & Saberi, M 2013, 'A flexible intelligent algorithm for identification of optimum mix of demographic variables for integrated HSEE-ISO systems: The case of a gas transmission refinery', Journal of Loss Prevention in the Process Industries, vol. 26, no. 6, pp. 1159-1182.
View/Download from: Publisher's site
Azadeh, A, Rouzbahman, M, Saberi, M, Valianpour, F & Keramati, A 2013, 'Improved prediction of mental workload versus HSE and ergonomics factors by an adaptive intelligent algorithm', Safety Science, vol. 58, pp. 59-75.
View/Download from: Publisher's site
Azadeh, A, Saberi, M & Gitiforouz, A 2013, 'An integrated fuzzy mathematical model and principal component analysis algorithm for forecasting uncertain trends of electricity consumption', Quality & Quantity, vol. 47, no. 4, pp. 2163-2176.
View/Download from: Publisher's site
Azadeh, A, Saberi, M, Asadzadeh, SM & Anvarian, N 2013, 'An Adaptive-Network-Based Fuzzy Inference System-Data Envelopment Analysis Algorithm for Optimization of Long-Term Electricity Consumption, Forecasting and Policy Analysis: The Case of Seven Industrialized Countries', Energy Sources, Part B: Economics, Planning, and Policy, vol. 8, no. 1, pp. 56-66.
View/Download from: Publisher's site
Azadeh, A, Saberi, M, Asadzadeh, SM, Hussain, OK & Saberi, Z 2013, 'A neuro-fuzzy-multivariate algorithm for accurate gas consumption estimation in South America with noisy inputs', International Journal of Electrical Power & Energy Systems, vol. 46, pp. 315-325.
View/Download from: Publisher's site
Azadeh, A, Saberi, M, Kazem, A, Ebrahimipour, V, Nourmohammadzadeh, A & Saberi, Z 2013, 'A flexible algorithm for fault diagnosis in a centrifugal pump with corrupted data and noise based on ANN and support vector machine with hyper-parameters optimization', Applied Soft Computing, vol. 13, no. 3, pp. 1478-1485.
View/Download from: Publisher's site
Azadeh, A, Saberi, M, Rouzbahman, M & Saberi, Z 2013, 'An intelligent algorithm for performance evaluation of job stress and HSE factors in petrochemical plants with noise and uncertainty', Journal of Loss Prevention in the Process Industries, vol. 26, no. 1, pp. 140-152.
View/Download from: Publisher's site
Azadeh, A, Sheikhalishahi, M, Asadzadeh, SM, Saberi, M & Neghab, AEP 2013, 'Forecasting and optimization of service level in vague and complex SCM by a flexible neural network–fuzzy mathematical programming approach', The International Journal of Advanced Manufacturing Technology, vol. 68, no. 5-8, pp. 1453-1470.
View/Download from: Publisher's site
Azadeh, A, Sheikhalishahi, M, Saberi, M & Mostaghimi, MH 2013, 'An intelligent multivariate approach for optimum forecasting of daily ozone concentration in large metropolitans with incomplete inputs', International Journal of Productivity and Quality Management, vol. 12, no. 2, p. 209.
View/Download from: Publisher's site
View description>>
Previous studies show that it is quite necessary to accurately analyse and forecast ozone levels, especially in complex and large urban regions with incomplete inputs. Also, there is a need for more precise and efficient models to determine effective warning policies with respect to ozone concentration levels in large cities. This study presents a flexible and adaptive approach to overcome the above issues. Moreover, an adaptive approach based on artificial neural network (ANN), adaptive neuro-fuzzy inference system (ANFIS) and conventional regression for forecasting of daily ozone levels is developed and discussed. The preferred model is selected via mean absolute percentage error (MAPE). The proposed model is applied to one of the most polluted and populated cities in the world. Five pollutants and four meteorological variables are considered as inputs and ozone level is considered as output. The results show the flexibility of the proposed approach. The superiority and applicability of the proposed approach over previous models are also shown and discussed in this paper. Copyright © 2013 Inderscience Enterprises Ltd.
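The model-selection criterion mentioned above, MAPE, is straightforward to compute; a minimal sketch of its standard definition:

```python
def mape(actual, forecast):
    """Mean absolute percentage error; the candidate model with the
    lowest MAPE over the validation data is preferred."""
    return 100.0 * sum(abs((a - f) / a)
                       for a, f in zip(actual, forecast)) / len(actual)
```

For example, a forecast 10% high on one observation and 5% low on another gives a MAPE of 7.5%.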
Behbood, V, Lu, J & Zhang, G 2013, 'Fuzzy bridged refinement domain adaptation: Long-term bank failure prediction', International Journal of Computational Intelligence and Applications, vol. 12, no. 01, p. 1350003.
View/Download from: Publisher's site
View description>>
Machine learning methods, such as neural network (NN) and support vector machine, assume that the training data and the test data are drawn from the same distribution. This assumption may not be satisfied in many real world applications, like long-term financial failure prediction, because the training and test data may each come from different time periods or domains. This paper proposes a novel algorithm known as fuzzy bridged refinement-based domain adaptation to solve the problem of long-term prediction. The algorithm utilizes the fuzzy system and similarity concepts to modify the target instances' labels which were initially predicted by a shift-unaware prediction model. The experiments are performed using three shift-unaware prediction models based on nine different settings including two main situations: (1) there is no labeled instance in the target domain; (2) there are a few labeled instances in the target domain. In these experiments bank failure financial data is used to validate the algorithm. The results demonstrate a significant improvement in the predictive accuracy, particularly in the second situation identified above.
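The refinement step can be illustrated with a toy one-dimensional sketch (not the paper's fuzzy formulation): each target instance's initially predicted label, produced by the shift-unaware model, is blended with a similarity-weighted average of its neighbours' labels. The `sigma` and `alpha` parameters here are hypothetical tuning knobs, not values from the paper.

```python
import math

def refine_labels(instances, initial_labels, sigma=1.0, alpha=0.5):
    """Blend each instance's initial (shift-unaware) label with a
    similarity-weighted average of the other instances' labels."""
    refined = []
    for i, xi in enumerate(instances):
        num = den = 0.0
        for j, xj in enumerate(instances):
            if i == j:
                continue
            w = math.exp(-((xi - xj) ** 2) / (2 * sigma ** 2))  # similarity
            num += w * initial_labels[j]
            den += w
        neighbour = num / den if den else initial_labels[i]
        refined.append(alpha * initial_labels[i] + (1 - alpha) * neighbour)
    return refined
```

An instance whose initial label disagrees with all of its close neighbours is pulled toward their consensus, which is the intuition behind correcting a shift-unaware prediction.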
Belles-Sampera, J, Merigó, JM & Santolino, M 2013, 'Some New Definitions of Indicators for the Choquet Integral', Aggregation Functions in Theory and in Practise, vol. 228, pp. 467-476.
View/Download from: Publisher's site
Belles-Sampera, J, Merigó, JM, Guillén, M & Santolino, M 2013, 'The connection between distortion risk measures and ordered weighted averaging operators', Insurance: Mathematics and Economics, vol. 52, no. 2, pp. 411-420.
View/Download from: Publisher's site
Bennett, ND, Croke, BFW, Guariso, G, Guillaume, JHA, Hamilton, SH, Jakeman, AJ, Marsili-Libelli, S, Newham, LTH, Norton, JP, Perrin, C, Pierce, SA, Robson, B, Seppelt, R, Voinov, AA, Fath, BD & Andreassian, V 2013, 'Characterising performance of environmental models', Environmental Modelling & Software, vol. 40, pp. 1-20.
View/Download from: Publisher's site
Blooma, MJ, Kurian, JC, Chua, AYK, Goh, DHL & Lien, NH 2013, 'Social question answering: Analyzing knowledge, cognitive processes and social dimensions of micro-collaborations', Computers & Education, vol. 69, pp. 109-120.
View/Download from: Publisher's site
Boomer, KMB, Weller, DE, Jordan, TE, Linker, L, Liu, Z, Reilly, J, Shenk, G & Voinov, AA 2013, 'Using Multiple Watershed Models to Predict Water, Nitrogen, and Phosphorus Discharges to the Patuxent Estuary', JAWRA Journal of the American Water Resources Association, vol. 49, no. 1, pp. 15-39.
View/Download from: Publisher's site
View description>>
Boomer, Kathleen M.B., Donald E. Weller, Thomas E. Jordan, Lewis Linker, Zhi-Jun Liu, James Reilly, Gary Shenk, and Alexey A. Voinov, 2012. Using Multiple Watershed Models to Predict Water, Nitrogen, and Phosphorus Discharges to the Patuxent Estuary. Journal of the American Water Resources Association (JAWRA) 1-25. DOI: 10.1111/j.1752-1688.2012.00689.x. Abstract: We analyzed an ensemble of watershed models that predict flow, nitrogen, and phosphorus discharges. The models differed in scope and complexity and used different input data, but all had been applied to evaluate human impacts on discharges to the Patuxent River or to the Chesapeake Bay. We compared predictions to observations of average annual, annual time series, and monthly discharge leaving three basins. No model consistently matched observed discharges better than the others, and predictions differed by as much as 150% for every basin. Models that agreed best with the observations in one basin often were among the worst models for another material or basin. Combining model predictions into a model average improved overall reliability in matching observations, and the range of predictions helped describe uncertainty. The model average was not the closest to the observed discharge for every material, basin, and time frame, but the model average had the highest Nash–Sutcliffe performance across all combinations. Consistently poor performance in predicting phosphorus loads suggests that none of the models capture major controls. Differences among model predictions came from differences in model structures, input data, and the time period considered, and also from errors in the observed discharge. Ensemble watershed modeling helped identify research needs and quantify the uncertainties that should be considered when using the models in management decisions.
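The Nash–Sutcliffe performance measure and the simple model-average prediction discussed above can be sketched as follows; this is the standard NSE definition, not code from the study.

```python
def nash_sutcliffe(observed, modelled):
    """NSE = 1 - SSE / variance-of-observations: 1.0 is a perfect fit,
    0.0 is no better than always predicting the observed mean."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - m) ** 2 for o, m in zip(observed, modelled))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

def ensemble_average(*model_runs):
    """Element-wise mean across several models' prediction series."""
    return [sum(vals) / len(vals) for vals in zip(*model_runs)]
```

The study's finding that the model average scored highest corresponds to `nash_sutcliffe(obs, ensemble_average(m1, m2, ...))` exceeding each individual model's score across materials, basins, and time frames.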
Bressan, N, James, A, Lecce, L & McGregor, C 2013, 'Cardiorespiratory physiological data as an indicator of fentanyl pharmacokinetics and pharmacodynamics in critically ill newborn infants: A case report', Journal of Critical Care, vol. 28, no. 6, pp. e30-e31.
View/Download from: Publisher's site
Budka, M, Juszczyszyn, K, Musial, K & Musial, A 2013, 'Molecular model of dynamic social network based on e-mail communication', Social Network Analysis and Mining, vol. 3, no. 3, pp. 543-563.
View/Download from: Publisher's site
View description>>
In this work we consider an application of a physically inspired sociodynamical model to the modelling of the evolution of an e-mail-based social network. Contrary to the standard approach of sociodynamics, which assumes that system dynamics are expressed with heuristically defined simple rules, we postulate inferring these rules from real data and applying them within a dynamic molecular model. We present how to embed the n-dimensional social space in a Euclidean one. Then, inspired by the Lennard-Jones potential, we define a data-driven social potential function and apply the resultant force to a real e-mail communication network in the course of a molecular simulation, with network nodes taking on the role of interacting particles. We discuss all steps of the modelling process, from data preparation, through embedding and the molecular simulation itself, to the transformation from the embedding space back to a graph structure. The conclusions, drawn from examining the resultant networks in stable, minimum-energy states, emphasize the role of the embedding process projecting the non-metric social graph into the Euclidean space, the significance of the unavoidable loss of information connected with this procedure and the resultant preservation of global rather than local properties of the initial network. We also argue for the applicability of our method to some classes of problems, while signalling the areas which require further research in order to expand this applicability domain.
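The Lennard-Jones potential that inspired the social potential function has, in its standard physical form (the paper's data-driven variant differs), a strong short-range repulsion and a weak long-range attraction:

```python
def lennard_jones(r, epsilon=1.0, sigma=1.0):
    """V(r) = 4*epsilon*((sigma/r)**12 - (sigma/r)**6): strongly repulsive
    below sigma, weakly attractive beyond it, with its minimum of
    -epsilon at r = 2**(1/6) * sigma."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)
```

In the molecular simulation described above, nodes embedded in Euclidean space interact through such a potential, so pairs settle near the potential minimum when the system reaches a stable, minimum-energy state.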
Cetindamar, D & Kilitcioglu, H 2013, 'Measuring the competitiveness of a firm for an award system', Competitiveness Review: An International Business Journal, vol. 23, no. 1, pp. 7-22.
View/Download from: Publisher's site
View description>>
Purpose: Competition is of interest to both policy makers and managers. However, existing studies concentrate on the measurement of national competitiveness while neglecting firm competitiveness. The purpose of this paper is to fill this gap by developing a comprehensive and generic measurement model to understand firm competitiveness. The model is used to develop an award system to help companies in the self-assessment of their competitiveness. Design/methodology/approach: The theoretical base of the measurement of firm-level competitiveness is derived from two national competitiveness models, namely the World Competitiveness Yearbook and the Global Competitiveness Index, while the assessment structure is based on the well-known European Foundation for Quality Management Excellence Award. The competitiveness model developed in this paper is put into use in Turkey. The measures of the model are used for assessing the competitiveness of ten firms, in order to choose the most competitive firm of the year. The study in Turkey explains how the measurement model works by illustrating an example. Findings: This paper attempts to develop a generic model in which the competition parameters do not change for individual companies. The model covers a wide variety of parameters that form the base of competition at the firm level. It is demonstrated that the competition model developed in the paper works in practice. Originality/value: This paper contributes to national competitiveness research by providing a deeper understanding of the dynamics of firm-level competitiveness and provides some implications and suggestions for further studies.
Lin, C-T, Tsai, S-F & Ko, L-W 2013, 'EEG-Based Learning System for Online Motion Sickness Level Estimation in a Dynamic Vehicle Environment', IEEE Transactions on Neural Networks and Learning Systems, vol. 24, no. 10, pp. 1689-1700.
View/Download from: Publisher's site
View description>>
Motion sickness is a common experience for many people. Several previous studies have indicated that motion sickness has a negative effect on driving performance and sometimes leads to serious traffic accidents because of a decline in a person's ability to maintain self-control. This safety issue has motivated us to find a way to prevent vehicle accidents. Our target was to determine a set of valid motion sickness indicators that would predict the occurrence of a person's motion sickness as soon as possible. A successful method for the early detection of motion sickness will help us to construct a cognitive monitoring system. Such a monitoring system can alert people before they become sick and prevent them from being distracted by various motion sickness symptoms while driving or riding in a car. In our past research, we investigated the physiological changes that occur during the transition of a passenger's cognitive state using electroencephalography (EEG) power spectrum analysis, and we found that the EEG power responses in the left and right motor, parietal, lateral occipital, and occipital midline brain areas were more highly correlated to subjective sickness levels than other brain areas. In this paper, we propose the use of a self-organizing neural fuzzy inference network (SONFIN) to estimate a driver's/passenger's sickness level based on EEG features that have been extracted online from five motion sickness-related brain areas, while either in real or virtual vehicle environments. The results show that our proposed learning system is capable of extracting a set of valid motion sickness indicators that originated from EEG dynamics, and through SONFIN, a neuro-fuzzy prediction model, we successfully translated the set of motion sickness indicators into motion sickness levels. The overall performance of this proposed EEG-based learning system can achieve an average prediction accuracy of ∼ 82%. © 2013 IEEE.
Choi, Y, Bressan, N, James, A, Pugh, E & McGregor, C 2013, 'Design of temporal analysis of neonatal vagal spells at different gestational ages using the Artemis framework', Journal of Critical Care, vol. 28, no. 1, pp. e4-e5.
View/Download from: Publisher's site
Cirelli, J, McGregor, C, Graydon, B & James, A 2013, 'Analysis of continuous oxygen saturation data for accurate representation of retinal exposure to oxygen in the preterm infant.', Stud Health Technol Inform, vol. 183, pp. 126-131.
View/Download from: Publisher's site
View description>>
Maintaining blood oxygen saturation within the intended target range for preterm infants receiving neonatal intensive care is challenging. Supplemental oxygen is believed to lead to increased risk of retinopathy of prematurity and hence managing the level of oxygen within this population is important within their care. Current quality improvement activities use coarse hourly spot readings to measure supplemental oxygen levels as associated with targeted ranges that vary based on gestational age. In this research we use Artemis, a real-time online healthcare analytics platform to ascertain if the collection of second by second data provides a better representation of retinal exposure to oxygen than an infrequent, intermittent spot reading. We show that Artemis is capable of producing more accurate information from the higher frequency data, as it includes all the episodic events in the activity of the hour, which provides a better understanding of oxygen fluctuation ranges which affect the physiological status of the infant.
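The contrast between coarse hourly spot readings and second-by-second data can be sketched as follows; the 88-95% target range in the usage example is illustrative, not a value taken from the study.

```python
def time_in_range(spo2_per_second, low, high):
    """Fraction of seconds the saturation stayed inside the target range,
    computed from second-by-second data."""
    in_range = sum(1 for v in spo2_per_second if low <= v <= high)
    return in_range / len(spo2_per_second)

def hourly_spot(spo2_per_second):
    """One reading at the top of each hour: the coarse QI practice."""
    return spo2_per_second[::3600]

# Two hours of data: every hourly spot reading looks fine, but each hour
# contains a brief desaturation that only the per-second data captures.
hour = [92] * 100 + [80] * 100 + [92] * 3400
data = hour + hour
```

Here `hourly_spot(data)` returns readings inside the target range, while `time_in_range(data, 88, 95)` is below 1.0, showing how episodic events are invisible to intermittent sampling.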
Ye, D, Zhang, M & Sutanto, D 2013, 'Self-Adaptation-Based Dynamic Coalition Formation in a Distributed Agent Network: A Mechanism and a Brief Survey', IEEE Transactions on Parallel and Distributed Systems, vol. 24, no. 5, pp. 1042-1051.
View/Download from: Publisher's site
Do, QNT & Hussain, FK 2013, 'A hybrid approach for the personalisation of cloud-based e-governance services', International Journal of High Performance Computing and Networking, vol. 7, no. 3, p. 205.
View/Download from: Publisher's site
View description>>
Cloud computing is a new and promising paradigm for delivering services, including computing resources, over the internet. Cloud computing standards and architecture play an important role in benefiting governments by reducing operating costs and increasing governance effectiveness. Cloud-based e-governance contributes to managing security, reducing cost based on a pay-as-you-go method, reducing IT labour cost, and increasing scalability. Given the importance of cloud computing in today's emerging technologies, personalisation in cloud computing is also significant in supporting users to obtain what they need without being required to request it explicitly. This research focuses mainly on a personalisation algorithm for cloud computing. A case study, in which the language a user wants to use is suggested without an explicit request, is provided to assist further understanding of the new algorithm, which combines the TOPSIS and Pearson correlation coefficient methods.
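One half of the proposed hybrid, the Pearson correlation coefficient, can be sketched in its standard form; in collaborative-filtering-style personalisation it scores how similarly two users have rated items (the TOPSIS component is omitted here).

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length
    sequences, e.g. two users' ratings over the same items."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

Values near +1 mark users with closely aligned preferences, whose past choices (such as a preferred language) can be used to make suggestions without an explicit request.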
Dong, H & Hussain, FK 2013, 'SOF: a semi-supervised ontology-learning-based focused crawler', Concurrency and Computation: Practice and Experience, vol. 25, no. 12, pp. 1755-1770.
View/Download from: Publisher's site
View description>>
The rapid increase in the volume of data available on the Internet makes it increasingly impractical for a crawler to index the whole Web. Instead, many intelligent crawlers, known as ontology-based semantic focused crawlers, have been designed by making use of Semantic Web technologies for topic-centered Web information crawling. Ontologies, however, have constraints of validity and time, which may influence the performance of the crawlers. Ontology-learning-based focused crawlers are therefore designed to automatically evolve ontologies by integrating ontology learning technologies. Nevertheless, surveys indicate that the existing ontology-learning-based focused crawlers do not have the capability to automatically enrich the content of ontologies, which makes these crawlers unreliable in the open and heterogeneous Web environment. Hence, in this paper, we propose a framework for a novel semi-supervised ontology-learning-based focused (SOF) crawler, which embodies a series of schemas for ontology generation and Web information formatting, a semi-supervised ontology learning framework, and a hybrid Web page classification approach aggregated from a group of support vector machine models. A series of tests is implemented to evaluate the technical feasibility of the proposed framework. The conclusion and future work are summarized in the final section.
Dong, H, Hussain, FK & Chang, E 2013, 'Semantic Web Service matchmakers: state of the art and challenges', Concurrency and Computation: Practice and Experience, vol. 25, no. 7, pp. 961-988.
View/Download from: Publisher's site
View description>>
Web services provide a standard means for interoperable operations between electronic devices in a network. The mission of Web service discovery is to seek an appropriate Web service for a service requester on the basis of the service descriptions in Web service advertisements and the service requester's requirements. Nevertheless, the standard language used for encoding service descriptions does not have the capacity to specify the capabilities of a Web service, leading to the problem of ambiguity in the service discovery process. This brings up the vision of Semantic Web Services and Semantic Web Service discovery, which make use of Semantic Web technologies to enrich the semantics of service descriptions for service discovery. Semantic Web Service matchmakers are the programs or frameworks designed to implement the task of Semantic Web Service discovery and have drawn a significant amount of attention from both academia and industry since the start of this century. In this paper, we conduct a survey of contemporary Semantic Web Service matchmakers in order to obtain an overview of the state of the art in this research area. We summarize six technical dimensions from the past literature and analyze the typical Semantic Web Service matchmakers, mostly developed during the past 4 or 5 years, in terms of the six dimensions. By means of this analysis, we gain an understanding of the current research and summarize a series of potential issues that would provide the foundation for future research in this area. Copyright © 2012 John Wiley & Sons, Ltd.
Dong, Y, Hong, W-C, Xu, Y & Yu, S 2013, 'Numerical scales generated individually for analytic hierarchy process', European Journal of Operational Research, vol. 229, no. 3, pp. 654-662.
View/Download from: Publisher's site
Dong, Y, Zhang, G, Hong, W-C & Yu, S 2013, 'Linguistic Computational Model Based on 2-Tuples and Intervals', IEEE Transactions on Fuzzy Systems, vol. 21, no. 6, pp. 1006-1018.
View/Download from: Publisher's site
Doss, R, Zhou, W & Yu, S 2013, 'Secure RFID Tag Ownership Transfer Based on Quadratic Residues', IEEE Transactions on Information Forensics and Security, vol. 8, no. 2, pp. 390-401.
View/Download from: Publisher's site
Esfijani, A, Hussain, FK & Chang, E 2013, 'University social responsibility ontology', Engineering Intelligent Systems, vol. 21, no. 4, pp. 271-281.
View description>>
This paper draws on the existing body of knowledge to develop an ontology for university social responsibility (USR). There are numerous terms and definitions for USR in the existing literature. However, there is no consensus among them. In order to address this issue, we used a semi-automated text mining approach for ontology engineering. The developed ontology covered USR and its associated terms by which social responsibilities of a university to its communities have been described in the existing literature. The developed ontology, which is an explicit specification of USR concept, its components and their relationships, can contribute to develop a unified understanding of the concept for measurement purposes. © 2013 CRL Publishing Ltd.
Fis, AM & Çetindamar, D 2013, 'Start-Up Information Search Practices: The Case of Turkey', Emerging Markets Finance and Trade, vol. 49, no. 6, pp. 22-36.
View/Download from: Publisher's site
View description>>
Information search may be especially crucial in an emerging economy context where gaps in knowledge are magnified due to the limited availability, accessibility, and quality of sources. Under the framework of social embeddedness, we observe the role of previous entrepreneurial experience in information search conducted during start-up. The impact of information search on future growth is also explored. Based on an empirical study of 172 Turkish entrepreneurs, the results indicate that (1) first-time entrepreneurs search more intensely, (2) first-time entrepreneurs utilize a greater number of formal resources, and (3) the intensity of information search is positively related with future growth.
Fowler, AG, Devitt, SJ & Jones, C 2013, 'Surface code implementation of block code state distillation', Scientific Reports, vol. 3, no. 1.
View/Download from: Publisher's site
Frati, F, Gaspers, S, Gudmundsson, J & Mathieson, L 2013, 'Augmenting graphs to minimize the diameter', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 8283 LNCS, pp. 383-393.
View/Download from: Publisher's site
View description>>
We study the problem of augmenting a weighted graph by inserting edges of bounded total cost while minimizing the diameter of the augmented graph. Our main result is an FPT 4-approximation algorithm for the problem.
Gao, L, Li, M, Bonti, A, Zhou, W & Yu, S 2013, 'Multidimensional Routing Protocol in Human-Associated Delay-Tolerant Networks', IEEE Transactions on Mobile Computing, vol. 12, no. 11, pp. 2132-2144.
View/Download from: Publisher's site
Garde-Perik, EVD, Offermans, S, Boerdonk, KV, Lenssen, K-M & Hoven, EVD 2013, 'An analysis of input-output relations in interaction with smart tangible objects', ACM Transactions on Interactive Intelligent Systems, vol. 3, no. 2, pp. 1-20.
View/Download from: Publisher's site
View description>>
This article focuses on the conceptual relation between the user's input and a system's output in interaction with smart tangible objects. Understanding this input-output relation (IO relation) is a prerequisite for the design of meaningful interaction. A meaningful IO relation allows the user to know what to do with a system to achieve a certain goal and to evaluate the outcome. The work discussed in this article followed a design research process in which four concepts were developed and prototyped. An evaluation was performed using these prototypes to investigate the effect of highly different IO relations on the user's understanding of the interaction. The evaluation revealed two types of IO relations differing in functionality and the number of mappings between the user and system actions. These two types of relations are described by two IO models that provide an overview of these mappings. Furthermore, they illustrate the role of the user and the influence of the system in the process of understanding the interaction. The analysis of the two types of IO models illustrates the value of understanding IO relations for the design of smart tangible objects.
Gil-Lafuente, AM & Merigo, JM 2013, 'Modelling and Simulation in Enterprises – MS’10 Barcelona', Kybernetes, vol. 42, no. 5, pp. 251-269.
View/Download from: Publisher's site
Gil-Lafuente, AM & Merigo, JM 2013, 'Modelling and Simulation in Enterprises - MS'10 Barcelona', KYBERNETES, vol. 42, no. 5, pp. 671-673.
View/Download from: Publisher's site
Gluga, R, Kay, J, Lister, R, Simon & Kleitman, S 2013, 'Mastering cognitive development theory in computer science education', Computer Science Education, vol. 23, no. 1, pp. 24-57.
View/Download from: Publisher's site
View description>>
Abstract: To design an effective computer science curriculum, educators require a systematic method of classifying the difficulty level of learning activities and assessment tasks. This is important for curriculum design and implementation and for communication between educators. Different educators must be able to use the method consistently, so that classified activities and assessments are comparable across the subjects of a degree and, ideally, comparable across institutions. One widespread approach to supporting this is to write learning objectives in terms of Bloom's Taxonomy. This, or other such classifications, is likely to be more effective if educators can use them consistently, in the way experts would use them. To this end, we present the design and evaluation of our online interactive web-based tutorial system, which can be configured and used to offer training in different classification schemes. We report on results from three evaluations. First, 17 computer science educators completed a tutorial on using Bloom's Taxonomy to classify programming examination questions. Second, 20 computer science educators completed a Neo-Piagetian tutorial. The third evaluation compared the inter-rater reliability scores of computer science educators classifying programming questions using Bloom's Taxonomy before and after taking our tutorial. Based on the results from these evaluations, we discuss the effectiveness of our tutorial system design for teaching computer science educators how to systematically and consistently classify programming examination questions. We also discuss the suitability of Bloom's Taxonomy and Neo-Piagetian theory for achieving this goal. The Bloom's and Neo-Piagetian tutorials are made available as a community resource. The contributions of this paper are the following: the tutorial system for learning classification schemes for the purpose of coding the difficulty of computing learning materials; its evaluation; new insights into the consis...
Golsteijn, C & van den Hoven, E 2013, 'Facilitating parent-teenager communication through interactive photo cubes', PERSONAL AND UBIQUITOUS COMPUTING, vol. 17, no. 2, pp. 273-286.
View/Download from: Publisher's site
View description>>
Because most teenagers strive for freedom and try to live autonomously, communication with their parents could be improved. It appeared from a literature review and a diary study that parent-teenager communication primarily addresses teenager-oriented everyday activities. However, it also showed teenagers have a substantial interest in getting to know their parents and their parents' past. The study described in this paper seeks to address this opportunity by designing a product for parents and teenagers that facilitates communication about the past of the parents. The resulting design, called Cueb, is a set of interactive digital photo cubes with which parents and teenagers can explore individual and shared experiences and are triggered to exchange stories. An evaluation of a prototype of Cueb with four families showed that the participants felt significantly more triggered and supported to share their experiences and tell stories with Cueb's full functionality (connecting cubes, switching, and locking photographs) than with limited functionality (shaking to display random photographs), similar to more traditional photo media.
Goodswen, SJ, Kennedy, PJ & Ellis, JT 2013, 'A guide to in silico vaccine discovery for eukaryotic pathogens', BRIEFINGS IN BIOINFORMATICS, vol. 14, no. 6, pp. 753-774.
View/Download from: Publisher's site
View description>>
In this article, a framework for an in silico pipeline is presented as a guide to high-throughput vaccine candidate discovery for eukaryotic pathogens, such as helminths and protozoa. Eukaryotic pathogens are mostly parasitic and cause some of the most damaging and difficult to treat diseases in humans and livestock. Consequently, these parasitic pathogens have a significant impact on economy and human health. The pipeline is based on the principle of reverse vaccinology and is constructed from freely available bioinformatics programs. There are several successful applications of reverse vaccinology to the discovery of subunit vaccines against prokaryotic pathogens but not yet against eukaryotic pathogens. The overriding aim of the pipeline, which focuses on eukaryotic pathogens, is to generate through computational processes of elimination and evidence gathering a ranked list of proteins based on a scoring system. These proteins are either surface components of the target pathogen or are secreted by the pathogen and are of a type known to be antigenic. No perfect predictive method is yet available; therefore, the highest-scoring proteins from the list require laboratory validation.
Goodswen, SJ, Kennedy, PJ & Ellis, JT 2013, 'A novel strategy for classifying the output from an in silico vaccine discovery pipeline for eukaryotic pathogens using machine learning algorithms', BMC BIOINFORMATICS, vol. 14, no. 1, pp. 315-327.
View/Download from: Publisher's site
View description>>
An in silico vaccine discovery pipeline for eukaryotic pathogens typically consists of several computational tools to predict protein characteristics. The aim of the in silico approach to discovering subunit vaccines is to use predicted characteristics to identify proteins which are worthy of laboratory investigation. A major challenge is that these predictions are inherent with hidden inaccuracies and contradictions. This study focuses on how to reduce the number of false candidates using machine learning algorithms rather than relying on expensive laboratory validation. Proteins from Toxoplasma gondii, Plasmodium sp., and Caenorhabditis elegans were used as training and test datasets.
Goodswen, SJ, Kennedy, PJ & Ellis, JT 2013, 'A review of the infection, genetics, and evolution of Neospora caninum: From the past to the present', INFECTION GENETICS AND EVOLUTION, vol. 13, no. 1, pp. 133-150.
View/Download from: Publisher's site
View description>>
This paper is a review of current knowledge on Neospora caninum in the context of other apicomplexan parasites, with an emphasis on: life cycle, disease, epidemiology, immunity, control and treatment, evolution, genomes, and biological databases and web resources. N. caninum is an obligate, intracellular, coccidian, protozoan parasite of the phylum Apicomplexa. Infection can cause the clinical disease neosporosis, which most notably is associated with abortion in cattle. These abortions are a major root cause of economic loss to both the dairy and beef industries worldwide. N. caninum has been detected in every country in which a study has been specifically conducted to detect this parasite in cattle. The major mode of transmission in cattle is transplacental (or vertical) transmission, and several elements of the N. caninum life cycle are yet to be studied in detail. The outcome of an infection is inextricably linked to the precise timing of the infection coupled with the status of the immune system of the dam and foetus. There is no community consensus as to whether it is the dam's pro-inflammatory cytotoxic response to tachyzoites that kills the foetus or the tachyzoites themselves. From economic analysis, the most cost-effective approach to controlling neosporosis is a vaccine. The perfect vaccine would protect against both infection and the clinical disease, and this implies that a vaccine is needed that can induce a non-foetopathic cell-mediated immune response. Researchers are beginning to capitalise on the vast potential of -omics data (e.g. genomes, transcriptomes, and proteomes) to further our understanding of pathogens, but especially to identify vaccine and drug targets. The recent publication of a genome for N. caninum offers vast opportunities in these areas.
Hammadi, A, Hussain, OK, Dillon, T & Hussain, FK 2013, 'A framework for SLA management in cloud computing for informed decision making', CLUSTER COMPUTING-THE JOURNAL OF NETWORKS SOFTWARE TOOLS AND APPLICATIONS, vol. 16, no. 4, pp. 961-977.
View/Download from: Publisher's site
View description>>
In cloud computing, service providers offer cost-effective and on-demand IT services to service users on the basis of Service Level Agreements (SLAs). However, the effective management of SLAs in cloud computing is essential for service users to ensure that they achieve the desired outcomes from the formed service. In this paper, we introduce an SLA management framework that enables service users to select the best available service provider on the basis of its reputation and then monitor the run-time performance of the service provider to determine whether or not it will fulfill the promise defined in the SLA. Such analysis will assist the service user to make an informed decision about the continuation of service with the service provider.
Hassan Mohammed, A, Dai, B, Huang, B, Azhar, M, Xu, G, Qin, P & Yu, S 2013, 'A survey and tutorial of wireless relay network protocols based on network coding', Journal of Network and Computer Applications, vol. 36, no. 2, pp. 593-610.
View/Download from: Publisher's site
Janjua, NK, Hussain, FK & Hussain, OK 2013, 'Semantic information and knowledge integration through argumentative reasoning to support intelligent decision making', INFORMATION SYSTEMS FRONTIERS, vol. 15, no. 2, pp. 167-192.
View/Download from: Publisher's site
View description>>
The availability of integrated, high-quality information is a prerequisite for a decision support system (DSS) to aid in the decision-making process. The introduction of the semantic web ensures the seamless integration of information derived from diverse sources and transforms the DSS into an adaptable and flexible Semantic Web-DSS (Web-DSS). However, due to the monotonic nature of the layered development of the semantic web, it lacks the capability to represent, reason over and integrate incomplete and conflicting information. This, in turn, renders an enterprise incapable of knowledge integration; that is, integration of information about a subject that could potentially be incomplete, inconsistent and distributed among different Web-DSS within or across enterprises. In this article, we address the issues of incomplete and inconsistent semantic information and knowledge integration by using argumentation and argumentation schemes. We discuss the Argumentation-enabled Information Integration Web-DSS (Web@IDSS) along with its syntax and semantics for semantic information integration, and devise a methodology for sharing the results of Web@IDSS in Argument Interchange Format (AIF). We also discuss the Argumentation-enabled Knowledge Integration Web-DSS (Web@KIDSS) for semantic knowledge integration. We provide formal syntax and semantics for Web@KIDSS, propose a conceptual framework, and describe it in detail. We present the algorithms for knowledge integration and the prototype application for validation of the results.
Jebelli Javan, A, Saberi, M, Javaheri Vayeghan, A, Ghaffari Khaligh, S, Rezaian, H & Nejabat, N 2013, 'The effect of dietary Aloe vera gel extract supplementation on lipid peroxidation of broiler breast fillets during frozen storage', Journal of Veterinary Research, vol. 68, no. 3, pp. 233-240.
View description>>
BACKGROUND: To improve the oxidative stability of meat products, the use of dietary forms of natural additives, especially those of plant origin, is increasing. The Aloe vera plant, whose in vitro antioxidant effect has been previously discussed, is a potential candidate for this purpose. OBJECTIVES: This study was designed to evaluate the effects of feed supplementation with Aloe vera gel extract on lipid peroxidation of broiler breast fillets during frozen storage. METHODS: Fifty-four 1-day-old broilers were allocated into three groups (basal diet as control, and basal diet supplemented with 100 or 300 mg/kg methanol extract of Aloe vera gel) and fed for 6 weeks. At the end of this period, chicks were slaughtered and their breast fillets were stored at -20°C for 9 months. Lipid peroxidation was assessed after 1, 3, 6 and 9 months of frozen storage using chemical (PV and TBARS) and sensory evaluations. RESULTS: Results indicated that incorporation of 300 mg/kg Aloe vera gel methanol extract in broiler diets delayed lipid peroxidation in raw breast meat (9.6 meq/kg, 92.67 μg/kg and 6.3 in the PV, TBARS and sensory evaluations, respectively) in comparison with the control sample (15.2 meq/kg, 139.33 μg/kg and 3 in the same evaluations) at the last day of the experiment (p<0.05). CONCLUSIONS: This study showed that methanol extract of Aloe vera gel can be considered as a dietary supplement in chicken diets and can delay the oxidative spoilage of chicken breast fillets during frozen storage.
Kazem, A, Sharifi, E, Hussain, FK, Saberi, M & Hussain, OK 2013, 'Support vector regression with chaos-based firefly algorithm for stock market price forecasting', APPLIED SOFT COMPUTING, vol. 13, no. 2, pp. 947-958.
View/Download from: Publisher's site
View description>>
Due to the inherent non-linearity and non-stationary characteristics of financial stock market price time series, conventional modeling techniques such as the Box-Jenkins autoregressive integrated moving average (ARIMA) are not adequate for stock market price forecasting. In this paper, a forecasting model based on chaotic mapping, firefly algorithm, and support vector regression (SVR) is proposed to predict stock market price. The forecasting model has three stages. In the first stage, a delay coordinate embedding method is used to reconstruct unseen phase space dynamics. In the second stage, a chaotic firefly algorithm is employed to optimize SVR hyperparameters. Finally in the third stage, the optimized SVR is used to forecast stock market price. The significance of the proposed algorithm is 3-fold. First, it integrates both chaos theory and the firefly algorithm to optimize SVR hyperparameters, whereas previous studies employ a genetic algorithm (GA) to optimize these parameters. Second, it uses a delay coordinate embedding method to reconstruct phase space dynamics. Third, it has high prediction accuracy due to its implementation of structural risk minimization (SRM). To show the applicability and superiority of the proposed algorithm, we selected the three most challenging stock market time series data from NASDAQ historical quotes, namely Intel, National Bank shares and Microsoft daily closed (last) stock price, and applied the proposed algorithm to these data. Compared with genetic algorithm-based SVR (SVR-GA), chaotic genetic algorithm-based SVR (SVR-CGA), firefly-based SVR (SVR-FA), artificial neural networks (ANNs) and adaptive neuro-fuzzy inference systems (ANFIS), the proposed model performs best based on two error measures, namely mean squared error (MSE) and mean absolute percent error (MAPE). Copyright © 2012 Published by Elsevier B.V. All rights reserved.
Kazienko, P, Musial, K & Kajdanowicz, T 2013, 'Multidimensional Social Network in the Social Recommender System', IEEE Transactions on Systems, Man and Cybernetics, Part A: Systems and Humans, vol. 41, no. 4, pp. 746-759.
View/Download from: Publisher's site
View description>>
All online sharing systems gather data that reflects users' collective behaviour and their shared activities. This data can be used to extract different kinds of relationships, which can be grouped into layers, and which are basic components of the multidimensional social network proposed in the paper. The layers are created on the basis of two types of relations between humans, i.e. direct and object-based ones, which respectively correspond to either social or semantic links between individuals. For better understanding of the complexity of the social network structure, layers and their profiles were identified and studied on two snapshots of the Flickr population, spanned in time. Additionally, for each layer, a separate strength measure was proposed. The experiments on the Flickr photo sharing system revealed that the relationships between users result either from semantic links between objects they operate on or from social connections of these users. Moreover, the density of the social network increases in time. The second part of the study is devoted to building a social recommender system that supports the creation of new relations between users in a multimedia sharing system. Its main goal is to generate personalized suggestions that are continuously adapted to users' needs depending on the personal weights assigned to each layer in the multidimensional social network. The conducted experiments confirmed the usefulness of the proposed model.
Kelly (Letcher), RA, Jakeman, AJ, Barreteau, O, Borsuk, ME, ElSawah, S, Hamilton, SH, Henriksen, HJ, Kuikka, S, Maier, HR, Rizzoli, AE, van Delden, H & Voinov, AA 2013, 'Selecting among five common modelling approaches for integrated environmental assessment and management', Environmental Modelling & Software, vol. 47, pp. 159-181.
View/Download from: Publisher's site
Kheirkhah, A, Azadeh, A, Saberi, M, Azaron, A & Shakouri, H 2013, 'Improved estimation of electricity demand function by using of artificial neural network, principal component analysis and data envelopment analysis', Computers & Industrial Engineering, vol. 64, no. 1, pp. 425-441.
View/Download from: Publisher's site
Konyagin, SV, Luca, F, Mans, B, Mathieson, L, Sha, M & Shparlinski, IE 2013, 'Functional Graphs of Polynomials over Finite Fields', Journal of Combinatorial Theory, Series B, vol. 116, pp. 87-122.
View/Download from: Publisher's site
View description>>
Given a function $f$ in a finite field ${\mathbb F}_q$ of $q$ elements, we define the functional graph of $f$ as a directed graph on $q$ nodes labelled by the elements of ${\mathbb F}_q$, where there is an edge from $u$ to $v$ if and only if $f(u) = v$. We obtain some theoretic estimates on the number of non-isomorphic graphs generated by all polynomials of a given degree. We then develop a simple and practical algorithm to test the isomorphism of quadratic polynomials that has linear memory and time complexities. Furthermore, we extend this isomorphism testing algorithm to the general case of functional graphs, and prove that, while its time complexity increases only slightly, its memory complexity remains linear. We exploit this algorithm to provide an upper bound on the number of functional graphs corresponding to polynomials of degree $d$ over ${\mathbb F}_q$. Finally, we present some numerical results, compare functional graphs of quadratic polynomials with those generated by random maps, and pose interesting new problems.
Laniak, GF, Olchin, G, Goodall, J, Voinov, A, Hill, M, Glynn, P, Whelan, G, Geller, G, Quinn, N, Blind, M, Peckham, S, Reaney, S, Gaber, N, Kennedy, R & Hughes, A 2013, 'Integrated environmental modeling: A vision and roadmap for the future', Environmental Modelling & Software, vol. 39, pp. 3-23.
View/Download from: Publisher's site
Laniak, GF, Rizzoli, AE & Voinov, A 2013, 'Thematic Issue on the Future of Integrated Modeling Science and Technology', Environmental Modelling & Software, vol. 39, pp. 1-2.
View/Download from: Publisher's site
Lawrence, CG 2013, 'The urgency of monitoring salt consumption and its effects in Aboriginal and Torres Strait Islander Australians', Medical Journal of Australia, vol. 198, no. 7, pp. 365-366.
View/Download from: Publisher's site
Li, B, Chen, L, Zhu, X & Zhang, C 2013, 'Noisy but non-malicious user detection in social recommender systems', World Wide Web, vol. 16, no. 5-6, pp. 677-699.
View/Download from: Publisher's site
View description>>
Social recommender systems largely rely on user-contributed data to infer users' preferences. While this feature has enabled many interesting applications in social networking services, it also introduces unreliability to recommenders, as users are allowed to insert data freely. Although detecting malicious attacks from social spammers has been studied for years, little work has been done on detecting Noisy but Non-Malicious Users (NNMUs), that is, genuine users who may provide some untruthful data due to their imperfect behaviors. Unlike colluded malicious attacks, which can be detected by finding similarly behaved user profiles, NNMUs are more difficult to identify since their profiles are neither similar to nor correlated with one another. In this article, we study how to detect NNMUs in social recommender systems. Based on the assumption that the ratings provided by the same user on closely correlated items should have similar scores, we propose an effective method for NNMU detection by capturing and accumulating a user's 'self-contradictions', i.e., the cases where a user provides very different rating scores on closely correlated items. We show that self-contradiction capturing can be formulated as a constrained quadratic optimization problem w.r.t. a set of slack variables, which can be further used to quantify the underlying noise in each test user profile. We adopt three real-world data sets to empirically test the proposed method. The experimental results show that our method (i) is effective in real-world NNMU detection scenarios, (ii) can significantly outperform other noisy-user detection methods, and (iii) can improve recommendation performance for other users after removing detected NNMUs from the recommender system. © 2012 Springer Science+Business Media, LLC.
Li, J & Tao, D 2013, 'A Bayesian Hierarchical Factorization Model for Vector Fields.', IEEE Trans. Image Process., vol. 22, no. 11, pp. 4510-4521.
View/Download from: Publisher's site
View description>>
Factorization-based techniques explain arrays of observations using a relatively small number of factors and provide an essential arsenal for multi-dimensional data analysis. Most factorization models are, however, developed on general arrays of scalar values. For a class of practical data arising from observing spatial signals, including images, it is desirable for a model to consider general observations, e.g., handling a vector field, and non-exchangeable factors, e.g., handling spatial connections between the columns and the rows of the data. In this paper, a probabilistic model for factorization is proposed. We adopt Bayesian hierarchical modeling and treat the factors as latent random variables. A Markov structure is imposed on the distribution of factors to account for the spatial connections. The model is designed to represent vector arrays sampled from fields of continuous domains. Therefore, a tailored observation model is developed to represent the link between the factor product and the data. The proposed technique has been shown effective in analyzing optical flow fields computed on both synthetic images and real-life video clips. © 2013 IEEE.
Li, J & Tao, D 2013, 'Exponential Family Factors for Bayesian Factor Analysis.', IEEE Trans. Neural Networks Learn. Syst., vol. 24, no. 6, pp. 964-976.
View/Download from: Publisher's site
View description>>
Expressing data as linear functions of a small number of unknown variables is a useful approach employed by several classical data analysis methods, e.g., factor analysis, principal component analysis, or latent semantic indexing. These models represent the data using the product of two factors. In practice, one important concern is how to link the learned factors to relevant quantities in the context of the application. To this end, various specialized forms of the factors have been proposed to improve interpretability. Toward developing a unified view and clarifying the statistical significance of the specialized factors, we propose a Bayesian model family. We employ exponential family distributions to specify various types of factors, which provide a unified probabilistic formulation. A Gibbs sampling procedure is constructed as a general computation routine. We verify the model by experiments, in which the proposed model is shown to be effective in both emulating existing models and motivating new model designs for particular problem settings. © 2012 IEEE.
Li, J & Tao, D 2013, 'Simple Exponential Family PCA.', IEEE Trans. Neural Networks Learn. Syst., vol. 24, no. 3, pp. 485-497.
View/Download from: Publisher's site
View description>>
Principal component analysis (PCA) is a widely used model for dimensionality reduction. In this paper, we address the problem of determining the intrinsic dimensionality of a general-type data population by selecting the number of principal components for a generalized PCA model. In particular, we propose a generalized Bayesian PCA model, which deals with general-type data by employing exponential family distributions. Model selection is realized by empirical Bayesian inference of the model. We name the model simple exponential family PCA (SePCA), since it embraces both the principle of using a simple model for data representation and the practice of using a simplified computational procedure for the inference. Our analysis shows that the empirical Bayesian inference in SePCA formally realizes an intuitive criterion for PCA model selection: a preserved principal component must sufficiently correlate with data variance that is uncorrelated with the other principal components. Experiments on synthetic and real data sets demonstrate the effectiveness of SePCA and exemplify its characteristics for model selection. © 2013 IEEE.
Li, J, Bian, W, Tao, D & Zhang, C 2013, 'Learning colours from textures by sparse manifold embedding.', Signal Process., vol. 93, no. 6, pp. 1485-1495.
View/Download from: Publisher's site
View description>>
The capability of inferring colours from the texture (grayscale contents) of an image is useful in many application areas when the imaging device/environment is limited. Traditional manual or limited automatic colour assignment involves intensive human effort. In this paper, we have developed a user-friendly colourisation technique, where the algorithm learns the relation between textures and colours in a user-provided example image and applies the relation to predict the colours in the target image. The key contribution of the proposed technique is threefold. First, we have explicitly built a linear model for the texture-colour relation. Second, we have considered the global non-linear structure of the data distribution by applying the linear model locally, where the local area is determined automatically by sparsity constraints. Third, we have introduced semantic information to further improve the colourisation. Examples demonstrate the effectiveness of the proposed techniques. Moreover, we have conducted a subjective study, in which user experience supports the superiority of our method over existing techniques. © 2012 Elsevier B.V.
Li, L, Xu, G, Yang, Z, Dolog, P, Zhang, Y & Kitsuregawa, M 2013, 'An efficient approach to suggesting topically related web queries using hidden topic model', World Wide Web, vol. 16, no. 3, pp. 273-297.
View/Download from: Publisher's site
View description>>
Keyword-based Web search is a widely used approach for locating information on the Web. However, Web users usually suffer from the difficulties of organizing and formulating appropriate input queries due to the lack of sufficient domain knowledge, which greatly affects the search performance. An effective tool to meet the information needs of a search engine user is to suggest Web queries that are topically related to their initial inquiry. Accurately computing query-to-query similarity scores is a key to improve the quality of these suggestions. Because of the short lengths of queries, traditional pseudo-relevance or implicit-relevance based approaches expand the expression of the queries for the similarity computation. They explicitly use a search engine as a complementary source and directly extract additional features (such as terms or URLs) from the top-listed or clicked search results. In this paper, we propose a novel approach by utilizing the hidden topic as an expandable feature. This has two steps. In the offline model-learning step, a hidden topic model is trained, and for each candidate query, its posterior distribution over the hidden topic space is determined to re-express the query instead of the lexical expression. In the online query suggestion step, after inferring the topic distribution for an input query in a similar way, we then calculate the similarity between candidate queries and the input query in terms of their corresponding topic distributions; and produce a suggestion list of candidate queries based on the similarity scores. Our experimental results on two real data sets show that the hidden topic based suggestion is much more efficient than the traditional term or URL based approach, and is effective in finding topically related queries for suggestion. © 2011 Springer Science+Business Media, LLC.
Liu, B, Rong, B, Hu, R & Qian, Y 2013, 'Neighbor discovery algorithms in directional antenna based synchronous and asynchronous wireless ad hoc networks', IEEE Wireless Communications, vol. 20, no. 6, pp. 106-112.
View/Download from: Publisher's site
Liu, L, Chen, X, Luo, D, Lu, Y, Xu, G & Liu, M 2013, 'HSC: A SPECTRAL CLUSTERING ALGORITHM COMBINED WITH HIERARCHICAL METHOD', Neural Network World, vol. 23, no. 6, pp. 499-521.
View/Download from: Publisher's site
View description>>
Most traditional clustering algorithms perform poorly when clustering structures more complex than a convex spherical sample space. In the past few years, several spectral clustering algorithms were proposed to cluster arbitrarily shaped data in various real applications. However, spectral clustering relies on each cluster in the dataset being approximately well separated to a certain extent. In the case that a cluster has an obvious inflection point within a non-convex space, the spectral clustering algorithm would mistakenly recognize one cluster as different clusters. In this paper, we propose a novel spectral clustering algorithm called HSC, combined with a hierarchical method, which obviates this disadvantage of spectral clustering by not using the misleading information of noisy neighboring data points. A simple clustering procedure is applied to eliminate the misleading information, and thus the HSC algorithm can cluster both convex-shaped data and arbitrarily shaped data more efficiently and accurately. The experiments on both synthetic data sets and real data sets show that HSC outperforms other popular clustering algorithms. Furthermore, we observed that HSC can also be used for the estimation of the number of clusters.
Lu, J, Niu, L & Zhang, G 2013, 'A Situation Retrieval Model for Cognitive Decision Support in Digital Business Ecosystems', IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, vol. 60, no. 3, pp. 1059-1069.
View/Download from: Publisher's site
View description>>
This paper presents a novel situation retrieval (SR) model for supporting cognition-driven decision processes in digital business ecosystems. Cognitive decision support in digital ecosystems is concerned with decision makers' cognitive processes. This study aims to facilitate cognitive decision support to decision makers on the basis of current business intelligence (BI) platform. Underlying foundations of the SR model are two types of mental constructs: situation awareness (SA) and mental models of decision makers and the model of naturalistic decision making (NDM). These mental constructs and NDM are integrated into the BI application framework. Our experiments showed that the SR model was playing a nontrivial role to help decision makers develop enhanced SA and reuse their past experience to make better decisions. © 2012 IEEE.
Lu, J, Shambour, Q, Xu, Y, Lin, Q & Zhang, G 2013, 'A WEB-BASED PERSONALIZED BUSINESS PARTNER RECOMMENDATION SYSTEM USING FUZZY SEMANTIC TECHNIQUES', COMPUTATIONAL INTELLIGENCE, vol. 29, no. 1, pp. 37-69.
View/Download from: Publisher's site
View description>>
The web provides excellent opportunities to businesses in various aspects of development, such as finding a business partner online. However, with the rapid growth of web information, business users struggle with information overload and increasingly find it difficult to locate the right information at the right time. Meanwhile, small and medium businesses (SMBs) in particular are seeking 'one-to-one' e-services from government in today's highly competitive markets. How can business users be provided with information and services specific to their needs, rather than an undifferentiated mass of information? An effective solution proposed in this study is the development of personalized e-services. Recommender systems are an effective approach for implementing personalized e-services and have gained wide exposure in e-commerce in recent years. Accordingly, this paper first presents a hybrid fuzzy semantic recommendation (HFSR) approach which combines item-based fuzzy semantic similarity and item-based fuzzy collaborative filtering (CF) similarity techniques. This paper then presents the implementation of the proposed approach in an intelligent recommendation system prototype called Smart BizSeeker, which can recommend relevant business partners to individual business users, particularly SMBs. Experimental results show that the HFSR approach can help overcome the semantic limitations of classical CF-based recommendation approaches, namely the sparsity and new 'cold start' item problems. © 2012 Wiley Periodicals, Inc.
Luo, Z, Hu, Z, Song, Y, Xu, Z & Lu, H 2013, 'Optimal Coordination of Plug-In Electric Vehicles in Power Grids With Cost-Benefit Analysis—Part I: Enabling Techniques', IEEE Transactions on Power Systems, vol. 28, no. 4, pp. 3546-3555.
View/Download from: Publisher's site
View description>>
Plug-in electric vehicles (PEVs) appear to offer a promising option for mitigating greenhouse gas emissions. However, uncoordinated PEV charging can weaken the reliability of power systems, and the proper accommodation of PEVs in a power grid imposes many challenges on system planning and operations. This work aims to investigate optimal PEV coordination strategies with cost-benefit analysis. In Part I, we first present a new method to calculate the charging load of PEVs, using a modified Latin hypercube sampling (LHS) method to handle the stochastic behaviour of PEVs. We then propose a new two-stage optimization model to discover the optimal charging states of PEVs in a given day. Using this model, the peak load including the PEV charging load is minimized in the first stage, and the load fluctuation is minimized in the second stage with the peak load fixed at the value obtained in the first stage. An algorithm based on linear mixed-integer programming is provided as a suitable solution method with fast computation. Finally, we present a new method to calculate the benefit and cost of a PEV charging and discharging coordination strategy from a social welfare perspective. These methods are useful for developing PEV coordination strategies in power system planning and for supporting PEV-related policy making.
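The two-stage structure described in this abstract can be sketched as follows. The notation here ($D_t$ for the base load and $C_t$ for the aggregate PEV charging load in hour $t$, $E_{\mathrm{req}}$ for the total charging energy required) is assumed for illustration and is not taken from the paper; the paper's linear mixed-integer formulation would use a linear fluctuation measure rather than the quadratic one shown.

```latex
% Stage 1: minimize the daily peak of total load (base load plus PEV charging load)
\min_{C,\,P}\; P
\quad \text{s.t.}\quad D_t + C_t \le P \;\; \forall t,
\qquad \sum_t C_t = E_{\mathrm{req}}

% Stage 2: with the peak fixed at the Stage-1 optimum P^*, minimize load fluctuation
% around the mean total load \bar{L} (a quadratic surrogate; absolute deviations
% would keep the problem linear)
\min_{C}\; \sum_t \bigl(D_t + C_t - \bar{L}\bigr)^2
\quad \text{s.t.}\quad D_t + C_t \le P^* \;\; \forall t,
\qquad \sum_t C_t = E_{\mathrm{req}}
```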
Luo, Z, Hu, Z, Song, Y, Xu, Z & Lu, H 2013, 'Optimal Coordination of Plug-in Electric Vehicles in Power Grids With Cost-Benefit Analysis—Part II: A Case Study in China', IEEE Transactions on Power Systems, vol. 28, no. 4, pp. 3556-3565.
View/Download from: Publisher's site
View description>>
Continuing from the set of enabling techniques for the optimal coordination of plug-in electric vehicles (PEVs) presented in Part I, in this paper we present a case study using those techniques, based on data collected in the Beijing-Tianjin-Tangshan Region (BTTR), China, to discover optimal PEV coordination strategies and assess their attractiveness. In Part II, we first present the charging characteristics of different categories of PEVs in the BTTR and predict the optimal seasonal daily loads with PEVs under different PEV penetration levels, using the two-stage optimization model, for both 2020 and 2030. The simulation results indicate that optimal PEV coordination effectively reduces the peak load and smooths the load curve. Finally, we present a cost-benefit analysis of the optimal coordination strategies from a social welfare perspective. The analysis shows that the optimal coordination strategies are beneficial in terms of reduced capital investment in power grid expansion, and that the attractiveness of a coordination strategy is related to the coordination level. The results also show that fully coordinated charging and vehicle-to-grid are not the most attractive strategies. This case study is useful for better understanding the costs and benefits of PEV coordination strategies and for supporting PEV-related decision and policy making from a power system planning perspective.
Marcias, G, Pietroni, N, Panozzo, D, Puppo, E & Sorkine-Hornung, O 2013, 'Animation-Aware Quadrangulation.', Comput. Graph. Forum, vol. 32, no. 5, pp. 167-175.
View/Download from: Publisher's site
View description>>
Geometric meshes that model animated characters must be designed while taking into account the deformations that the shape will undergo during animation. We analyze an input sequence of meshes with point-to-point correspondence, and we automatically produce a quadrangular mesh that fits the input animation well. We first analyze the local deformation that the surface undergoes at each point, and we initialize a cross field that remains as aligned as possible to the principal directions of deformation throughout the sequence. We then smooth this cross field based on an energy that uses a weighted combination of the initial field and the local amount of stretch. Finally, we compute a field-aligned quadrangulation with an off-the-shelf method. Our technique is fast and very simple to implement, and it significantly improves the quality of the output quad mesh and its suitability for character animation compared to creating the quad mesh based on a single pose. We present experimental results and comparisons with a state-of-the-art quadrangulation method, on both sequences from 3D scanning and synthetic sequences obtained by a rough animation of a triangulated model. © 2013 The Author(s) Computer Graphics Forum © 2013 The Eurographics Association and John Wiley & Sons Ltd.
Marshall, P, Antle, A, Van Den Hoven, E & Rogers, Y 2013, 'Introduction to the Special Issue on the Theory and Practice of Embodied Interaction in HCI and Interaction Design', ACM TRANSACTIONS ON COMPUTER-HUMAN INTERACTION, vol. 20, no. 1, pp. 1-3.
View/Download from: Publisher's site
View description>>
Theories of embodiment focus on how practical engagement and the structure of the body shape perception, experience, and cognition. They typically reject a view of human cognition as grounded in abstract information processing. The concept of embodied in
McGregor, C 2013, 'Big Data in Neonatal Intensive Care', Computer, vol. 46, no. 6, pp. 54-59.
View/Download from: Publisher's site
McGregor, C, Catley, C, Padbury, J & James, A 2013, 'Late onset neonatal sepsis detection in newborn infants via multiple physiological streams', Journal of Critical Care, vol. 28, no. 1, pp. e11-e12.
View/Download from: Publisher's site
McGregor, C, Steadman, A, Percival, J & James, A 2013, 'Modelling health informatics capacity for neonatal intensive care patient journeys supported by interprofessional teams', International Journal of Biomedical Engineering and Technology, vol. 11, no. 3, pp. 301-301.
View/Download from: Publisher's site
View description>>
Neonatal intensive care is a highly complex area of healthcare requiring coordinated care between multiple healthcare professionals; as a result, information flow within the Neonatal Intensive Care Unit (NICU) can be very complex and impact quality of care. This paper presents initial research findings based on the use of the patient journey modelling technique known as PaJMa to audit the current state of health informatics within NICUs in Canada. In this paper, a case study including three Ontario NICUs is utilised and their 'Investigations' processes are modelled using PaJMa. Copyright © 2013 Inderscience Enterprises Ltd.
Meng, HD, Wu, PF, Song, YC & Xu, GD 2013, 'Research of Clustering Algorithm Based on Different Data Field Model', Advanced Materials Research, vol. 760-762, pp. 1925-1929.
View/Download from: Publisher's site
View description>>
The data field clustering algorithm possesses dynamic characteristics compared with other clustering algorithms. By changing the parameters of the data field model, the results can be dynamically adjusted to meet the targets of feature extraction and knowledge discovery at different scales, but the selection and construction of the data field model can give rise to different clustering results. This paper presents the clustering effectiveness obtained with various data field models and their parameters, provides a scheme for choosing the data field model that best fits the characteristics of the data radiation, and verifies that the best clustering effectiveness is achieved with the value of radial energy at the golden section.
Merigo, JM 2013, 'The probabilistic weighted averaging distance and its application in group decision making', Kybernetes, vol. 42, no. 5, pp. 686-697.
View/Download from: Publisher's site
Merigó, JM & Gil-Lafuente, AM 2013, 'A Method for Decision Making Based on Generalized Aggregation Operators', International Journal of Intelligent Systems, vol. 28, no. 5, pp. 453-473.
View/Download from: Publisher's site
Merigó, JM & Gil-Lafuente, AM 2013, 'Induced 2-tuple linguistic generalized aggregation operators and their application in decision-making', Information Sciences, vol. 236, pp. 1-16.
View/Download from: Publisher's site
MERIGÓ, JM & YAGER, RR 2013, 'GENERALIZED MOVING AVERAGES, DISTANCE MEASURES AND OWA OPERATORS', International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, vol. 21, no. 04, pp. 533-559.
View/Download from: Publisher's site
View description>>
The concept of moving average is studied. We analyze several extensions by using generalized aggregation operators, obtaining the generalized moving average. The main advantage is that it provides a general framework that includes a wide range of specific cases including the geometric and the quadratic moving average. This analysis is extended by using the generalized ordered weighted averaging (GOWA) and the induced GOWA (IGOWA) operator. Thus, we get the generalized ordered weighted moving average (GOWMA) and the induced GOWMA (IGOWMA) operator. Some of their main properties are studied. We further extend this approach by using distance measures suggesting the concept of distance moving average and generalized distance moving average. We also consider the case with the OWA and the IOWA operator, obtaining the generalized ordered weighted moving averaging distance (GOWMAD) and the induced GOWMAD (IGOWMAD) operator. The paper ends with an application in multi-period decision making.
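As background to the abstract above, the standard OWA and GOWA operators on which these moving-average extensions build can be stated as follows. This is the standard formulation from the aggregation-operator literature, not reproduced from the paper, and the window notation in the last line is an illustrative assumption:

```latex
% OWA: a weighted sum over reordered arguments, where b_j is the j-th largest of a_1,...,a_n
\mathrm{OWA}(a_1,\dots,a_n) = \sum_{j=1}^{n} w_j\, b_j,
\qquad w_j \ge 0,\;\; \sum_{j=1}^{n} w_j = 1

% GOWA: a generalized mean with parameter \lambda
% (\lambda = 1 recovers the OWA; \lambda = 2 the quadratic case; \lambda \to 0 the geometric case)
\mathrm{GOWA}(a_1,\dots,a_n) = \Bigl( \sum_{j=1}^{n} w_j\, b_j^{\lambda} \Bigr)^{1/\lambda}

% A moving-average variant applies the operator over a sliding window of length m of a series x:
\mathrm{GOWMA}_t = \Bigl( \sum_{j=1}^{m} w_j\, b_{j,t}^{\lambda} \Bigr)^{1/\lambda},
\qquad b_{j,t} \text{ the } j\text{-th largest of } x_{t-m+1},\dots,x_t
```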
Merigó, JM & Yager, RR 2013, 'Norm Aggregations and OWA Operators', AGGREGATION FUNCTIONS IN THEORY AND IN PRACTISE, vol. 228, pp. 141-151.
View/Download from: Publisher's site
Merigó, JM, Gil-Lafuente, AM & Xu, Y 2013, 'Decision making with induced aggregation operators and the adequacy coefficient', Economic Computation and Economic Cybernetics Studies and Research, vol. 47, no. 1, pp. 185-202.
View description>>
We present a method for decision making by using induced aggregation operators. This method is very useful for business decision making problems such as product management, investment selection and strategic management. We introduce a new aggregation operator that uses the induced ordered weighted averaging (IOWA) operator and the weighted average in the adequacy coefficient. We call it the induced ordered weighted averaging weighted averaging adequacy coefficient (IOWAWAAC) operator. The main advantage is that it is able to deal with complex attitudinal characters in the aggregation process. Thus, we are able to give a better representation of the problem considering the complex environment that affects the decisions. Moreover, it is able to provide a unified framework between the OWA and the weighted average. We generalize it by using generalized aggregation operators, obtaining the induced generalized OWAWAAC (IGOWAWAAC) operator. We study some of the main properties of this approach. We end the paper with a numerical example of the new approach in a group decision making problem in strategic management.
Merigó, JM, Rocha, C & Garcia-Agreda, S 2013, 'Entrepreneurial intervention in electronic markets: the influence of customer participation', International Entrepreneurship and Management Journal, vol. 9, no. 4, pp. 521-529.
View/Download from: Publisher's site
Merigó, JM, Xu, Y & Zeng, S 2013, 'Group decision making with distance measures and probabilistic information', Knowledge-Based Systems, vol. 40, pp. 81-87.
View/Download from: Publisher's site
Milne, D & Witten, IH 2013, 'An open-source toolkit for mining Wikipedia', Artificial Intelligence, vol. 194, pp. 222-239.
View/Download from: Publisher's site
Mu, K, Jin, Z, Liu, W, Zowghi, D & Wei, B 2013, 'Measuring the significance of inconsistency in the Viewpoints framework', SCIENCE OF COMPUTER PROGRAMMING, vol. 78, no. 9, pp. 1572-1599.
View/Download from: Publisher's site
View description>>
Measuring inconsistency is crucial to effective inconsistency management in software development. A complete measurement of inconsistency should focus on not only the degree but also the significance of inconsistency. However, most of the approaches available only take the degree of inconsistency into account. The significance of inconsistency has not yet been given much needed consideration. This paper presents an approach for measuring the significance of inconsistency arising from different viewpoints in the Viewpoints framework. We call an individual set of requirements belonging to different viewpoints a combined requirements collection in this paper. We argue that the significance of inconsistency arising in a combined requirements collection is closely associated with global priority levels of requirements involved in the inconsistency. Here we assume that the global priority level of an individual requirement captures the relative importance of every viewpoint including this requirement as well as the local priority level of the requirement within the viewpoint. Then we use the synthesis of global priority levels of all the requirements in a combined collection to measure the significance of the collection. Following this, we present a scoring matrix function to measure the significance of inconsistency in an inconsistent combined requirements collection, which describes the contribution made by each subset of the requirements collection to the significance of the set of requirements involved in the inconsistency. An ordering relationship between inconsistencies of two combined requirements collections, termed more significant than, is also presented by comparing their significance scoring matrix functions. Finally, these techniques were implemented in a prototype tool called IncMeasurer, which we developed as a proof of concept. © 2012 Elsevier B.V. All rights reserved.
Musiał, K & Kazienko, P 2013, 'Social networks on the Internet', World Wide Web, vol. 16, no. 1, pp. 31-72.
View/Download from: Publisher's site
Musial, K, Budka, M & Blysz, W 2013, 'Understanding the Other Side – The Inside Story of the INFER Project', Smart Innovation, Systems and Technologies, vol. 18, pp. 1-9.
View/Download from: Publisher's site
View description>>
In the last few years, collaboration between research institutions and industry has become a well-established process. Transfer of Knowledge (ToK) is required to accelerate the development of both sides and to enable them to unlock their full potential. The European Commission, through the Marie Curie Industry and Academia Partnerships & Pathways (IAPP) programme, supports cooperation between these two sectors at the international scale by funding research projects that aim, among other objectives, to enhance human mobility. IAPP projects offer people from different institutions the possibility of moving sector and country in order to provide, absorb and implement new knowledge in a professional industrial-academic environment. In this paper, one such project is presented, and both the academic and industry perspectives on the opportunities and challenges of Transfer of Knowledge are described. The Computational Intelligence Platform for Evolving and Robust Predictive Systems (INFER) is the IAPP project that serves as the case study for this paper. © Springer-Verlag Berlin Heidelberg 2013.
Musial, K, Budka, M & Juszczyszyn, K 2013, 'Creation and growth of online social network', World Wide Web, vol. 16, no. 4, pp. 421-447.
View/Download from: Publisher's site
Naik, T, Bressan, N, James, A & McGregor, C 2013, 'Design of temporal analysis for a novel premature infant pain profile using artemis', Journal of Critical Care, vol. 28, no. 1, pp. e4-e4.
View/Download from: Publisher's site
Niazi, M, Ikram, N, Gill, AQ & Ul Hassan, MR 2013, 'Special Issue on 'Empirical Studies in Software Engineering'', IET SOFTWARE, vol. 7, no. 6, pp. 295-297.
View/Download from: Publisher's site
Niu, L, Lu, J, Zhang, G & Wu, D 2013, 'FACETS: A cognitive business intelligence system', INFORMATION SYSTEMS, vol. 38, no. 6, pp. 835-862.
View/Download from: Publisher's site
View description>>
A cognitive decision support system called FACETS was developed and evaluated based on the situation retrieval (SR) model. The aim of FACETS is to provide decision makers cognitive decision support in ill-structured decision situations. The design and development of FACETS includes novel concepts, models, algorithms and system architecture, such as ontology and experience representation, situation awareness parsing, data warehouse query construction and guided situation presentation. The experiments showed that FACETS is able to play a significant role in supporting ill-structured decision making through developing and enriching situation awareness. © 2013 Elsevier Ltd.
Nizami, S, Green, JR & McGregor, C 2013, 'Implementation of Artifact Detection in Critical Care: A Methodological Review', IEEE Reviews in Biomedical Engineering, vol. 6, pp. 127-142.
View/Download from: Publisher's site
View description>>
Artifact detection (AD) techniques minimize the impact of artifacts on physiologic data acquired in critical care units (CCU) by assessing quality of data prior to clinical event detection (CED) and parameter derivation (PD). This methodological review introduces unique taxonomies to synthesize over 80 AD algorithms based on these six themes: 1) CCU; 2) physiologic data source; 3) harvested data; 4) data analysis; 5) clinical evaluation; and 6) clinical implementation. Review results show that most published algorithms: a) are designed for one specific type of CCU; b) are validated on data harvested only from one OEM monitor; c) generate signal quality indicators (SQI) that are not yet formalized for useful integration in clinical workflows; d) operate either in standalone mode or coupled with CED or PD applications; e) are rarely evaluated in real-time; and f) are not implemented in clinical practice. In conclusion, it is recommended that AD algorithms conform to generic input and output interfaces with commonly defined data: 1) type; 2) frequency; 3) length; and 4) SQIs. This shall promote: a) reusability of algorithms across different CCU domains; b) evaluation on different OEM monitor data; c) fair comparison through formalized SQIs; d) meaningful integration with other AD, CED and PD algorithms; and e) real-time implementation in clinical workflows. © 2008-2011 IEEE.
Oberst, S, Lai, JCS & Marburg, S 2013, 'Guidelines for numerical vibration and acoustic analysis of disc brake squeal using simple models of brake systems', Journal of Sound and Vibration, vol. 332, no. 9, pp. 2284-2299.
View/Download from: Publisher's site
Palacios‐Marqués, D, Peris‐Ortiz, M & Merigó, JM 2013, 'The effect of knowledge transfer on firm performance', Management Decision, vol. 51, no. 5, pp. 973-985.
View/Download from: Publisher's site
View description>>
Purpose: This work aims to analyse the effect of a holistic business view, competency-based management, continuous learning and information technology infrastructure on knowledge transfer, and the subsequent effect on firm performance. Design/methodology/approach: Structural equation models and a survey of 222 firms from the Spanish biotechnology and telecommunications industries verify the mediator role of knowledge transfer. Findings: The implications of confirming these hypotheses for managers are that by emphasising the creation of a holistic business view and competency-based management, promoting continuous learning, and improving information technology infrastructure, managers will improve knowledge transfer and positively influence the creation of superior firm performance. Originality/value: It is shown that in knowledge-intensive industries, knowledge transfer acts as a mediating variable between a holistic view of the firm, competency-based management, continuous learning and information and communication technologies infrastructure, and firm performance.
Parvin, S, Hussain, FK & Hussain, OK 2013, 'Conjoint trust assessment for secure communication in cognitive radio networks', MATHEMATICAL AND COMPUTER MODELLING, vol. 58, no. 5-6, pp. 1340-1350.
View/Download from: Publisher's site
View description>>
With the rapid development of wireless communication, the growth of Cognitive Radio (CR) is increasing day by day. Because CR is flexible and operates on the wireless network, there are more security threats to CR technology than to the traditional radio environment. In addition, there is no comprehensive framework for achieving security in Cognitive Radio Networks (CRNs), and the role of trust in achieving security in CRNs has not been explored previously. Security vulnerability in cognitive radio technology is unavoidable due to the intrinsic nature of the technology, so it is critical to ensure system security in CRNs. The issue of secure communication in CRNs thus becomes more important than it is in conventional wireless networks. In this paper, we propose a conjoint trust assessment approach (combining trust assessments from the Primary User Network and the Secondary User Network) in a CRN to counter the security threats brought about by untrustworthy entities, such as selfish, malicious, and faulty nodes, and to ensure secure spectrum sharing in CRNs. A numerical analysis shows the feasibility of our proposed approach.
Parvin, S, Hussain, FK, Hussain, OK, Thein, T & Park, JS 2013, 'Multi-cyber framework for availability enhancement of cyber physical systems', COMPUTING, vol. 95, no. 10-11, pp. 927-948.
View/Download from: Publisher's site
View description>>
With the rapid growth of wireless communication, the deployment of cyber-physical systems (CPS) is increasing day by day. As a cyber-physical system involves a tight coupling between the physical and computational components, it is critical to ensure that the system, apart from being secure, is available for both the cyber and physical processes. Traditional methods have generally been employed to defend an infrastructure system against physical threats. However, this does not guarantee that the availability of the system will always be high. In this paper, we propose a multi-cyber (computational unit) framework, based on a Markov model, to improve the availability of CPS. We evaluate the effectiveness of our proposed framework in terms of the availability, downtime, downtime cost and reliability of the CPS framework. © 2012 Her Majesty the Queen in Right of Australia.
Peng, S, Wang, G & Yu, S 2013, 'Modeling the dynamics of worm propagation using two-dimensional cellular automata in smartphones', Journal of Computer and System Sciences, vol. 79, no. 5, pp. 586-595.
View/Download from: Publisher's site
Pugh, E, Thommandram, A, Ng, E, Mcgregor, C, Eklund, M, Narang, I, Belik, J & James, A 2013, 'Classifying neonatal spells using real-time temporal analysis of physiological data streams—algorithm development', Journal of Critical Care, vol. 28, no. 1, pp. e9-e9.
View/Download from: Publisher's site
Pugh, JE, Thommandram, A, McGregor, C, Eklund, M & James, A 2013, 'Classifying neonatal spells using real-time temporal analysis of physiological data streams—verification tests', Journal of Critical Care, vol. 28, no. 6, pp. e40-e41.
View/Download from: Publisher's site
Qumer Gill, A & Bunker, D 2013, 'Towards the development of a cloud‐based communication technologies assessment tool', VINE, vol. 43, no. 1, pp. 57-77.
View/Download from: Publisher's site
View description>>
Purpose: In distributed adaptive development environments (DADE), a primary concern is that of human communication and knowledge sharing among developers. Developers' task performance is enhanced when their task needs are aligned with the communication media or technology capabilities of the development environment. What are the actual communication needs of developers, and how do we enable developers to self-assess and select appropriate communication technology for their tasks in the DADE? The purpose of this paper is to investigate and present research on developers' needs for communication technologies in the context of the DADE. Design/methodology/approach: The authors applied an exploratory qualitative research method to investigate, analyze and integrate survey information sourced from 40 developers in order to identify their communication technology needs. Based on this information, the authors then built a practical tool, the communication technologies assessment tool (CTAT), to assist developers in the self-assessment and selection of appropriate communication technologies for their DADE, and to share this assessment knowledge with other developers or teams located in various DADEs. Findings: The results of this research suggest that an effective CTAT should be an integral part of the DADE, and that a DADE should have a 'single source of information' in order to avoid possible communication inconsistencies and ambiguities. Originality/value: The study results and the resultant CTAT may help developers to make...
Rehman, ZU, Hussain, FK & Hussain, OK 2013, 'Frequency-based similarity measure for multimedia recommender systems', MULTIMEDIA SYSTEMS, vol. 19, no. 2, pp. 95-102.
View/Download from: Publisher's site
View description>>
Personalized recommendation has become a pivotal aspect of online marketing and e-commerce as a means of overcoming the information overload problem. There are several recommendation techniques, but collaborative recommendation is the most effective and widely used. It relies on either item-based or user-based nearest-neighbourhood algorithms, which utilize some kind of similarity measure to assess the similarity between different users or items when generating recommendations. In this paper, we present a new similarity measure based on rating frequency and compare its performance with the most commonly used similarity measures. The applicability and use of this similarity measure from the perspective of multimedia content recommendation is presented and discussed.
Saberi, M, Mirtalaie, MS, Hussain, FK, Azadeh, A, Hussain, OK & Ashjari, B 2013, 'A granular computing-based approach to credit scoring modeling', NEUROCOMPUTING, vol. 122, no. 1, pp. 100-115.
View/Download from: Publisher's site
View description>>
The credit card industry has been growing rapidly, and thus huge volumes of consumers' credit data are collected by the credit departments of banks. Credit scoring managers often evaluate a consumer's credit from intuitive experience; with the support of credit classification models, however, managers can accurately evaluate an applicant's credit score. In this study, a neurocomputing-based granular approach is proposed to model credit scoring. Granular computing is used to compute the sizes of the training and testing groups. Artificial neural networks (ANN) and data envelopment analysis (DEA) are used to model credit lending decisions in an online and an offline manner, respectively. The proposed method is composed of three distinct stages based on the concepts of trust and credibility: trust is introduced and modeled via ANN in the online module, while credibility is modeled via DEA in the offline module. This paper is a pioneer in examining the concept of granularity for selecting the optimum sizes of the testing and training groups in the machine learning area. In addition, proposing flexible trust ranges, in contrast to the current constant ones, supports the importance to financial markets of customers with higher credit scores. To show the applicability and superiority of the proposed algorithm, it is applied to a credit-card data set obtained from the UCI repository. © 2013 Elsevier B.V.
Seppelt, R, Bankamp, D, Voinov, AA & Rizzoli, A 2013, '6th International Congress on Environmental Modelling and Software (iEMSs): “Managing Resources of a Limited Planet: Pathways and Visions under Uncertainty”: A congress report', Environmental Modelling & Software, vol. 43, pp. 160-162.
View/Download from: Publisher's site
Tafavogh, S, Navarro, KF, Catchpoole, DR & Kennedy, PJ 2013, 'Non-parametric and integrated framework for segmenting and counting neuroblastic cells within neuroblastoma tumor images', MEDICAL & BIOLOGICAL ENGINEERING & COMPUTING, vol. 51, no. 6, pp. 645-655.
View/Download from: Publisher's site
View description>>
Neuroblastoma is a malignant tumor and a childhood cancer that derives from the neural crest. The number of neuroblastic cells within the tumor provides significant prognostic information for pathologists, but the enormous number of neuroblastic cells makes the process of counting tedious and error-prone. We propose a user-interaction-independent framework that segments cellular regions, splits overlapping cells and counts the total number of single neuroblastic cells. Our novel segmentation algorithm regards an image as a feature space constructed from the joint spatial-intensity features of colour pixels. It clusters the pixels within the feature space using mean-shift and then partitions the image into multiple tiles. We propose a novel colour analysis approach to select the tiles with intensity similar to the cellular regions. The selected tiles contain a mixture of single and overlapping cells, so we also propose a cell counting method that analyses the morphology of the cells and discriminates between overlapping and single cells. Ultimately, we apply watershed to split overlapping cells. The results have been evaluated by a pathologist. Our segmentation algorithm was compared against adaptive thresholding, and our cell counting algorithm was compared with two state-of-the-art algorithms. The overall cell counting accuracy of the system is 87.65%. © 2013 International Federation for Medical and Biological Engineering.
Tang, J, Chen, L, King, I & Wang, J 2013, 'Introduction to Special section on Large-scale Data Mining', Data & Knowledge Engineering, vol. 87, pp. 355-356.
View/Download from: Publisher's site
ten Bhomer, M & van den Hoven, E 2013, 'Interaction design for supporting communication between Chinese sojourners', PERSONAL AND UBIQUITOUS COMPUTING, vol. 17, no. 1, pp. 145-157.
View/Download from: Publisher's site
View description>>
In our global village, distance is no longer a barrier to traveling. People experience new cultures and face the accompanying difficulties in order to live anywhere. Social support can help these sojourners to cope with difficulties such as culture shock. In this paper, we investigate how computer-mediated communication (CMC) tools can facilitate social support when living physically separated from loved ones in different cultures. The goal is to understand the design considerations necessary to design new CMC tools. We studied the communication practices of Chinese sojourners living in the Netherlands and the use of a technology probe with a novel video communication system. These results led to recommendations which can help designers to design interactive communication tools that facilitate communication across cultures. We conclude the paper with an interactive communication device called Circadian, which was designed based on these recommendations. We found the design recommendations to be abstract enough to leave space for creativity while providing a set of clear requirements on which to base design decisions.
Van de Velde, S, Vander Stichele, R, Fauquert, B, Geens, S, Heselmans, A, Ramaekers, D, Kunnamo, I & Aertgeerts, B 2013, 'EBMPracticeNet: A Bilingual National Electronic Point-Of-Care Project for Retrieval of Evidence-Based Clinical Guideline Information and Decision Support', JMIR Research Protocols, vol. 2, no. 2, pp. e23-e23.
View/Download from: Publisher's site
View description>>
Background: In Belgium, the construction of a national electronic point-of-care information service, EBMPracticeNet, was initiated in 2011 to optimize quality of care by promoting evidence-based decision-making. The collaboration of the government, health care providers, evidence-based medicine (EBM) partners, and vendors of electronic health records (EHR) is unique to this project. All Belgian health care professionals get free access to an up-to-date database of validated Belgian and nearly 1000 international guidelines, incorporated in a portal that also provides EBM information from sources other than guidelines, including computerized clinical decision support that is integrated in the EHRs.
Objective: The objective of this paper was to describe the development strategy, the overall content, and the management of EBMPracticeNet, which may be of relevance to other health organizations creating national or regional electronic point-of-care information services.
Methods: Several candidate providers of comprehensive guideline solutions were evaluated and one database was selected. Translation of the guidelines to Dutch and French was done with translation software, post-editing by translators, and medical proofreading. A strategy was determined to adapt the guideline content to the Belgian context. Acceptance of the computerized clinical decision support tool has been tested and a randomized controlled trial is planned to evaluate the effect on process and patient outcomes.
Results: Currently, EBMPracticeNet is in a "work in progress" state. Reference is made to the results of a pilot study and to further planned research, including a randomized controlled trial.
Conclusions: The collaboration of government, health care providers, EBM partners, and vendors of EHRs is unique. The potential value of the project is great. The link between all the EHRs from different vendors and a national database held on a single platform that is controlle...
van den Hoven, E, van de Garde-Perik, E, Offermans, S, van Boerdonk, K & Lenssen, K-MH 2013, 'Moving Tangible Interaction Systems to the Next Level', COMPUTER, vol. 46, no. 8, pp. 70-76.
View/Download from: Publisher's site
View description>>
Understanding tangible interaction's foundational concepts can lead to systems with direct, integrated, and meaningful data control and representation.
Verma, P, Singh, R & Singh, AK 2013, 'A framework to integrate speech based interface for blind web users on the websites of public interest', Human-centric Computing and Information Sciences, vol. 3, no. 1, pp. 1-18.
View/Download from: Publisher's site
View description>>
Despite the many assistive tools available for browsing the web, blind persons are not able to perform tasks on the internet that are done by persons without such a disability. Even the futuristic social networking sites and other websites using the features of Web 2.0 indicate a less accessible/responsible web. In this paper, we propose a framework which can be used by websites of public interest to make their important utilities better accessible and usable to blind web users. The approach is based on providing an alternate access system on the fly using one single website. The framework makes use of existing technologies like JavaScript, available speech APIs etc. and therefore provides a lightweight and robust solution to the accessibility problem. As a case study, we demonstrate the usefulness of the proposed framework by showing its working on a key functionality of the Indian Railways Reservation Website.
Voinov, A & Shugart, HH 2013, '‘Integronsters’, integral and integrated modeling', Environmental Modelling & Software, vol. 39, pp. 149-158.
View/Download from: Publisher's site
Woodside, AG, Sood, S & Muniz, KM 2013, 'Creating and Interpreting Visual Storytelling Art in Extending Thematic Apperception Tests and Jung's Method of Interpreting Dreams', Advances in Culture, Tourism and Hospitality Research, vol. 7, pp. 15-45.
View/Download from: Publisher's site
View description>>
The main thesis here is that the stories that some brands tell to consumers enable consumers to achieve archetypal experiences. Examining the stories consumers tell in natural contexts involving shopping for and using brands informs explanations of associations of archetypes, brands, and consumers. The study advances the use of degrees-of-freedom analysis (DFA) and creating visual narrative art (VNA) as useful steps for confirming or disconfirming whether or not the stories consumers tell have themes, events, and outcomes that match with the core storylines told by brands. As a proposal, an extension of thematic apperception tests (TATs) is relevant in applying the DFA to brand-consumer storytelling research. The study includes a review of early work on TATs, DFA, archetypal theory, and how brands become icons. The study's theory, method, and findings provide useful tools for brand managers and researchers on issues that relate to psychology and marketing. © 2013 by Emerald Group Publishing Limited.
Wu, J, Wang, J, Lu, H, Dong, Y & Lu, X 2013, 'Short term load forecasting technique based on the seasonal exponential adjustment method and the regression model', Energy Conversion and Management, vol. 70, no. 1, pp. 1-9.
View/Download from: Publisher's site
View description>>
For an energy-limited economy, it is crucial to forecast load demand accurately. This paper is devoted to a 1-week-ahead daily load forecasting approach in which the load demand series is predicted by employing information from earlier days that are similar to the forecast day. As in many nonlinear systems, a seasonal item and a trend item coexist in load demand datasets. In this paper, the existence of the seasonal item in the load demand data series is first verified with the Kendall τ correlation test. Then, in the belief that forecasting the seasonal item and the trend item separately would improve accuracy, hybrid models combining the seasonal exponential adjustment method (SEAM) with regression methods are proposed, where SEAM and the regression models are employed to forecast the seasonal and trend items respectively. Comparisons of the quartile values as well as the mean absolute percentage error values demonstrate that this forecasting technique significantly improves accuracy across eleven different models applied to the trend item forecasting. This superior performance of the separate forecasting technique is further confirmed by paired-sample t-tests. © 2013 Elsevier B.V. All rights reserved.
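The decompose-forecast-recombine idea described in this abstract can be illustrated with a minimal sketch. Everything below is an editorial assumption for illustration only: a weekly (period-7) multiplicative seasonal index stands in for SEAM, and a single least-squares line stands in for the paper's eleven trend-item regression models.

```python
import numpy as np

def forecast(load, horizon=7, period=7):
    """Forecast `horizon` future values of a daily load series by
    forecasting the seasonal item and the trend item separately."""
    load = np.asarray(load, float)
    t = np.arange(len(load))
    # Seasonal item: one multiplicative index per position in the cycle.
    idx = np.array([load[p::period].mean() for p in range(period)]) / load.mean()
    # Removing the seasonal item leaves (approximately) the trend item.
    deseason = load / idx[t % period]
    # Trend item: an ordinary least-squares straight line.
    slope, intercept = np.polyfit(t, deseason, 1)
    tf = np.arange(len(load), len(load) + horizon)
    # Recombine: the trend forecast re-scaled by the seasonal index.
    return (intercept + slope * tf) * idx[tf % period]
```

On a series with a clean weekly pattern and a linear trend, the recombined forecast tracks the true continuation closely; real load data would call for the paper's more careful seasonal estimation.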
Wu, Z, Xu, G, Lu, C, Chen, E, Zhang, Y & Zhang, H 2013, 'Position-wise contextual advertising: Placing relevant ads at appropriate positions of a web page', Neurocomputing, vol. 120, no. 1, pp. 524-535.
View/Download from: Publisher's site
View description>>
Web advertising, a form of online advertising, which uses the Internet as a medium to post product or service information and attract customers, has become one of the most important marketing channels. As one prevalent type of web advertising, contextual
Xie, Y, Hu, J, Xiang, Y, Yu, S, Tang, S & Wang, Y 2013, 'Modeling Oscillation Behavior of Network Traffic by Nested Hidden Markov Model with Variable State-Duration', IEEE Transactions on Parallel and Distributed Systems, vol. 24, no. 9, pp. 1807-1817.
View/Download from: Publisher's site
Xu, C, Du, C, Zhao, GF & Yu, S 2013, 'A novel model for user clicks identification based on hidden semi-Markov', Journal of Network and Computer Applications, vol. 36, no. 2, pp. 791-798.
View/Download from: Publisher's site
Xu, G, Yu, J & Lee, W 2013, 'Guest editorial: Social networks and social Web mining', World Wide Web, vol. 16, no. 5-6, pp. 541-544.
View/Download from: Publisher's site
Xu, Y, Shi, P, Merigó, JM & Wang, H 2013, 'Some proportional 2-tuple geometric aggregation operators for linguistic decision making', Journal of Intelligent & Fuzzy Systems, vol. 25, no. 3, pp. 833-843.
View/Download from: Publisher's site
Yang-Yin Lin, Jyh-Yeong Chang, Pal, NR & Chin-Teng Lin 2013, 'A Mutually Recurrent Interval Type-2 Neural Fuzzy System (MRIT2NFS) With Self-Evolving Structure and Parameters', IEEE Transactions on Fuzzy Systems, vol. 21, no. 3, pp. 492-509.
View/Download from: Publisher's site
View description>>
In this paper, a mutually recurrent interval type-2 neural fuzzy system (MRIT2NFS) is proposed for the identification of nonlinear and time-varying systems. The MRIT2NFS uses type-2 fuzzy sets in order to enhance noise tolerance of the system. In the MRIT2NFS, the antecedent part of each recurrent fuzzy rule is defined using interval type-2 fuzzy sets, and the consequent part is of the Takagi-Sugeno-Kang type with interval weights. The antecedent part of MRIT2NFS forms a local internal feedback and interaction loop by feeding the rule firing strength of each rule to others including itself. The consequent is a linear combination of exogenous input variables. The learning of MRIT2NFS starts with an empty rule base and all rules are learned online via structure and parameter learning. The structure learning of MRIT2NFS uses online type-2 fuzzy clustering. For parameter learning, the consequent part parameters are tuned by rule-ordered Kalman filter algorithm to reinforce parameter learning ability. The type-2 fuzzy sets in the antecedent and weights representing the mutual feedback are learned by the gradient descent algorithm. After the training, a weight-elimination scheme eliminates feedback connections that do not have much effect on the network behavior. This method can efficiently remove redundant recurrence and interaction weights. Finally, the MRIT2NFS is used for system identification under both noise-free and noisy environments. For this, we consider both time series prediction and nonlinear plant modeling. Compared with type-1 recurrent fuzzy neural networks, simulation results show that our approach produces smaller root-mean-squared errors using the same number of iterations. © 2013 IEEE.
Yu, D, Merigó, JM & Zhou, L 2013, 'Interval-valued multiplicative intuitionistic fuzzy preference relations', International Journal of Fuzzy Systems, vol. 15, no. 4, pp. 412-422.
View description>>
Inspired by the idea of the multiplicative intuitionistic preference relation (Xia MM et al., Preference relations based on intuitionistic multiplicative information, IEEE Transactions on Fuzzy Systems, 2013, 21(1): 113-133), this paper develops a new preference relation called the interval-valued multiplicative intuitionistic preference relation. The basic operations for interval-valued multiplicative intuitionistic preference information and its aggregation techniques are analyzed. An interval-valued multiplicative intuitionistic group decision making model is presented in which experts provide their preference relations through interval-valued multiplicative intuitionistic fuzzy expressions, and a real case about talent introduction at Zhejiang University of Finance and Economics is given to illustrate the methods. © 2013 TFSA.
Zeng, S, Li, W & Merigó, JM 2013, 'Extended induced ordered weighted averaging distance operators and their application to group decision-making', International Journal of Information Technology & Decision Making, vol. 12, no. 04, pp. 789-811.
View/Download from: Publisher's site
View description>>
The induced ordered weighted averaging distance (IOWAD) approach is very suitable in situations in which the available information is represented with exact numerical values. In this paper, we develop some extended IOWAD operators: the linguistic induced ordered weighted averaging distance (LIOWAD) operator, the uncertain induced ordered weighted averaging distance (UIOWAD) operator and the fuzzy induced ordered weighted averaging distance (FIOWAD) operator. Their main objective is to assess uncertain situations in which the available information is given in the form of linguistic variables, interval numbers and fuzzy numbers. Some special cases of these three new extensions are studied. Finally, we develop an application of the new operators in a group decision-making problem under an uncertain environment and illustrate it with a numerical example.
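For readers unfamiliar with the base operator these extensions build on, here is a sketch of our reading of the real-valued IOWAD aggregation: the absolute coordinate differences are reordered by a separate order-inducing variable and then weighted. The function name and the decreasing-order convention are editorial assumptions, not the paper's notation.

```python
def iowad(u, x, y, w):
    """Induced OWA distance between vectors x and y: reorder the absolute
    differences |x_i - y_i| by decreasing order-inducing value u_i, then
    take the weighted average with weights w (assumed to sum to 1)."""
    pairs = sorted(zip(u, x, y), key=lambda p: -p[0])  # decreasing u
    return sum(wi * abs(xi - yi) for wi, (_, xi, yi) in zip(w, pairs))
```

With equal weights this collapses to the normalized Hamming distance, and choosing u as the differences themselves recovers an ordinary OWA distance.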
Zeng, S, Merigó, JM & Su, W 2013, 'The uncertain probabilistic OWA distance operator and its application in group decision making', Applied Mathematical Modelling, vol. 37, no. 9, pp. 6266-6275.
View/Download from: Publisher's site
Zhang, Z, Lin, H, Liu, K, Wu, D, Zhang, G & Lu, J 2013, 'A hybrid fuzzy-based personalized recommender system for telecom products/services', INFORMATION SCIENCES, vol. 235, no. 1, pp. 117-129.
View/Download from: Publisher's site
View description>>
The Internet creates excellent opportunities for businesses to provide personalized online services to their customers. Recommender systems are designed to automatically generate personalized suggestions of products/services to customers. Because various uncertainties exist within both product and customer data, it is a challenge to achieve high recommendation accuracy. This study develops a hybrid recommendation approach which combines user-based and item-based collaborative filtering techniques with fuzzy set techniques and applies it to mobile product and service recommendation. It particularly implements the proposed approach in an intelligent recommender system software called Fuzzy-based Telecom Product Recommender System (FTCP-RS). Experimental results demonstrate the effectiveness of the proposed approach and the initial application shows that the FTCP-RS can effectively help customers to select the most suitable mobile products or services. © 2012 Elsevier Inc. All rights reserved.
Zong, Y, Jin, P, Xu, D & Pan, R 2013, 'A Clustering Algorithm based on Local Accumulative Knowledge', Journal of Computers, vol. 8, no. 2, pp. 365-371.
View/Download from: Publisher's site
View description>>
Clustering, an important unsupervised learning technique, is widely used to discover the inherent structure of a given data set. Because clustering depends on the application, researchers use different models to define clustering problems. Heuristic clustering algorithms are an efficient way to deal with clustering problems defined by a combinatorial optimization model, but initialization sensitivity is an inevitable problem. In past decades, many methods have been proposed to deal with this problem. In this paper, on the contrary, we take advantage of the initialization sensitivity to design a new clustering algorithm. We first run K-means, a widely used heuristic clustering algorithm, on the data set multiple times to generate several clustering results; second, we propose a structure named Local Accumulative Knowledge (LAKE) to capture the common information of the clustering results; third, we execute the single-linkage algorithm on LAKE to generate a rough clustering result; eventually, we assign the remaining data objects to the corresponding clusters. Experimental results on synthetic and real-world data sets demonstrate the superiority of the proposed approach in terms of clustering quality measures. © 2013 ACADEMY PUBLISHER.
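The steps listed in this abstract can be sketched roughly as follows. This is an editorial illustration, not the paper's algorithm: plain co-association counting stands in for the LAKE structure, and the final assignment step is folded into a single-linkage pass over all points.

```python
import numpy as np

def kmeans(X, k, rng, iters=20):
    # Plain Lloyd's algorithm with random initial centroids drawn from X.
    X = np.asarray(X, float)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return labels

def single_linkage(dist, k):
    # Naive agglomerative single-linkage on a precomputed distance matrix.
    clusters = [{i} for i in range(len(dist))]
    while len(clusters) > k:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(dist[i, j] for i in clusters[a] for j in clusters[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] |= clusters.pop(b)
    labels = np.empty(len(dist), dtype=int)
    for c, members in enumerate(clusters):
        for i in members:
            labels[i] = c
    return labels

def lake_cluster(X, k, n_runs=10, seed=0):
    rng = np.random.default_rng(seed)
    n = len(X)
    agreement = np.zeros((n, n))
    for _ in range(n_runs):                        # step 1: repeated K-means
        lab = kmeans(X, k, rng)
        agreement += lab[:, None] == lab[None, :]  # step 2: accumulate knowledge
    # Step 3: single-linkage on the disagreement (distance) matrix gives the
    # final partition; frequently co-clustered points end up together.
    return single_linkage(1.0 - agreement / n_runs, k)
```

The exploited effect is exactly the initialization sensitivity the abstract mentions: pairs of points that land in the same K-means cluster across many random initializations accumulate low pairwise distance and are merged by single-linkage.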
Adak, C & Chaudhuri, BB 2013, 'Extraction of Doodles and Drawings from Manuscripts', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Springer Berlin Heidelberg, pp. 515-520.
View/Download from: Publisher's site
View description>>
In this paper we propose an approach to separate the non-texts from the texts of a manuscript. The non-texts are mainly in the form of doodles and drawings of some exceptional thinkers and writers. These have enormous historical value for the study of those writers' subconscious as well as productive minds. We also propose a computational approach to recover the struck-out texts to reduce human effort. The proposed technique has a preprocessing stage, which removes noise using a median filter and segments the object region using fuzzy c-means clustering. Connected component analysis then finds the major portions of non-texts, and window examination eliminates the partially attached texts. The struck-out texts are extracted by eliminating straight lines, measuring the degree of continuity, and applying some morphological operations. © Springer-Verlag 2013.
Ahad, MT, Dyson, LE & Gay, V 2013, 'Exploring m-banking for rural SMEs from the bank's perspective: A focus group study in Bangladesh', Proceedings - Pacific Asia Conference on Information Systems, PACIS 2013, Pacific Asia Conference on Information Systems, The Association for Information Systems (AIS), Jeju Island, South Korea, pp. 1-14.
View description>>
Relatively little is known about m-banking use by rural small and medium enterprises (SMEs) in Bangladesh. This research fills the gap by presenting the key advantages of m-banking for rural SMEs that a bank can offer in Bangladesh. The research also reports on the critical country-level factors, organizational success factors and obstacles in m-banking adoption for rural SMEs. A focus group was interviewed and the data were analysed using NVivo9. The findings indicate that improved banking facilities, an opportunity to create employment in rural areas, easy settlement of trade between SMEs, and better cash management are some important advantages of m-banking for rural SMEs. Technology and human resource capabilities of the banks, together with the development of mobile infrastructure, are influential factors in m-banking development in Bangladesh. This paper also identifies engagement with the SMEs as an important organizational factor in m-banking diffusion for rural SMEs that has rarely been identified in prior literature. However, the major constraints are operational, such as cash management, policy and regulation, and administrative obstacles. Recommendations for a business version of m-banking, joint advertisement by banks and mobile telecom organisations, and a regional m-banking platform are also new knowledge in m-banking research. This is one of the few papers written from the perspective of a bank, since most of the literature approaches the matter from the consumer standpoint. The outcomes and results of this research will be of potential value to the government, banks and mobile telecommunications providers in accelerating the development of m-banking in Bangladesh and other developing countries.
Ahadi, A & Lister, R 2013, 'Geek genes, prior knowledge, stumbling points and learning edge momentum', Proceedings of the ninth annual international ACM conference on International computing education research, ICER '13: International Computing Education Research Conference, ACM, San Diego, CA, USA, pp. 123-128.
View/Download from: Publisher's site
View description>>
Computing academics report bimodal grade distributions in their CS1 classes. Some academics believe that such a distribution is due to there being an innate talent for programming, a geek gene, which some students have while other students do not. Robins introduced the concept of learning edge momentum, which offers an alternative explanation for the purported bimodal grade distribution. In this paper, we analyze empirical data from a real introductory programming class, looking for evidence of geek genes, learning edge momentum and other possible factors.
Al-Jaafreh, AO, Gill, A, Al-Ani, A, Al-adaileh, R & Alzoubi, Y 2013, 'Factors influencing customer's initial trust of internet banking services in the Jordanian context: A review', Creating Global Competitive Economies: 2020 Vision Planning and Implementation - Proceedings of the 22nd International Business Information Management Association Conference, IBIMA 2013, International Business Information Management, IBIM, Rome, Italy, pp. 281-288.
View description>>
Internet banking services (IBS) offer customers and banks many benefits. IBS have been widely adopted and used in developed countries; however, IBS adoption in developing countries such as Jordan is still low. Lack of customers' trust is considered the most important impediment to the use of IBS in developing countries. The aim of this study is to investigate and identify the factors that influence customers' initial trust of IBS in the Jordanian context. This paper adopts a qualitative literature survey approach and reports two main categories: a human category and an information technology category. The human category includes personality-based trust, cognition-based trust (reputation), institution-based trust (structural assurance), social factors (culture) and supporting factors (relative advantages). The information technology category includes website factors (security, privacy, and general online experiences). We argue that these factors can be useful for organisations in understanding and addressing customers' initial trust of IBS in the Jordanian context.
Andrews, T, Dyson, LE & Wishart, J 2013, 'Supporting Practitioners in Implementing Mobile Learning and Overcoming Ethical Concerns: A Scenario-Based Approach', 12th World Conference on Mobile and Contextual Learning (mLearn 2013), 12th World Conference on Mobile and Contextual Learning (mLearn 2013), Hamad bin Khalifa University Press (HBKU Press), Qatar, pp. 1-8.
View/Download from: Publisher's site
View description>>
Ethical concerns about mobile learning have been raised across all sectors of the educational system, sometimes resulting in the banning of mobile phones in schools and retarding the adoption of mobile learning as rapidly as might have initially been envisaged. A way of dealing with this problem is to empower mobile learning practitioners and researchers to deal effectively with ethical dilemmas through the development of their ethical reasoning. A commonly accepted approach to ethical development is by means of scenarios, to which ethical principles are applied in order to produce solutions. In this paper four scenarios are presented which were developed at two mobile learning and ethics workshops conducted in 2012. An ethics framework for the analysis of the scenarios is described and finally a strategy is outlined for conducting professional development of teachers and academics as well as training for student teachers. The authors propose that ethical scenarios provide not only a means of developing the competence of teachers and academics in dealing with ethical issues in their mobile learning practice and research, but may well lead to the greater adoption of mobile learning as fears of ethical issues diminish once a way of providing solutions is demonstrated. Finally, scenarios are seen as a tool to foster conversations with educational managers and administrators in order to promote policy development and practical responses to ethical issues in mobile learning.
Atif, A, Richards, D & Bilgin, A 2013, 'A student retention model: Empirical, theoretical and pragmatic considerations', Proceedings of the 24th Australasian Conference on Information Systems.
View description>>
This research-in-progress paper draws on an extensive body of literature related to student retention. The purpose of this study is to develop a student retention model utilising student demographic data and a combination of data from student information systems, course management systems and other similar tools to accurately predict academic success of students at our own institution. Our research extends Tinto’s model by incorporating a number of components from Bean’s, Astin’s and Swail’s model. Our proposed eclectic model consists of seven components, identified as determinants of student retention. The strength in the model lies in its ability to help institutions work proactively to support student retention and achievement. The proposed research methodology to be used in this study is “a mixed-methods concurrent triangulation strategy”. The results are expected to indicate which of the factors are most important in developing an information system to predict and suggest interventions to improve retention.
Atif, A, Richards, D, Bilgin, A & Marrone, M 2013, 'Learning analytics in higher education: A summary of tools and approaches', 30th Annual conference on Australian Society for Computers in Learning in Tertiary Education, ASCILITE 2013, pp. 68-72.
View description>>
Higher education institutions recently have been drawing on methods from learning analytics to make decisions about learners’ academic progress, predictions about future performance and to recognise potential issues. As the use of learning analytics in higher education is a relatively new area of practice and research, the intent of this paper is to provide an overview of learning analytics including a summary of some exemplar tools. Finally we conclude the paper with a discussion on challenges and ethical issues.
Azadeh, A, Saberi, M, Atashbar, NZ, Chang, E & Pazhoheshfar, P 2013, 'Z-AHP: A Z-number extension of fuzzy analytical hierarchy process', 2013 7th IEEE International Conference on Digital Ecosystems and Technologies (DEST), 2013 7th IEEE International Conference on Digital Ecosystems and Technologies (DEST) - Complex Environment Engineering, IEEE, Menlo Park, CA, pp. 141-147.
View/Download from: Publisher's site
Bakker, S, van den Hoven, E & Eggen, B 2013, 'FireFlies', Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction, TEI'13: Seventh International Conference on Tangible, Embedded, and Embodied Interaction, ACM, Barcelona, Spain, pp. 57-64.
View/Download from: Publisher's site
View description>>
This paper presents a research-through-design study into interactive systems for a primary school setting to support teachers' everyday tasks. We developed an open-ended interactive system called FireFlies, which is intended to be interacted with in the periphery of the teacher's attention and thereby become an integral part of everyday routines. FireFlies uses light-objects and audio as a (background) information display. Furthermore, teachers can manipulate the light and audio through physical interaction. A working prototype of FireFlies was deployed in four different classrooms for six weeks. Qualitative results reveal that all teachers found a relevant way of working with FireFlies, which they applied every day of the evaluation. After the study had ended and the systems were removed from the schools, the teachers kept reaching for the devices and mentioned they missed FireFlies, which shows that it had become part of their everyday routine.
Balvey, A, Gil Lafuente, AM, Merigó, JM & Garriga, X 2013, 'Application of the forgotten effects model to the early diagnosis of hereditary hemochromatosis', Decision Making Systems in Business Administration, International Conference on Modeling and Simulation in Engineering, Economics and Management for Sustainable Development, World Scientific Publishing Co Pte Ltd, Rio de Janeiro, Brazil, pp. 407-420.
Bano Sahibzada, M & Zowghi, D 2013, 'Service Oriented Requirements Engineering: Practitioner’s Perspective', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), International Conference on Service Oriented Computing, Springer Berlin Heidelberg, Shanghai, China, pp. 380-392.
View/Download from: Publisher's site
View description>>
Over a decade ago Service Oriented Architecture (SOA) was introduced to provide better alignment between business requirements and IT solutions. During this period a great deal of research interest has emerged from academia and industry alike to promote this new style of software development. The promise was that SOA-based development would improve reusability, agility, platform independence and dynamic discovery, reconfiguration and change management. In spite of all the promises and enhancements in tools and technologies, service oriented software development continues to face various challenges, especially in Requirements Engineering. In this paper we present a qualitative study of Service Oriented Requirements Engineering. Data was collected by conducting interviews with practitioners from IT companies in Sydney who are experienced in working on SOA-based projects. The objective was to explore the issues and challenges faced during requirements analysis in service oriented software development. The results show that service oriented software development has not only inherited existing issues of traditional Requirements Engineering but has also introduced new challenges. The technology has become advanced in SOA, but the issues related to the organizational and business aspects of service oriented development need more attention to achieve the true benefits of this technology. © Springer-Verlag 2013.
Bano, M & Zowghi, D 2013, 'User involvement in software development and system success: a systematic literature review.', EASE, International Conference on Evaluation and Assessment in Software Engineering, ACM, Porto de Galinhas, Brazil, pp. 125-130.
View/Download from: Publisher's site
View description>>
Context: For the last four decades, involving users in the software development process has been claimed to have a positive impact on the success of that software. However, previous reviews on this topic have produced conflicting results. Objectives: Our aim is to present a review of user involvement in the software development process and investigate its relationship to software system success. Methods: For our exploration, we performed a Systematic Literature Review using the guidelines provided in the Evidence Based Software Engineering literature. Results: 87 relevant empirical studies were selected and reviewed that investigate various perspectives and concepts of user involvement in the software development process during the period 1980-2012. Among the 87 studies reviewed, 59 report that user involvement positively contributes to system success, 7 suggest a negative contribution and 21 are uncertain. Conclusions: Our results show an overall positive impact of user involvement on system success. They also suggest that the relationship between user involvement and system success is neither direct nor simple, and that it depends on many different factors and conditions surrounding systems development processes. Copyright 2013 ACM.
Bano, M, Zowghi, D & IEEE 2013, 'Users' Involvement in Requirements Engineering and System Success', 2013 IEEE THIRD INTERNATIONAL WORKSHOP ON EMPIRICAL REQUIREMENTS ENGINEERING (EMPIRE), IEEE International Requirements Engineering Conference, IEEE, Rio de Janeiro, Brazil, pp. 24-31.
View/Download from: Publisher's site
View description>>
Involving users in software development in general, and in Requirements Engineering (RE) in particular, has been considered for over three decades. It is axiomatically believed to contribute significantly to a successful system. However, not much attention has been paid to ascertain in which phases of software development life cycle involvement or participation of users is most beneficial. In this paper we present an investigation into the concept of users' involvement during RE activities and explore its relationship with system success. We have conducted a systematic literature review (SLR) using guidelines of Evidence Based Software Engineering. Our SLR identified 87 empirical studies from the period of 1980 to 2012. Only 13 studies focused specifically on investigating users' involvement in RE and 9 of these confirmed benefits of involving users in requirements analysis and 4 remain inconclusive. Effective involvement of users in RE may reduce the need for their more active involvement in the rest of software development. This paper also offers a checklist we have created from the identified factors of all 87 empirical studies that should be utilised for effective users' involvement in RE. © 2013 IEEE.
Baro, EN, Oberst, S, Lai, JCS & Evans, TA 2013, 'A signal processing method for extracting vibration signals due to ants' activities', 42nd International Congress and Exposition on Noise Control Engineering 2013, INTER-NOISE 2013: Noise Control for Quality of Life, International Congress and Exposition on Noise Control Engineering, Innsbruck, Austria, pp. 3631-3640.
View description>>
Many software algorithms have been developed to track ants by analysing recorded videos. On the other hand, the feasibility of using vibrations measured at the substrate to classify ants' behaviour has not been examined before. A method is developed to separate vibrations owing to ants' activities from the substrate's response through a filtering/de-convolution procedure. This involves estimating the frequency response of the substrate and applying wavelet analysis to the measured vibrations. A number of responses due to ants' behaviours have been observed: ants shaking, falling, carrying stones, walking, scratching/biting, tapping hind legs, grooming, and antennation/feeding. Vibrations produced by ants falling, carrying stones, walking and scratching/biting are measurable (i.e., above background noise levels). The proposed method is shown to be successful in classifying activities due to ants falling, ants carrying stones and, to a lesser extent, ants' scratching/biting. With further refinement, it seems feasible to use vibrations and the proposed algorithm to measure ants' behaviours in bioassays. Copyright © (2013) by Austrian Noise Abatement Association (OAL).
Behbood, V, Lu, J, Zhang, G & IEEE 2013, 'Text Categorization by Fuzzy Domain Adaptation', 2013 IEEE INTERNATIONAL CONFERENCE ON FUZZY SYSTEMS (FUZZ - IEEE 2013), IEEE International Conference on Fuzzy Systems, IEEE, Hyderabad, India, pp. 1841-1848.
View/Download from: Publisher's site
View description>>
Machine learning methods have attracted the attention of researchers in computational fields such as classification/categorization. However, these learning methods work under the assumption that the training and test data distributions are identical. In some real-world applications, the training data (from the source domain) and test data (from the target domain) come from different domains, and this may result in different data distributions. Moreover, the values of the features and/or labels of the data sets could be non-numeric and contain vague values. In this study, we propose a fuzzy domain adaptation method, which offers an effective way to deal with both issues. It utilizes the similarity concept to modify the target instances' labels, which were initially classified by a shift-unaware classifier. The proposed method is built on the given data and refines the labels. In this way it performs completely independently of the shift-unaware classifier. As an example of text categorization, the 20 Newsgroups data set is used in the experiments to validate the proposed method. The results, which are compared with those generated when using different baselines, demonstrate a significant improvement in accuracy. © 2013 IEEE.
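As a rough, hypothetical sketch of the label-refinement idea described in this abstract (not the paper's fuzzy method; the Gaussian similarity, `gamma`, and all names are assumptions for illustration), initial soft labels from a shift-unaware classifier can be smoothed toward similar target instances:

```python
import numpy as np

def refine_labels(X, y0, gamma=1.0, iters=10):
    """Refine initial (shift-unaware) soft labels y0 of target instances
    by similarity-weighted averaging over the target data itself."""
    # Gaussian similarity between target instances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    S = np.exp(-gamma * d2)
    S /= S.sum(axis=1, keepdims=True)   # row-normalise to a weighting matrix
    y = y0.astype(float).copy()
    for _ in range(iters):
        y = S @ y                        # each label moves toward similar neighbours
    return y

# Tight cluster near 0 with one presumed misclassified point, plus a far point
X = np.array([[0.0], [0.1], [0.2], [5.0]])
y0 = np.array([1.0, 1.0, 0.0, 0.0])
y = refine_labels(X, y0)
# the third point is pulled toward its cluster's dominant label;
# the isolated fourth point keeps its own label
```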
Benaben, F, Hussain, F & Pereira, E 2013, 'Track I: Collaborative platforms for sustainable logistics and transportation', 2013 7th IEEE International Conference on Digital Ecosystems and Technologies (DEST), 2013 7th IEEE International Conference on Digital Ecosystems and Technologies (DEST) - Complex Environment Engineering, IEEE.
View/Download from: Publisher's site
Berry, R, Edmonds, E & Johnston, A 2013, 'Representational systems with tangible and graphical elements', 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), IEEE, Adelaide, Australia, pp. 1-4.
View/Download from: Publisher's site
View description>>
This research centres on the development of a number of prototype interactive systems, each of which uses a tangible means of representation and manipulation of musical elements in musical composition. Data gathered through collaborative prototyping and user studies is analysed using grounded theory methods. The resultant contribution to knowledge includes theory, design criteria and guidelines specific to tangible representations of music. This knowledge will be useful for future design of systems that use tangible representations, particularly for making music. The prototypes themselves also serve as a form of knowledge and as creative works. © 2013 IEEE.
Biming Tian, Merrick, K, Shui Yu & Jiankun Hu 2013, 'A hierarchical PCA-based anomaly detection model', 2013 International Conference on Computing, Networking and Communications (ICNC), 2013 International Conference on Computing, Networking and Communications (ICNC 2013), IEEE, pp. 621-625.
View/Download from: Publisher's site
View description>>
A hierarchical intrusion detection model is proposed to detect both anomaly and misuse attacks. In order to further speed up training and testing, a PCA-based feature extraction algorithm is used to reduce the dimensionality of the data. A PCA-based algorithm is used to filter normal data out in the upper level. The experiment results show that PCA can reduce noise in the original data set and that the PCA-based algorithm can reach the desired performance. © 2013 IEEE.
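A minimal sketch of the kind of PCA-based filtering this abstract describes (not the authors' implementation; the reconstruction-error criterion, the threshold rule, and all names here are assumptions): records whose reconstruction error under the principal subspace of normal traffic is small are filtered out as normal, while large-error records are kept for further inspection.

```python
import numpy as np

def pca_fit(X, k):
    """Fit PCA on (presumed normal) data: returns the mean and top-k components."""
    mu = X.mean(axis=0)
    # Principal directions via SVD of the centered data (rows of Vt)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

def reconstruction_error(X, mu, W):
    """Squared residual after projecting onto the principal subspace."""
    Z = (X - mu) @ W.T          # reduced representation
    Xr = Z @ W + mu             # back-projection
    return ((X - Xr) ** 2).sum(axis=1)

# Toy data: "normal" points near a line, one test point far off it
rng = np.random.default_rng(0)
t = rng.normal(size=(200, 1))
normal = np.hstack([t, 2 * t]) + 0.01 * rng.normal(size=(200, 2))
mu, W = pca_fit(normal, k=1)

err = reconstruction_error(np.array([[1.0, 2.0], [2.0, -4.0]]), mu, W)
threshold = 3 * reconstruction_error(normal, mu, W).max()  # assumed rule of thumb
# the on-line point falls below the threshold (filtered as normal),
# the off-line point exceeds it (flagged as anomalous)
```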
Bommes, D, Lévy, B, Pietroni, N, Puppo, E, Silva, CT, Tarini, M & Zorin, D 2013, 'Quad-Mesh Generation and Processing: A Survey.', Comput. Graph. Forum, WILEY, pp. 51-76.
View/Download from: Publisher's site
View description>>
Triangle meshes have been nearly ubiquitous in computer graphics, and a large body of data structures and geometry processing algorithms based on them has been developed in the literature. At the same time, quadrilateral meshes, especially semi-regular ones, have advantages for many applications, and significant progress was made in quadrilateral mesh generation and processing during the last several years. In this survey we discuss the advantages and problems of techniques operating on quadrilateral meshes, including surface analysis and mesh quality, simplification, adaptive refinement, alignment with features, parametrisation and remeshing. © 2013 The Eurographics Association and John Wiley & Sons Ltd.
Bressan, N, James, A & McGregor, C 2013, 'Integration of drug dosing data with physiological data streams using a cloud computing paradigm', 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), IEEE, Osaka, JAPAN, pp. 4175-4178.
View/Download from: Publisher's site
Chen, H, Zhang, G, Lu, J & IEEE 2013, 'A Time-series-based Technology Intelligence Framework by Trend Prediction Functionality', 2013 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC 2013), IEEE International Conference on Systems, Man and Cybernetics, IEEE, Manchester, UK, pp. 3477-3482.
View/Download from: Publisher's site
View description>>
Technology Intelligence (TI) denotes the concept and applications that transform data hidden in patents or scientific literature into technical insight for technology development planning and strategy formulation. Although much effort has been put into technology trend analysis in existing research, the majority of the results are still obtained from expert opinions on the basis of historical trends presented by content-based Technology Intelligence tools. To improve this situation, this paper proposes a time-series-based framework for TI that enables the system to be more effective when dealing with trend prediction requirements. A time-series analysis module is first applied in the TI framework to process patent time series for technology trend prediction in a real sense, while at the same time overcoming the problem that predicting future data points' values is insufficient to support TI construction. Based on explicit patent attributes and unknown patterns learned from the historical data, the framework combines "trend" and "content" knowledge by analyzing both the time-related properties and the semantic attributes of patent data, to support technology development planning more efficiently and satisfactorily. A case study is presented to demonstrate the validity of the trend prediction functionality, which is the emphasis of the whole framework.
Cuzzocrea, A, Moussa, R & Xu, G 2013, 'OLAP*: Effectively and Efficiently Supporting Parallel OLAP over Big Data', Lecture Notes in Computer Science, International Conference on Model and Data Engineering, Springer Berlin Heidelberg, Amantea, Italy, pp. 38-49.
View/Download from: Publisher's site
View description>>
In this paper, we investigate solutions relying on data partitioning schemes for parallel building of OLAP data cubes, suitable to novel Big Data environments, and we propose the framework OLAP*, along with the associated benchmark TPC-H*d, a suitable transformation of the well-known data warehouse benchmark TPC-H. We demonstrate through performance measurements the efficiency of the proposed framework, developed on top of the ROLAP server Mondrian.
Deng, C, Ji, R, Liu, W, Tao, D & Gao, X 2013, 'Visual Reranking through Weakly Supervised Multi-graph Learning.', ICCV, IEEE International Conference on Computer Vision, IEEE Computer Society, Sydney, Australia, pp. 2600-2607.
View/Download from: Publisher's site
View description>>
Visual reranking has been widely deployed to refine the quality of conventional content-based image retrieval engines. The current trend lies in employing a crowd of retrieved results stemming from multiple feature modalities to boost the overall performance of visual reranking. However, a major challenge pertaining to current reranking methods is how to take full advantage of the complementary property of distinct feature modalities. Given a query image and one feature modality, a regular visual reranking framework treats the top-ranked images as pseudo positive instances, which are inevitably noisy, make it difficult to reveal this complementary property, and thus lead to inferior ranking performance. This paper proposes a novel image reranking approach by introducing a Co-Regularized Multi-Graph Learning (Co-RMGL) framework, in which the intra-graph and inter-graph constraints are simultaneously imposed to encode affinities in a single graph and consistency across different graphs. Moreover, weakly supervised learning driven by image attributes is performed to denoise the pseudo-labeled instances, thereby highlighting the unique strength of individual feature modalities. Meanwhile, such learning can yield a few anchors in graphs that vitally enable the alignment and fusion of multiple graphs. As a result, an edge weight matrix learned from the fused graph automatically gives the ordering of the initially retrieved results. We evaluate our approach on four benchmark image retrieval datasets, demonstrating a significant performance gain over the state of the art.
Deze Zeng, Song Guo, Stojmenovic, I & Shui Yu 2013, 'Stochastic modeling and analysis of opportunistic computing in intermittent mobile cloud', 2013 IEEE 8th Conference on Industrial Electronics and Applications (ICIEA), 2013 IEEE 8th Conference on Industrial Electronics and Applications (ICIEA 2013), IEEE, Swinburne Univ Technol, Melbourne, AUSTRALIA, pp. 1902-1907.
View/Download from: Publisher's site
Dong, H, Hussain, FK & Chang, E 2013, 'UCOSAIS: A Framework for User-Centered Online Service Advertising Information Search', WEB INFORMATION SYSTEMS ENGINEERING - WISE 2013, PT I, International Conference on Web Information Systems Engineering, Springer Verlag, Nanjing, China, pp. 267-276.
View/Download from: Publisher's site
View description>>
The emergence of Internet advertising brings an economic and efficient marketing means to small and medium enterprises in service industries. Every day, a massive amount of service advertising information is published over the Internet. Nevertheless, service consumers find it difficult to quickly and precisely retrieve their desired services. This problem is partly caused by the ubiquitous, heterogeneous, and ambiguous nature of online service advertising information. In this paper, we propose a systematic framework – UCOSAIS – for online service advertising information search. Inspired by the philosophy of user-centered design, this framework comprises an ontology-learning-based focused crawler for service information discovery and classification, a faceted semantic search component for service concept selection, and a user-click-based similarity computing component for service concept ranking adjustment.
Erfani, SS, Abedin, B & Daneshgar, F 2013, 'Investigating the impact of facebook use on cancer survivors' psychological well-being', 19th Americas Conference on Information Systems, AMCIS 2013 - Hyperconnected World: Anything, Anywhere, Anytime, Americas Conference on Information Systems, Association of information Systems, Chicago, USA, pp. 2184-2190.
View description>>
Rapid growth in the use of Social Network Sites (SNSs) by cancer survivors makes it important to examine whether there is a relationship between the use of these online communities and cancer survivors' psychological well-being. This article poses the question of how the use of Facebook, the most popular SNS, may impact cancer survivors' psychological well-being. To answer this question a comprehensive literature review of studies conducted in the information systems and health disciplines has been undertaken and a theoretical model is proposed. This study is expected to contribute to the existing knowledge base through the development of a new theoretical model which introduces and explains the ways that SNS use may impact cancer survivors' psychological well-being. It provides important information on health-related SNS use and is envisioned to assist health care organizations and cancer survivors to use SNSs as an e-health application. © (2013) by the AIS/ICIS Administrative Office. All rights reserved.
Erfani, SS, Abedin, B, Daneshgar, F & IEEE 2013, 'A qualitative evaluation of communication in Ovarian Cancer Facebook Communities', INTERNATIONAL CONFERENCE ON INFORMATION SOCIETY (I-SOCIETY 2013), International Conference on Information Society (I-Society), IEEE, Toronto, ON, Canada, pp. 270-272.
View description>>
The use of social network sites is increasing daily. While many health-related communities now exist on Facebook, the potential drawbacks of online communities remain a matter of debate. This study aimed to investigate the content of communication in Ovarian Cancer Facebook communities: (i) to determine the users of Ovarian Cancer Facebook communities; (ii) to understand the content of communication in these communities; and (iii) to examine the extent of exchange of incorrect health information and posting of negative feelings in patient support Facebook communities. To this end a content analysis technique was applied: the 10 largest Facebook communities related to Ovarian Cancer were selected, a thematic coding scheme was developed, and a random sample of the most recent wall posts and discussion topics was evaluated. Patients with Ovarian Cancer and their caretakers post queries and feedback related to personal health information, such as experiences of cancer management; they also provide emotional support, while potential disadvantages such as unconventional medical information and negative feelings were infrequent. © 2013 Infonomics Society.
Esteves, A, van den Hoven, E & Oakley, I 2013, 'Physical games or digital games?', Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction, TEI'13: Seventh International Conference on Tangible, Embedded, and Embodied Interaction, ACM, Barcelona, Spain, pp. 167-174.
View/Download from: Publisher's site
View description>>
This paper explores how different interfaces to a problem-solving task affect how users perform it. Specifically, it focuses on a customized version of the game of Four-in-a-row and compares play on a physical, tangible game board with that conducted in mouse- and touch-screen-driven virtual versions. This is achieved through a repeated measures study involving a total of 36 participants, which explicitly assesses aspects of cognitive work through measures of task time, subjective workload, the projection of mental constructs onto external structures and the occurrence of explanatory epistemic actions. The results highlight the relevance of projection and epistemic action to this problem-solving task and suggest that the different interface forms afford instantiation of these activities in different ways. The tangible version of the system supports the most rapid execution of these actions, and future work on this topic should explore the unique advantages of tangible interfaces in supporting epistemic actions.
Feng Tian, Hu, RQ, Yi Qian, Bo Rong, Bo Liu & Lin Gui 2013, 'Pure asynchronous neighbor discovery algorithms in ad hoc networks using directional antennas', 2013 IEEE Global Communications Conference (GLOBECOM), 2013 IEEE Global Communications Conference (GLOBECOM 2013), IEEE, Atlanta, GA, pp. 498-503.
View/Download from: Publisher's site
Fidge, C, Hogan, J & Lister, R 2013, 'What vs. how: Comparing students' testing and coding skills', Conferences in Research and Practice in Information Technology Series, Australasian Computing Education Conference, Australian Computer Society Inc, Adelaide, Australia, pp. 97-106.
View description>>
The well-known difficulties students exhibit when learning to program are often characterised as either difficulties in understanding the problem to be solved or difficulties in devising and coding a computational solution. It would therefore be helpful to understand which of these gives students the greatest trouble. Unit testing is a mainstay of large-scale software development and maintenance. A unit test suite serves not only for acceptance testing, but is also a form of requirements specification, as exemplified by agile programming methodologies in which the tests are developed before the corresponding program code. In order to better understand students' conceptual difficulties with programming, we conducted a series of experiments in which students were required to write both unit tests and program code for non-trivial problems. Their code and tests were then assessed separately for correctness and 'coverage', respectively. The results allowed us to directly compare students' abilities to characterise a computational problem, as a unit test suite, and develop a corresponding solution, as executable code. Since understanding a problem is a pre-requisite to solving it, we expected students' unit testing skills to be a strong predictor of their ability to successfully implement the corresponding program. Instead, however, we found that students' testing abilities lag well behind their coding skills.
Fitzgerald, S, Hanks, B, Lister, R, McCauley, R & Murphy, L 2013, 'What are we thinking when we grade programs?', Proceeding of the 44th ACM technical symposium on Computer science education, SIGCSE '13: The 44th ACM Technical Symposium on Computer Science Education, ACM, Denver, Colorado, USA, pp. 471-476.
View/Download from: Publisher's site
View description>>
This paper reports on a mixed methods study which examines how four experienced instructors approached the grading of a programming problem. Two instructors used a detailed, analytic approach and two instructors employed a holistic approach. One instructor exhibited elements of a primary trait approach. Even though the four instructors used different grading scales and philosophies, their raw scores were highly correlated (Spearman's rho of .81), supporting the conclusion that experienced instructors usually agree on whether a program is 'very good' or 'very bad'. Clearly there is no single right way to grade programs. Further discourse should be encouraged for the benefit of both educators and students.
Fu, B, Xu, G, Wang, Z & Cao, L 2013, 'Leveraging Supervised Label Dependency Propagation for Multi-label Learning', 2013 IEEE 13TH INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), IEEE International Conference on Data Mining, IEEE, Dallas, TX, USA, pp. 1061-1066.
View/Download from: Publisher's site
View description>>
Exploiting label dependency is a key challenge in multi-label learning, and current methods solve this problem mainly by training models on the combination of related labels and original features. However, label dependency cannot be exploited dynamically and mutually in this way. Therefore, we propose a novel paradigm of leveraging label dependency in an iterative way. Specifically, each label's prediction will be updated and also propagated to other labels via a random walk with restart process. Meanwhile, the label propagation is implemented as a supervised learning procedure via optimizing a loss function, thus more appropriate label dependency can be learned. Extensive experiments are conducted, and the results demonstrate that our method can achieve considerable improvements in terms of several evaluation metrics. © 2013 IEEE.
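The iterative propagation this abstract describes can be illustrated, very loosely, with a plain random-walk-with-restart update (a generic sketch, not the paper's supervised variant; the correlation matrix `S`, the restart weight `alpha`, and all names here are assumed for illustration):

```python
import numpy as np

def rwr_propagate(Y, S, alpha=0.8, iters=50):
    """Random walk with restart over labels.
    Y: (n_samples, n_labels) initial label scores from a base classifier.
    S: (n_labels, n_labels) row-stochastic label-correlation matrix.
    Each label's score mixes its initial prediction (the restart term)
    with scores propagated from correlated labels."""
    F = Y.copy()
    for _ in range(iters):
        F = alpha * Y + (1 - alpha) * F @ S.T
    return F

# Two labels assumed to co-occur strongly: a confident label 1
# should pull up the score of label 2 for the same sample
Y = np.array([[0.9, 0.1]])
S = np.array([[0.5, 0.5],
              [0.5, 0.5]])
F = rwr_propagate(Y, S)
# label 2's score rises above its initial prediction, label 1's drops slightly
```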
Gill, AQ 2013, 'Towards the Development of an Adaptive Enterprise Service System Model.', AMCIS, Americas Conference on Information Systems, Association for Information Systems, Chicago, IL, USA, pp. 1-9.
View description>>
The continuous adaptation of modern enterprises is largely dependent on their underlying adaptive enterprise architecture capability. However, the establishment of an adaptive enterprise architecture capability requires defining the enterprise context before actually commissioning any enterprise architecture or adaptation work. This paper presents the adaptive enterprise service system (AESS) model based on the Design Science research method and Theory Triangulation approach. The AESS integrates the enterprise context perspectives from three well-known theories of agility, (agent) system, and service science. The AESS model, as a part of the large adaptive enterprise architecture toolkit, defines a modern enterprise as an adaptive enterprise service system. The adaptive enterprise service system is a multi-agent system of service systems that exhibits agility and focuses on the emerging service-centric view as opposed to a traditional product-centric view. The service-centric view of an enterprise is critical for establishing the adaptive enterprise architecture capability for handling complex enterprise transformations.
Gluga, R, Kay, J & Lister, RF 2012, 'ProGoSs: Mastering the Curriculum', Australian Conference on Science and Mathematics Education (ACSME2012), Australian Conference on Science and Mathematics Education, UniServe Science, Sydney, Australia, pp. 92-98.
View description>>
In education, we need to design effective degree programs of study that meet authoritative curricula guidelines. This is challenging because of the size of the curriculum and complexity of degree program structures. When dealing with data of this size and complexity, traditional spreadsheets are a clumsy way of storing the data. A database is a better option, especially when the database is accessible over the web. We created ProGoSs to effectively tackle this complexity. ProGoSs is a web-based system that maps curricula learning goals and mastery levels to individual assessment tasks across entire degree programs. ProGoSs enables academics to answer important questions such as: Does our degree teach the essential core defined in a recommended curriculum? Where in our degree are particular parts of the recommended curriculum taught? Does our degree ensure a solid progression in building skills? Where and how do we assess the learning achieved by bare-pass students on particular parts of the recommended curriculum? We present the design and implementation of ProGoSs and report on its evaluation by mapping multiple programming subjects from multiple universities to the ACM/IEEE Computer Science 2013 topics and learning objectives. This includes a mapping to various levels of Bloom's Taxonomy to capture mastery.
Gluga, R, Kay, J, Lister, R, Simon, Charleston, M, Harland, J & Teague, D 2013, 'A conceptual model for reflecting on expected learning vs. demonstrated student performance', Conferences in Research and Practice in Information Technology Series, Australasian Computing Education Conference, Australian Computer Society Inc, Adelaide, Australia, pp. 77-86.
View description>>
Educators are faced with many challenging questions in designing an effective curriculum. What prerequisite knowledge do students have before commencing a new subject? At what level of mastery? What is the spread of capabilities between bare-passing students vs. the top-performing group? How does the intended learning specification compare to student performance at the end of a subject? In this paper we present a conceptual model that helps in answering some of these questions. It has the following main capabilities: capturing the learning specification in terms of syllabus topics and outcomes; capturing mastery levels to model progression; capturing the minimal vs. aspirational learning design; capturing confidence and reliability metrics for each of these mappings; and finally, comparing and reflecting on the learning specification against actual student performance. We present a web-based implementation of the model, and validate it by mapping the final exams from four programming subjects against the ACM/IEEE CS2013 topics and outcomes, using Bloom's Taxonomy as the mastery scale. We then import the itemised exam grades from 632 students across the four subjects and compare the demonstrated student performance against the expected learning for each of these. Key contributions of this work are the validated conceptual model for capturing and comparing expected learning vs. demonstrated performance, and a web-based implementation of this model, which is made freely available online as a community resource.
Guo, YG, Jin, JX, Zhu, JG & Lu, HY 2013, 'Performance Analysis of a Linear Motor with HTS Bulk Magnets for Driving a Prototype HTS Maglev Vehicle', Applied Mechanics and Materials, Linear Drives for Industry Applications, Trans Tech Publications, Ltd., Hangzhou, China, pp. 33-37.
View/Download from: Publisher's site
View description>>
This paper presents the performance analysis of a linear synchronous motor which employs high-temperature superconducting (HTS) bulk magnets on the mover and normal copper windings on the stator. The linear motor is designed to drive a prototype HTS maglev vehicle in which the mover is suspended by the levitation force between HTS bulks on the mover and permanent magnets on the ground. Finite element magnetic field analysis is conducted to calculate the major parameters of the linear motor and an equation is derived to calculate the electromagnetic thrust force. Theoretical calculations are verified by the measured results on the prototype.
Guo, YG, Zeng, JB, Zhu, JG, Lu, HY & Jin, JX 2013, 'B-H relations of magnetorheological fluid under 2-D rotating magnetic field excitation', 2013 IEEE International Conference on Applied Superconductivity and Electromagnetic Devices, 2013 IEEE International Conference on Applied Superconductivity and Electromagnetic Devices (ASEMD), IEEE, Beijing, China, pp. 94-97.
View/Download from: Publisher's site
View description>>
This paper presents the investigation of the B-H relations of a magnetorheological (MR) fluid under one-dimensional (1-D) alternating and two-dimensional (2-D) rotating magnetic field excitations, where B is magnetic flux density and H is magnetic field strength. The measurement is carried out by using a single sheet tester with an MR fluid sample. The measurement principle and structure of the testing system are described. The calibration of the B and H sensing coils is also reported. The relations between B and H on the MR fluid sample under 2-D rotating magnetic field excitations have been measured and compared with the results under 1-D excitations, showing that the B-H relations under 2-D excitations are significantly different from the 1-D case. These data would be useful for the design and analysis of MR smart structures such as MR dampers. © 2013 IEEE.
Zhang, G-L & Zuo, H 2013, 'Solution analysis of multi-objective programming problem', 2013 International Conference on Machine Learning and Cybernetics, 2013 International Conference on Machine Learning and Cybernetics (ICMLC), IEEE.
View/Download from: Publisher's site
Gupta, A, Kayal, N & Qiao, Y 2013, 'Random Arithmetic Formulas Can Be Reconstructed Efficiently', 2013 IEEE Conference on Computational Complexity, 2013 IEEE Conference on Computational Complexity (CCC), IEEE, Palo Alto, CA, pp. 1-9.
View/Download from: Publisher's site
Zhou, H, Liu, B, Luan, TH, Hou, F, Gui, L, Li, Y & Shen, X 2013, 'Throughput evaluation for cooperative drive-thru Internet using microscopic mobility model', 2013 IEEE Global Communications Conference (GLOBECOM), 2013 IEEE Global Communications Conference (GLOBECOM 2013), IEEE, Atlanta, GA, pp. 371-376.
View/Download from: Publisher's site
Hausen, D, Bakker, S, van den Hoven, E, Butz, A & Eggen, B 2013, 'Peripheral Interaction: Embedding HCI in Everyday Life', HUMAN-COMPUTER INTERACTION - INTERACT 2013, PT IV, International Federation for Information Processing Technical Committee 13 on Human-Computer Interaction, Springer, Cape Town, South Africa, pp. 782-782.
View description>>
The comparison of actions in the physical world with actions on interactive devices reveals a remarkable difference. In daily life we easily perform several tasks in parallel; for example, when drinking coffee while reading, the drinking may be in the periphery of our attention. In contrast, we usually have to focus our attention on each digital device we interact with. In recent years, the concept of interacting with computing technology in the background or periphery of the user's attention has been gaining traction. We call this direction Peripheral Interaction, and see it as a very promising approach to fluently embedding the increasing number of interactive devices into our everyday lives. The workshop is intended to encourage hands-on explorations and discussion about the definition of Peripheral Interaction, its design space and suitable evaluation strategies. Albrecht Schmidt will give a keynote entitled 'Creating Seamless Transitions between Central and Peripheral User Interfaces'. While the term Peripheral Interaction is not (yet) widely adopted, several design disciplines already address different aspects of its core ideas (e.g. ambient information systems, ubiquitous computing, implicit interaction, eyes-free interaction, calm technology). We want to sharpen the focus for Peripheral Interaction by offering a platform for exchange of knowledge and community building, to establish a network around Peripheral Interaction for further collaboration. This workshop invites researchers and practitioners from different disciplines (e.g. computer science, interaction design, interactive arts, psychology, cognitive science, product design and social science) to share their experiences with human-computer interaction for the everyday routine, and aims to lay the foundations for a structured exploration of the new interaction paradigm of Peripheral Interaction. More information about the workshop is available at the workshop's website www.peripheralinte...
Holland, BE, Brain, T & Mohamed Mowjoon, D 2013, 'Running before you can walk: creating blended learning in collaborative spaces', Proceedings of the 24th Annual Conference of the Australasian Association for Engineering Education - AAEE2013, AAEE - Annual Conference of Australasian Association for Engineering Education, Australasian Association for Engineering Education Conference (24th: 2013), Crowne Plaza Hotel, Gold Coast, Queensland, pp. 1-10.
View description>>
Many universities across Australia are undertaking significant works to upgrade online and face-to-face teaching technologies and campus environments. The University of Technology, Sydney is one of these institutions and, in 2014, the Faculty of Engineering and IT will occupy a new complex featuring a range of interactive and collaborative learning spaces. There is a growing body of literature evaluating the delivery of courses using online learning environments and collaborative learning spaces (e.g. Radcliffe et al. 2009; Rasmussen et al. 2012). This paper introduces the review of a senior engineering subject delivered in intensive block mode sessions as a case study for analysing student engagement and experience of interaction using new collaborative learning spaces. Through a post-delivery review of the subject, this paper assesses and evaluates the learning experience of students in a block mode subject delivered in new collaborative spaces. It analyses findings from two surveys across a range of indicators. Post-delivery review of the use of pilot spaces and the quality of the student experience of them, in combination with new approaches integrated with the online learning environment, can support and inform the transition to wider use of these spaces and innovation in teaching approaches in engineering. This is no small project in a field which has been characterised by an intensive lecture-based model of teaching and learning, and so stakeholders need to be 'enrolled' in its objectives and in how these can be aligned with their priorities, with development resourced to ensure success.
Homayounfard, H, Kennedy, PJ & Braun, R 2013, 'NARGES: Prediction Model for Informed Routing in a Communications Network', Lecture Notes in Computer Science, Pacific-Asia Conference on Knowledge Discovery and Data Mining, Springer Berlin Heidelberg, Gold Coast, Australia, pp. 327-338.
View/Download from: Publisher's site
View description>>
There is a dependency between packet-loss and the delay and jitter time-series derived from a telecommunication link. Multimedia applications such as Voice over IP (VoIP) are sensitive to loss, and packet recovery alone is not an efficient solution given the increasing number of Internet users. Predicting packet-loss from the network dynamics of past transmissions is crucial to inform the next generation of routers in making smart decisions. This paper proposes a hybrid data mining model for routing management in a communications network, called NARGES. The proposed model is designed and implemented for predicting packet-loss based on the forecasted delays and jitters. The model consists of two parts: a historical symbolic time-series approximation module, called HDAX, and a Multilayer Perceptron (MLP). It is validated with heterogeneous quality of service (QoS) datasets, namely delay, jitter and packet-loss time-series. The results show improved precision and quality of prediction compared to the autoregressive moving average (ARMA) model.
Howsawi, EM, Eager, DM, Bagia, R & Niebecker, KD 2013, 'Using video data in project management research', Australian Institute of Project Management (AIPM) National Conference 2013, Australian Institute of Project Management National Conference 2013, Australian Institute of Project Management, Perth, Australia, pp. 1-10.
View description>>
In project management research, on-site engagement is acknowledged as being good practice for gaining primary data and understanding the context of the projects being studied. However, it is not possible for researchers to be on site for every project they intend to research because projects can be difficult to access, or may be secret during the execution phase, or simply may have been completed a long time ago. Reading the project documents will provide a substantial amount of information, but there is always more to any project than written data alone, as project practitioners are well aware. Advances in technology since the beginning of the 20th century enable the filming of project works, and perhaps the main benefit of that filming is to document the process for documentary production. Since the camera can capture a wealth of detail and rich complexity that is impossible or very difficult to capture by other means, and since the eye and ear can acquire a great deal of information that is practically impossible to write down simultaneously, can the use of such video data be beneficial in project management research? This paper reports the experience of the authors in using video data in such research. More than 250 hours of video data have been examined in researching British aviation projects during the period of the Second World War. The benefits of, and guidance for, using video data are presented, as well as cautions about what may affect the successful use of video data.
Hu, L, Cao, J, Xu, G, Cao, L, Gu, Z & Zhu, C 2013, 'Personalized recommendation via cross-domain triadic factorization', Proceedings of the 22nd international conference on World Wide Web, WWW '13: 22nd International World Wide Web Conference, ACM, Rio de Janeiro, Brazil, pp. 595-605.
View/Download from: Publisher's site
View description>>
Collaborative filtering (CF) is a major technique in recommender systems to help users find their potentially desired items. Since the data sparsity problem is quite commonly encountered in real-world scenarios, Cross-Domain Collaborative Filtering (CDCF) has become an emerging research topic in recent years. However, due to the lack of sufficiently dense explicit feedback, and even the absence of any feedback in users' uninvolved domains, current CDCF approaches may not perform satisfactorily in user preference prediction. In this paper, we propose a generalized Cross-Domain Triadic Factorization (CDTF) model over the triadic relation user-item-domain, which can better capture the interactions between domain-specific user factors and item factors. In particular, we devise two CDTF algorithms to leverage user explicit and implicit feedback respectively, along with a genetic-algorithm-based weight parameter tuning algorithm to trade off influence among domains optimally. Finally, we conduct experiments to evaluate our models and compare them with other state-of-the-art models using two real-world datasets. The results show the superiority of our models against the comparative models. Copyright is held by the International World Wide Web Conference Committee (IW3C2).
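The CDTF abstract builds on standard matrix factorization by adding a domain dimension; the triadic model itself is not reproduced here. A minimal single-domain MF baseline trained by stochastic gradient descent, with illustrative data and hyperparameters only:

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy observed ratings as (user, item, rating) triples -- illustrative data only.
ratings = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (2, 1, 1.0), (2, 2, 2.0)]
n_users, n_items, k = 3, 3, 2
P = 0.1 * rng.standard_normal((n_users, k))   # latent user factors
Q = 0.1 * rng.standard_normal((n_items, k))   # latent item factors

lr, reg = 0.05, 0.01
for _ in range(500):                          # SGD over the observed ratings
    for u, i, r in ratings:
        e = r - P[u] @ Q[i]                   # prediction error on this rating
        P[u] += lr * (e * Q[i] - reg * P[u])  # gradient step with L2 shrinkage
        Q[i] += lr * (e * P[u] - reg * Q[i])

# After training, P[u] @ Q[i] approximates each observed rating.
```

CDTF's extension replaces the single pair of factor matrices with domain-specific interactions learned over user-item-domain triples.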
Hu, L, Cao, J, Xu, G, Wang, J, Gu, Z & Cao, L 2013, 'Cross-domain collaborative filtering via bilinear multilevel analysis', IJCAI International Joint Conference on Artificial Intelligence, International Joint Conference on Artificial Intelligence, IJCAI/AAAI, Beijing, China, pp. 2626-2632.
View description>>
Cross-domain collaborative filtering (CDCF), which aims to leverage data from multiple domains to relieve the data sparsity issue, has become an emerging research topic in recent years. However, current CDCF methods mainly consider user and item factors but largely neglect the heterogeneity of domains, which may lead to improper knowledge transfer. To address this problem, we propose a novel CDCF model, Bilinear Multilevel Analysis (BLMA), which seamlessly introduces multilevel analysis theory to the most successful collaborative filtering method, matrix factorization (MF). Specifically, we employ BLMA to more efficiently address the determinants of ratings from a hierarchical view by jointly considering domain, community, and user effects so as to overcome the issues caused by traditional MF approaches. Moreover, a parallel Gibbs sampler is provided to learn these effects. Finally, experiments conducted on a real-world dataset demonstrate the superiority of BLMA over other state-of-the-art methods.
Huang, C-S, Lin, C-L, Ko, L-W, Liu, S-Y, Sua, T-P & Lin, C-T 2013, 'A hierarchical classification system for sleep stage scoring via forehead EEG signals', 2013 IEEE Symposium on Computational Intelligence, Cognitive Algorithms, Mind, and Brain (CCMB), 2013 IEEE Symposium on Computational Intelligence, Cognitive Algorithms, Mind, and Brain (CCMB), IEEE, Singapore, pp. 1-5.
View/Download from: Publisher's site
View description>>
The study adopts a hierarchical classification structure to develop an automatic sleep stage classification system using forehead (Fp1 and Fp2) EEG signals. The hierarchical classification consists of a preliminary wake detection rule, a novel feature extraction method based on the American Academy of Sleep Medicine (AASM) scoring manual, feature selection methods and an SVM. After estimating the preliminary sleep stages, two adaptive adjustment schemes are applied to adjust the preliminary sleep stages and provide the final estimation of sleep stages. Clinical testing reveals that the proposed automatic sleep stage classification system achieves about 77% accuracy and 67% kappa for 10 individual normal subjects. This system offers the possibility of long-term sleep monitoring at home and provides a preliminary result of sleep stages so that a doctor can decide whether a patient needs a detailed diagnosis using a polysomnography (PSG) system in a hospital sleep laboratory. © 2013 IEEE.
Huang, CS, Lin, CL, Ko, LW, Wang, YK, Liang, JW & Lin, CT 2013, 'Automatic Sleep Stage Classification GUI with a Portable EEG Device', International Conference on Human-Computer Interaction, Springer, Las Vegas, NV, USA, pp. 613-617.
View/Download from: Publisher's site
View description>>
In this study, an automatic sleep stage classification system with a portable EEG recording device (Mindo-4s) is implemented as a Java-based sleep graphical user interface (GUI) on the Android platform. First, the parameters of the sleep stage classification system, including effective sleep feature extraction and a hierarchical classification structure consisting of a preliminary wake detection rule, an adaptive adjustment scheme and a support vector machine, were trained in MATLAB on our existing sleep database, which was collected using polysomnography (PSG). The classification system was then re-implemented in Java, and the corresponding Java-based sleep GUI software runs on the Android platform with the Mindo-4s. The connection between the Java-based sleep GUI software and the portable Mindo-4s is through Bluetooth communication. The Java-based sleep GUI reaches 72.43% average accuracy compared with manual scoring. It can display, record and analyze the forehead EEG signals online and simultaneously. After sleep, the user receives a complete sleep report, including sleep efficiency and sleep stage distribution, from the Java-based sleep GUI. Thus, this system can provide a preliminary estimate of sleep quality and help the sleep doctor decide whether someone needs a complete PSG test in hospital. This system is more convenient for long-term, home-based daily care than traditional PSG measurement.
Huang, C-S, Lin, C-L, Ko, L-W, Wang, Y-K, Liang, J-W & Lin, C-T 2013, 'Automatic Sleep Stage Classification GUI with a Portable EEG Device', Communications in Computer and Information Science, Springer Berlin Heidelberg, pp. 613-617.
View/Download from: Publisher's site
View description>>
In this study, an automatic sleep stage classification system with a portable EEG recording device (Mindo-4s) is implemented as a Java-based sleep graphical user interface (GUI) on the Android platform. First, the parameters of the sleep stage classification system, including effective sleep feature extraction and a hierarchical classification structure consisting of a preliminary wake detection rule, an adaptive adjustment scheme and a support vector machine, were trained in MATLAB on our existing sleep database, which was collected using polysomnography (PSG). The classification system was then re-implemented in Java, and the corresponding Java-based sleep GUI software runs on the Android platform with the Mindo-4s. The connection between the Java-based sleep GUI software and the portable Mindo-4s is through Bluetooth communication. The Java-based sleep GUI reaches 72.43% average accuracy compared with manual scoring. It can display, record and analyze the forehead EEG signals online and simultaneously. After sleep, the user receives a complete sleep report, including sleep efficiency and sleep stage distribution, from the Java-based sleep GUI. Thus, this system can provide a preliminary estimate of sleep quality and help the sleep doctor decide whether someone needs a complete PSG test in hospital. This system is more convenient for long-term, home-based daily care than traditional PSG measurement. © Springer-Verlag Berlin Heidelberg 2013.
Huang, C-S, Lin, C-L, Yang, W-Y, Ko, L-W, Liu, S-Y & Lin, C-T 2013, 'Applying the fuzzy c-means based dimension reduction to improve the sleep classification system', 2013 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), 2013 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), IEEE, Hyderabad, India.
View/Download from: Publisher's site
View description>>
Good sleep quality is an important factor in daily life. The evaluation of sleep stages has become an important issue because the distribution of sleep stages across a whole night relates to sleep quality. This study proposes a sleep classification system, consisting of a preliminary wake detection rule, sleep feature extraction, fuzzy c-means based dimension reduction, a support vector machine with a radial basis function kernel, and an adaptive adjustment scheme, using only FP1 and FP2 electroencephalography. Compared with the results from the sleep technologist, the average accuracy and Kappa coefficient of the proposed sleep classification system are 70.92% and 0.6130, respectively, for 10 individual normal subjects. Thus, the proposed sleep classification system could provide a preliminary report of sleep stages to assist doctors in deciding whether a patient needs detailed testing in a sleep laboratory. © 2013 IEEE.
Islam, MR, Guo, YG, Zhu, JG, Lu, HY & Jin, JX 2013, 'Medium-frequency-link power conversion for high power density renewable energy systems', 2013 IEEE International Conference on Applied Superconductivity and Electromagnetic Devices, 2013 IEEE International Conference on Applied Superconductivity and Electromagnetic Devices (ASEMD), IEEE, Beijing, China, pp. 102-106.
View/Download from: Publisher's site
View description>>
Recent advances in solid-state semiconductors and magnetic materials have provided the impetus for medium-frequency-link based medium voltage power conversion systems, which would be a possible solution for reducing the weight and volume of renewable power generation systems. To verify this new concept, a laboratory prototype of a 1.26 kVA medium-frequency-link power conversion system is developed in this paper for scaled-down 1 kV grid applications. The design and implementation of the prototype, the test platform, and the experimental results are analyzed and discussed. The proposed new technology is expected to have great potential for future renewable and smart grid applications.
Jiang, J, Lu, J, Zhang, G & Long, G 2013, 'Optimal Cloud Resource Auto-Scaling for Web Applications', PROCEEDINGS OF THE 2013 13TH IEEE/ACM INTERNATIONAL SYMPOSIUM ON CLUSTER, CLOUD AND GRID COMPUTING (CCGRID 2013), IEEE, pp. 58-65.
View/Download from: Publisher's site
Johnston, A & Walsh, L 2013, 'Sound stream', Proceedings of the 9th ACM Conference on Creativity & Cognition, C&C '13: Creativity and Cognition 2013, ACM, University of Technology, Sydney, Australia, pp. 399-400.
View/Download from: Publisher's site
Kajdanowicz, T, Michalski, R, Musial, K & Kazienko, P 2013, 'Active learning and inference method for within network classification', Proceedings of the 2013 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining, ASONAM '13: Advances in Social Networks Analysis and Mining 2013, ACM, Niagara Falls, Canada, pp. 1299-1306.
View/Download from: Publisher's site
Kamaleswaran, R, Thommandram, A, Zhou, Q, Eklund, M, Cao, Y, Wang, WP & McGregor, C 2013, 'Cloud framework for real-time synchronous physiological streams to support rural and remote Critical Care', Proceedings - IEEE Symposium on Computer-Based Medical Systems, pp. 473-476.
View description>>
We present a method for transmission and processing of real-time trans-continental medical data streams. We apply fundamentals of existing network technologies to create a secure tunnel from a remote hospital through an open network to the Artemis Cloud. We capture and store the incoming 1 Hz data stream in our real-time event stream processor to allow online real-time monitoring of patient status. The contributions of this paper extend the Critical Care as a Service paradigm by incorporating remote monitoring centers. The results establish the feasibility of the system to support real-time monitoring. However, existing protocols required significant optimization to account for variability in throughput and availability of the network. © 2013 IEEE.
Kamaleswaran, R, Thommandram, A, Zhou, Q, Eklund, M, Cao, Y, Wang, WP & McGregor, C 2013, 'Cloud framework for real-time synchronous physiological streams to support rural and remote Critical Care', Proceedings of the 26th IEEE International Symposium on Computer-Based Medical Systems, 2013 IEEE 26th International Symposium on Computer-Based Medical Systems (CBMS), IEEE, pp. 473-476.
View/Download from: Publisher's site
View description>>
We present a method for transmission and processing of real-time trans-continental medical data streams. We apply fundamentals of existing network technologies to create a secure tunnel from a remote hospital through an open network to the Artemis Cloud. We capture and store the incoming 1 Hz data stream in our real-time event stream processor to allow online real-time monitoring of patient status. The contributions of this paper extend the Critical Care as a Service paradigm by incorporating remote monitoring centers. The results establish the feasibility of the system to support real-time monitoring. However, existing protocols required significant optimization to account for variability in throughput and availability of the network. © 2013 IEEE.
Kulkarni, R, Qiao, Y & Sun, X 2013, 'Any Monotone Property of 3-Uniform Hypergraphs Is Weakly Evasive', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Theory and Applications of Models of Computation, Springer Berlin Heidelberg, Hong Kong, pp. 224-235.
View/Download from: Publisher's site
View description>>
For a Boolean function f, let D(f) denote its deterministic decision tree complexity, i.e., the minimum number of (adaptive) queries required in the worst case to determine f. In a classic paper, Rivest and Vuillemin [18] show that any non-constant monotone property P: {0,1}^(n choose 2) → {0,1} of n-vertex graphs has D(P) = Ω(n^2). We extend their result to 3-uniform hypergraphs. In particular, we show that any non-constant monotone property P: {0,1}^(n choose 3) → {0,1} of n-vertex 3-uniform hypergraphs has D(P) = Ω(n^3). Our proof combines the combinatorial approach of Rivest and Vuillemin with the topological approach of Kahn, Saks, and Sturtevant. Interestingly, our proof makes use of Vinogradov's Theorem (weak Goldbach Conjecture), inspired by its recent use by Babai et al. [1] in the context of the topological approach. Our work leaves the generalization to k-uniform hypergraphs as an intriguing open question. © Springer-Verlag Berlin Heidelberg 2013.
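The quantity D(f) in the abstract above can be computed by brute force for tiny functions, which makes the definition concrete; this is a minimal illustrative sketch (function names are my own), not the paper's method:

```python
from itertools import product

def decision_tree_depth(f, free_vars, assignment):
    """Minimum depth of an adaptive query tree that determines f on every
    input consistent with the current partial assignment."""
    # If f is constant over all consistent completions, no query is needed.
    values = {f({**assignment, **dict(zip(free_vars, bits))})
              for bits in product([0, 1], repeat=len(free_vars))}
    if len(values) == 1:
        return 0
    # Otherwise query the variable minimising the worst-case remaining depth.
    best = len(free_vars)
    for i, v in enumerate(free_vars):
        rest = free_vars[:i] + free_vars[i + 1:]
        worst = max(decision_tree_depth(f, rest, {**assignment, v: b})
                    for b in (0, 1))
        best = min(best, 1 + worst)
    return best

def D(f, n):
    return decision_tree_depth(f, list(range(n)), {})

# OR on 3 variables is evasive: every adaptive strategy needs all 3 queries.
print(D(lambda a: int(any(a.values())), 3))  # → 3
```

The exponential search is only feasible for a handful of variables; the paper's interest is in asymptotic lower bounds, not computation.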
Wang, K, Yue, Y & Liu, B 2013, 'DAS: A dynamic assignment scheduling algorithm for stream computing in distributed applications', 2013 IEEE Global Communications Conference (GLOBECOM), 2013 IEEE Global Communications Conference (GLOBECOM 2013), IEEE, Atlanta, GA, pp. 1632-1637.
View/Download from: Publisher's site
Li, F, Xu, G, Cao, L, Fan, X & Niu, Z 2013, 'CGMF: Coupled Group-Based Matrix Factorization for Recommender System', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), International Conference on Web Information Systems Engineering, Springer Berlin Heidelberg, Nanjing, China, pp. 189-198.
View/Download from: Publisher's site
View description>>
With the advent of social influence, social recommender systems have become an active research topic for making recommendations based on the ratings of the users that have close social relations with a given user. The underlying assumption is that a user's taste is similar to that of his/her friends in social networking. In fact, users enjoy different groups of items with different preferences. A user may be trusted by his/her friends on some specific groups rather than on all groups. Unfortunately, most extant social recommender systems are not able to differentiate a user's social influence in different groups, resulting in unsatisfactory recommendations. Moreover, most extant systems mainly rely on social relations, but overlook the influence of relations between items. In this paper, we propose an innovative coupled group-based matrix factorization model for recommender systems by leveraging the user and item groups learned by topic modeling and incorporating couplings between users and items and within users and items. Experiments conducted on publicly available data sets demonstrate the effectiveness of our approach. © 2013 Springer-Verlag.
Li, J & Tao, D 2013, 'A Bayesian factorised covariance model for image analysis', IJCAI International Joint Conference on Artificial Intelligence, International Joint Conference on Artificial Intelligence, IJCAI/AAAI, Beijing, China, pp. 1465-1471.
View description>>
This paper presents a specialised Bayesian model for analysing the covariance of data that are observed in the form of matrices, which is particularly suitable for images. Compared to existing general-purpose covariance learning techniques, we exploit the fact that the variables are organised as an array with two sets of ordered indexes, which induces an innate relationship between the variables. Specifically, we adopt a factorised structure for the covariance matrix. The covariance of two variables is represented by the product of the covariance of the two corresponding rows and that of the two columns. The factors, i.e. the row-wise and column-wise covariance matrices, are estimated by Bayesian inference with sparse priors. An empirical study has been conducted on image analysis. The model first learns correlations between the rows and columns in an image plane. Then the correlations between individual pixels can be inferred from their locations. This scheme utilises the structural information of an image, and benefits the analysis when the data are damaged or insufficient. © 2013 IEEE.
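The factorised covariance structure described above is a Kronecker product of the column-wise and row-wise factors; the Bayesian estimation with sparse priors is not reproduced here, only the structural identity, with arbitrary illustrative factors:

```python
import numpy as np

rng = np.random.default_rng(0)
r, c = 4, 3                                   # image rows and columns

# Illustrative row-wise and column-wise covariance factors (assumed known
# here; the paper estimates them by Bayesian inference with sparse priors).
A = rng.standard_normal((r, r)); row_cov = A @ A.T
B = rng.standard_normal((c, c)); col_cov = B @ B.T

# Factorised full covariance of the vectorised image (column-major vec):
full_cov = np.kron(col_cov, row_cov)          # shape (r*c, r*c)

# Covariance of pixels (i1, j1) and (i2, j2) equals the product of the
# corresponding row covariance and column covariance.
i1, j1, i2, j2 = 0, 1, 2, 2
lhs = full_cov[j1 * r + i1, j2 * r + i2]
rhs = row_cov[i1, i2] * col_cov[j1, j2]
assert np.isclose(lhs, rhs)
```

This is why the factorisation is economical: r*c pixels need only an r×r and a c×c factor instead of an (r*c)×(r*c) matrix.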
Li, L, Chen, X & Xu, G 2013, 'Suggestions for Fresh Search Queries by Mining Microblog Topics', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), International Workshop on Behavior and Social Informatics, Springer International Publishing, Gold Coast, QLD, Australia, pp. 214-223.
View/Download from: Publisher's site
View description>>
Query suggestion in Web search has been an effective approach to help users quickly express their information need and more accurately get the information they need. All major web-search engines and most proposed query suggestion methods rely on search engine query logs to determine possible query suggestions. However, for search systems it is much more difficult to effectively suggest relevant queries for a fresh search query which has no or few historical evidences in query logs. In this paper, we propose a suggestion approach for fresh queries by mining a new social network medium, i.e., microblog topics. We leverage the comment information in the microblog topics to mine potential suggestions. We utilize word frequency statistics to extract a set of ordered candidate words. As soon as a user starts typing a query word, words that match the partial user query word are selected as completions of the partial query word and are offered as query suggestions. We collect a dataset from Sina microblog topics and compare the final results when selecting different suggestion context sources. The experimental results clearly demonstrate the effectiveness of our approach in suggesting queries with high quality. Our conclusion is that a suggestion context source in which a topic consists of the tweets from authenticated Sina users is more effective than one using the tweets from all Sina users. © Springer International Publishing Switzerland 2013.
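The frequency-ranked prefix completion described in the abstract can be sketched in a few lines; the paper's candidate-word extraction is only summarised at a high level, so this uses plain word-frequency counts, and all data and names are illustrative:

```python
from collections import Counter
import re

# Toy stand-in for comment text mined from microblog topics (the paper uses
# Sina microblog data; these strings are illustrative only).
comments = [
    "new phone battery life review",
    "phone battery drains fast",
    "battery life tips for new phone",
]

# Word frequency statistics over the chosen suggestion context source.
counts = Counter(w for c in comments for w in re.findall(r"[a-z]+", c.lower()))

def suggest(partial, k=3):
    """Complete a partially typed query word with the k most frequent matches."""
    matches = [(w, n) for w, n in counts.items() if w.startswith(partial)]
    matches.sort(key=lambda x: (-x[1], x[0]))  # frequency desc, then alphabetic
    return [w for w, _ in matches[:k]]

print(suggest("ba"))  # → ['battery']
```

Restricting `comments` to authenticated users' tweets, as the paper concludes, changes only which strings feed the counter.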
Li, X, Lai, CF, Yu, S, Atiquzzaman, M, Di Natale, M, Papadopoulos, GA, Yang, LT & Wu, Z 2013, 'Message from ICESS2013 Chairs', 2013 IEEE 16th International Conference on Computational Science and Engineering, 2013 IEEE 16th International Conference on Computational Science and Engineering (CSE), IEEE, p. 27.
View/Download from: Publisher's site
Li, X, Zhang, L, Chen, E, Zong, Y & Xu, G 2013, 'Mining Frequent Patterns in Print Logs with Semantically Alternative Labels', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), International Conference on Advanced Data Mining and Applications, Springer Berlin Heidelberg, Hangzhou, pp. 107-119.
View/Download from: Publisher's site
View description>>
It is common today for users to print informative content from webpages due to the popularity of printers and the internet. Thus, many web printing tools such as Smart Print and PrintUI have been developed for online printing. In order to improve users' printing experience, the interaction data between users and these tools are collected to form so-called print log data, where each record is the set of urls selected for printing by a user within a certain period of time. Mining frequent patterns from these print log data can capture user intentions for other applications, such as printing recommendation and behavior targeting. However, mining frequent patterns by directly using the url as the item representation in print log data faces two challenges: data sparsity and pattern interpretability. To tackle these challenges, we leverage the Delicious API (a social bookmarking web service) as an external thesaurus to expand the semantics of each url by selecting tags associated with the domain of each url. In this setting, frequent pattern mining is employed on the tag representation of each url rather than the url or domain representation. With the enhancement of the semantically alternative tag representation, the semantics of each url is substantially improved, yielding useful frequent patterns. To this end, in this paper we propose a novel pattern mining problem, namely mining frequent patterns with semantically alternative labels, and propose an efficient algorithm named PaSAL (Frequent Patterns with Semantically Alternative Labels Mining Algorithm) for this problem. Specifically, we propose a new constraint named the conflict matrix to purify redundant patterns and achieve high efficiency. Finally, we evaluate the proposed algorithm on real print log data. © 2013 Springer-Verlag.
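The re-representation step above, replacing raw urls with tag sets before counting frequent itemsets, can be sketched with naive pair counting; the PaSAL algorithm and its conflict-matrix constraint are not reproduced, and the url-to-tag mapping is illustrative (the paper queries the Delicious API):

```python
from collections import Counter
from itertools import combinations

# Hypothetical url -> tag expansion; in the paper these tags come from the
# Delicious API for each url's domain.
url_tags = {
    "news.example.com/a1": {"news", "politics"},
    "news.example.com/a2": {"news", "sports"},
    "recipes.example.com/r1": {"cooking", "food"},
}

# A print log: each record is the set of urls a user printed in one session.
print_log = [
    {"news.example.com/a1", "news.example.com/a2"},
    {"news.example.com/a1", "recipes.example.com/r1"},
    {"news.example.com/a2", "recipes.example.com/r1"},
]

def frequent_tag_patterns(log, min_support=2, size=2):
    """Count tag itemsets over the tag representation of each record."""
    counts = Counter()
    for record in log:
        tags = sorted(set().union(*(url_tags[u] for u in record)))
        for itemset in combinations(tags, size):
            counts[itemset] += 1
    return {p for p, n in counts.items() if n >= min_support}

print(frequent_tag_patterns(print_log))
```

Because distinct urls share tags, patterns reach minimum support far sooner than in the sparse raw-url representation, which is the motivation the abstract gives.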
Li, Y, Liu, B, Rong, B, Wu, Y, Gagnon, G, Gui, L & Zhang, W 2013, 'Rate-compatible LDPC-RS product codes based on raptor-like LDPC codes', 2013 IEEE International Symposium on Broadband Multimedia Systems and Broadcasting (BMSB), 2013 IEEE International Symposium on Broadband Multimedia Systems and Broadcasting (BMSB), IEEE.
View/Download from: Publisher's site
View description>>
In this paper, we propose a novel rate-compatible LDPC-RS product code structure which combines Raptor-like LDPC codes and RS codes in a product way. While maintaining the advantages of traditional LDPC-RS product codes, such as combating different types of errors and offering various decoding solutions, the new code structure gains rate-compatibility thanks to the use of a Raptor-like LDPC code as one of its component codes. To make full use of its advantages, we derive a hybrid, iterative and adaptive decoding scheme for this code structure. We show that, on the one hand, this decoding scheme organizes LDPC soft-decision decoding, LDPC hard-decision decoding and RS hard-decision decoding in a proper way which improves error performance and reduces complexity; on the other hand, considering rate-compatibility, this scheme implements 'partial decoding', which adapts the code rate to the channel condition and further lowers the decoding complexity and system latency. These rate-compatible LDPC-RS product codes offer a potential solution for channel coding in next-generation broadcasting systems. © 2013 IEEE.
Liang, J, Simoff, S, Nguyen, QV & Huang, ML 1970, 'Visualizing large trees with divide & conquer partition', Proceedings of the 6th International Symposium on Visual Information Communication and Interaction, VINCI '13: The 6th International Symposium on Visual Information Communication and Interaction, ACM, Tianjin, China, pp. 79-87.
View/Download from: Publisher's site
View description>>
While prior work on the enclosure approach guarantees the space utilization of a single geometric area, mostly a rectangle, this paper proposes a flexible enclosure tree layout method for partitioning various polygonal shapes that breaks through the limitation of the rectangular constraint. Similar to Treemap techniques, it uses enclosure to divide display space into smaller areas for its sub-hierarchies. The algorithm can partition a polygonal shape, or even an arbitrary shape, into smaller polygons, rotated rectangles or vertical-horizontal rectangles. The proposed method and implementation algorithms provide an effective interactive visualization tool for partitioning large hierarchical structures within a confined display area with different shapes for real-time applications. We demonstrated the effectiveness of the new method with a case study, an automated evaluation and a usability study. © 2013 ACM.
Lin, C-T, Prasad, M & Chang, J-Y 1970, 'Designing mamdani type fuzzy rule using a collaborative FCM scheme', 2013 International Conference on Fuzzy Theory and Its Applications (iFUZZY), 2013 International Conference on Fuzzy Theory and Its Applications (iFUZZY), IEEE, Taipei, TAIWAN, pp. 279-282.
View/Download from: Publisher's site
View description>>
This paper presents a new approach for generating fuzzy rules for a fuzzy inference system by using collaborative fuzzy c-means (CFCM). In order to perform any mode of integration between datasets, there is a need to define the common features between datasets through some kind of collaborative process, while preserving privacy and security at a high level. This collaboration process yields a common structure between datasets, which helps to define an appropriate number of rules for structural learning and also improves the accuracy of the system modeling. These considerations together bring about the concept of a collaborative fuzzy rule generation process with a quality measure. © 2013 IEEE.
Lin, C-T, Wang, Y-K, Fan, J-W & Chen, S-A 1970, 'The influence of acute stress on brain dynamics', 2013 IEEE Symposium on Computational Intelligence, Cognitive Algorithms, Mind, and Brain (CCMB), 2013 IEEE Symposium on Computational Intelligence, Cognitive Algorithms, Mind, and Brain (CCMB), IEEE, Singapore, pp. 6-10.
View/Download from: Publisher's site
View description>>
Living under high stress may be unhealthy. This study explores electroencephalography (EEG) correlates of stressful circumstances by using the task-switching paradigm with feedback information. According to the behavioral and physiological evidence, acute stress created by this paradigm affected the performance of participants. Under stress, the participants responded quickly and inaccurately. The EEG correlates of acute stress were found in the frontal midline cortex, especially in the theta and alpha bands. These specific factors are considered important features for detecting the influence of stress by applying various machine-learning methods and neuro-fuzzy systems. This comprehensive study can provide knowledge for studying stress and designing Brain-Computer Interface (BCI) systems in the future. © 2013 IEEE.
Linares Mustaros, S, Gil Lafuente, AM, Ferrer Comalat, JC & Merigo, JM 1970, 'A MODEL FOR THE GENERALIZATION OF THE FORGOTTEN EFFECTS', DECISION MAKING SYSTEMS IN BUSINESS ADMINISTRATION, International Conference on Modeling and Simulation in Engineering, Economics and Management for Sustainable Development, WORLD SCIENTIFIC PUBL CO PTE LTD, Rio de Janeiro, BRAZIL, pp. 495-508.
Linares-Mustarós, S, Merigó, JM & Ferrer-Comalat, JC 1970, 'PEV: A Computer Program for Fuzzy Sales Forecasting', MODELING AND SIMULATION IN ENGINEERING, ECONOMICS, AND MANAGEMENT, International Conference on Modeling and Simulation in Engineering, Economics, and Management, Springer Berlin Heidelberg, Castellon de la Plana, SPAIN, pp. 200-209.
View/Download from: Publisher's site
Liu, C, Chen, L & Zhang, C 1970, 'Mining Probabilistic Representative Frequent Patterns From Uncertain Data', Proceedings of the 2013 SIAM International Conference on Data Mining, Proceedings of the 2013 SIAM International Conference on Data Mining, Society for Industrial and Applied Mathematics, Austin, Texas, USA, pp. 73-81.
View/Download from: Publisher's site
View description>>
Copyright © SIAM. Probabilistic frequent pattern mining over uncertain data has received a great deal of attention recently due to the wide applications of uncertain data. Similar to its counterpart in deterministic databases, however, probabilistic frequent pattern mining suffers from the same problem of generating an exponential number of result patterns. The large number of discovered patterns hinders further evaluation and analysis, and calls for the need to find a small number of representative patterns to approximate all other patterns. This paper formally defines the problem of probabilistic representative frequent pattern (P-RFP) mining, which aims to find the minimal set of patterns with sufficiently high probability to represent all other patterns. The problem's bottleneck turns out to be checking whether a pattern can probabilistically represent another, which involves the computation of a joint probability of supports of two patterns. To address the problem, we propose a novel and efficient dynamic programming-based approach. Moreover, we have devised a set of effective optimization strategies to further improve the computation efficiency. Our experimental results demonstrate that the proposed P-RFP mining effectively reduces the size of probabilistic frequent patterns. Our proposed approach not only discovers the set of P-RFPs efficiently, but also restores the frequency probability information of patterns with an error guarantee.
Liu, C, Chen, L & Zhang, C 1970, 'Summarizing probabilistic frequent patterns', Proceedings of the 19th ACM SIGKDD international conference on Knowledge discovery and data mining, KDD' 13: The 19th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, ACM, Chicago, Illinois USA, pp. 527-535.
View/Download from: Publisher's site
View description>>
Copyright © 2013 ACM. Mining probabilistic frequent patterns from uncertain data has received a great deal of attention in recent years due to its wide applications. However, probabilistic frequent pattern mining suffers from the problem that an exponential number of result patterns is generated, which seriously hinders further evaluation and analysis. In this paper, we focus on the problem of mining probabilistic representative frequent patterns (P-RFP), the minimal set of patterns with adequately high probability to represent all frequent patterns. Observing the bottleneck in checking whether a pattern can probabilistically represent another, which involves the computation of a joint probability of the supports of two patterns, we introduce a novel approximation of the joint probability with both theoretical and empirical proofs. Based on the approximation, we propose an Approximate P-RFP Mining (APM) algorithm, which effectively and efficiently compresses the set of probabilistic frequent patterns. To our knowledge, this is the first attempt to analyze the relationship between two probabilistic frequent patterns through an approximate approach. Our experiments on both synthetic and real-world datasets demonstrate that the APM algorithm accelerates P-RFP mining dramatically, running orders of magnitude faster than an exact solution. Moreover, the error rate of APM is guaranteed to be very small when the database contains hundreds of transactions, which further affirms that APM is a practical solution for summarizing probabilistic frequent patterns.
Liu, Q, Yin, J & Yu, S 1970, 'A Bio-inspired Jamming Detection and Restoration for WMNs: In View of Adaptive Immunology', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Springer International Publishing, pp. 243-257.
View/Download from: Publisher's site
View description>>
The wireless mesh network is vulnerable to jamming attacks due to the open sharing of the physical medium. Since such attacks induce severe interference resulting in denial of regular service, highly efficient detection and restoration methods are vital for a secure wireless mesh network. On the other hand, artificial immune mechanisms originating from immunology are promising methods for inspiring the design of a detection and restoration system. In this paper, we propose an immunological anti-jamming method, modelled on the adaptive immune system of human beings, to defeat reactive jamming. The proposed method consists of three function modules: the monitoring agent for monitoring packet reception, the decision agent for detecting attacks, and the recovery agent for restoring the network from ongoing attacks. Simulation results show that the proposed method is effective in defeating reactive jamming and maintaining considerable performance of the overall network. © Springer International Publishing Switzerland 2013.
Ma, J, Lin, H, Lu, J & Zhang, G 1970, 'A hybrid model for migrating customer segmentation with missing attributes', PROCEEDINGS OF THE 2013 JOINT IFSA WORLD CONGRESS AND NAFIPS ANNUAL MEETING (IFSA/NAFIPS), Joint IFSA World Congress NAFIPS Annual Meeting, IEEE, Edmonton, Canada, pp. 825-830.
View/Download from: Publisher's site
View description>>
Due to missing attributes in an enterprise's database, migrating customer segmentation results from an external dataset to the enterprise database is difficult. In this paper, a hybrid model, called the HMCS model, is presented. This model artificially generates values for missing attributes based on the external dataset and populates them into the enterprise database. Based on this model, an application in the telecom domain is reported. The application indicates that the presented model can produce acceptable segmentation results on an enterprise dataset with missing attributes.
Mairiza, D, Zowghi, D & Gervasi, V 1970, 'Conflict characterization and Analysis of Non Functional Requirements: An experimental approach', 2013 IEEE 12th International Conference on Intelligent Software Methodologies, Tools and Techniques (SoMeT), 2013 IEEE 12th International Conference on Intelligent Software Methodologies, Tools and Techniques (SoMeT), IEEE, Budapest, Hungary, pp. 83-91.
View/Download from: Publisher's site
View description>>
Prior studies reveal that conflicts among Non Functional Requirements (NFRs) are not always absolute. They can also be relative depending on the context of the system being developed.
Mans, B & Mathieson, L 1970, 'On the Treewidth of Dynamic Graphs', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Springer Berlin Heidelberg, pp. 349-360.
View/Download from: Publisher's site
View description>>
Dynamic graph theory is a novel, growing area that deals with graphs that change over time and is of great utility in modelling modern wireless, mobile and dynamic environments. As a graph evolves, possibly arbitrarily, it is challenging to identify the graph properties that can be preserved over time and understand their respective computability. In this paper we are concerned with the treewidth of dynamic graphs. We focus on metatheorems, which allow the generation of a series of results based on general properties of classes of structures. In graph theory two major metatheorems on treewidth provide complexity classifications by employing structural graph measures and finite model theory. Courcelle's Theorem gives a general tractability result for problems expressible in monadic second order logic on graphs of bounded treewidth, and Frick & Grohe demonstrate a similar result for first order logic and graphs of bounded local treewidth. We extend these theorems by showing that dynamic graphs of bounded (local) treewidth where the length of time over which the graph evolves and is observed is finite and bounded can be modelled in such a way that the (local) treewidth of the underlying graph is maintained. We show the application of these results to problems in dynamic graph theory and dynamic extensions to static problems. In addition we demonstrate that certain widely used dynamic graph classes naturally have bounded local treewidth. © 2013 Springer-Verlag Berlin Heidelberg.
McGregor, C 1970, 'A platform for real-time online health analytics during spaceflight', 2013 IEEE Aerospace Conference, 2013 IEEE Aerospace Conference, IEEE.
View/Download from: Publisher's site
View description>>
Monitoring the health and wellbeing of astronauts during spaceflight is an important aspect of any manned mission. To date the monitoring has been based on a sequential set of discontinuous samplings of physiological data to support initial studies on aspects such as weightlessness, and its impact on the cardiovascular system and to perform proactive monitoring for health status. The research performed and the real-time monitoring has been hampered by the lack of a platform to enable a more continuous approach to real-time monitoring. While any spaceflight is monitored heavily by Mission Control, an important requirement within the context of any spaceflight setting and in particular where there are extended periods with a lack of communication with Mission Control, is the ability for the mission to operate in an autonomous manner. This paper presents a platform to enable real-time astronaut monitoring for prognostics and health management within space medicine using online health analytics. The platform is based on extending previous online health analytics research known as the Artemis and Artemis Cloud platforms which have demonstrated their relevance for multi-patient, multi-diagnosis and multi-stream temporal analysis in real-time for clinical management and research within Neonatal Intensive Care. Artemis and Artemis Cloud source data from a range of medical devices capable of transmission of the signal via wired or wireless connectivity and hence are well suited to process real-time data acquired from astronauts. A key benefit of this platform is its ability to monitor their health and wellbeing onboard the mission as well as enabling the astronaut's physiological data, and other clinical data, to be sent to the platform components at Mission Control at each stage when that communication is available. As a result, researchers at Mission Control would be able to simulate, deploy and tailor predictive analytics and diagnostics during the same spaceflight for...
McGregor, C 1970, 'Wearable monitors on babies: Big data saving little people', 2013 IEEE International Symposium on Technology and Society (ISTAS): Social Implications of Wearable Computing and Augmediated Reality in Everyday Life, 2013 IEEE International Symposium on Technology and Society (ISTAS), IEEE.
View/Download from: Publisher's site
McGregor, C, James, A, Eklund, M, Sow, D, Ebling, M & Blount, M 1970, 'Real-time multidimensional temporal analysis of complex high volume physiological data streams in the neonatal intensive care unit.', Stud Health Technol Inform, 14th World Congress on Medical and Health Informatics (MEDINFO), IOS PRESS, Netherlands, pp. 362-366.
View/Download from: Publisher's site
View description>>
The intensive care of immature preterm infants is a challenging, dynamic clinical task that is complicated because these infants frequently develop a range of comorbidities as they grow and develop after their premature birth. Earliest reliable condition onset detection is a goal within this setting and high frequency physiological analysis is showing potential new pathophysiological indicators for earlier onset detection of several conditions. To realise this, a platform for multi-stream, multi-condition, multi-feature risk scoring is required. In this paper we demonstrate our multi-stream online analytics approach for condition onset detection and demonstrate a user interface approach for patient state that can be available in real-time to support condition risk scoring.
Mearns, H & Leaney, J 1970, 'The Use of Autonomic Management in Multi-provider Telecommunication Services', 2013 20th IEEE International Conference and Workshops on Engineering of Computer Based Systems (ECBS), 2013 20th IEEE International Conference and Workshops on Engineering of Computer Based Systems (ECBS), IEEE, Scottsdale, USA, pp. 129-138.
View/Download from: Publisher's site
View description>>
The continuing expansion of telecommunication service domains, from Quality of Service guaranteed connectivity to ubiquitous cloud environments, has introduced an ever increasing level of complexity in the field of service management. This complexity arises not only from the sheer variability in service requirements but also through the required but ill-defined interaction of multiple organisations and providers. As a result of this complexity and variability, the provisioning and performance of current services is adversely affected, often with little or no accountability to the users of the service. This exposes a need for total coverage in the management of such complex services: a system which provides for service responsibility. Service responsibility is defined as the provisioning of service resilience and the judgement of service risk across all the service components. To be effective in responsible management for current complex services, any framework must be able to interact with multiple providers and management systems. The CARMA framework upon which we are working aims to fulfil these requirements through a multi-agent system that is based in a global market and can negotiate and be responsible for multiple complex services. To this end, the research presents the architecture, agent functionality and interactions of the CARMA system, as well as the structure of the marketplace, contract specification and risk management. As the scope and concepts of the proposed system are relatively unexplored, a model and simulation were developed to verify the concepts, explore the issues, assess the assumptions and validate the system. The results of the simulation determined that the introduction of CARMA has the potential to reduce the risk in contracting new services, increase the reliability of contracted services, and increase the utility of providers participating in the market. © 2013 IEEE.
Meng, Q & Kennedy, PJ 1970, 'Discovering influential authors in heterogeneous academic networks by a co-ranking method', Proceedings of the 22nd ACM international conference on Conference on information & knowledge management - CIKM '13, the 22nd ACM international conference, ACM Press, San Francisco, California, USA, pp. 1029-1036.
View/Download from: Publisher's site
View description>>
Research in ranking networked entities is widely applicable to many problems, such as optimizing search engines, building recommendation systems and discovering influential nodes in social networks. However, many famous ranking approaches like PageRank are limited to solving this problem in homogeneous networks and are not applicable to heterogeneous networks. Faced with this problem, we propose a co-ranking method to evaluate scientific publications and authors. This novel approach is a flexible framework based on a set of customized rules taking into account both topological features of networks and the included citations. The approach ranks authors and publications iteratively and uses the results of each round to reinforce the ranks of authors and publications. Unlike traditional approaches to assessing publications, which require a great number of citations, our method lowers this requirement. This co-ranking approach has been validated using data collected from DBLP and CiteSeer, and the results suggest that it is effective and efficient in ranking authors and publications based on limited numbers of citations in heterogeneous networks and that it has fast convergence.
Merigó, JM, Yang, J-B & Xu, D-L 1970, 'Supply Analysis and Aggregation Systems', 2013 IEEE International Conference on Systems, Man, and Cybernetics, 2013 IEEE International Conference on Systems, Man and Cybernetics (SMC 2013), IEEE, Manchester, ENGLAND, pp. 97-102.
View/Download from: Publisher's site
Merigó, JM, Guillén, M & Sarabia, JM 1970, 'A Generalization of the Variance by Using the Ordered Weighted Average', MODELING AND SIMULATION IN ENGINEERING, ECONOMICS, AND MANAGEMENT, International Conference on Modeling and Simulation in Engineering, Economics, and Management, Springer Berlin Heidelberg, Castellon de la Plana, SPAIN, pp. 222-231.
View/Download from: Publisher's site
Merigó, JM, Yang, J-B & Xu, D-L 1970, 'Decision Making with Fuzzy Moving Averages and OWA Operators', MODELING AND SIMULATION IN ENGINEERING, ECONOMICS, AND MANAGEMENT, International Conference on Modeling and Simulation in Engineering, Economics, and Management, Springer Berlin Heidelberg, Castellon de la Plana, SPAIN, pp. 210-221.
View/Download from: Publisher's site
Musial, K, Gabrys, B & Buczko, M 1970, 'What kind of network are you?', Proceedings of the 2013 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining, ASONAM '13: Advances in Social Networks Analysis and Mining 2013, ACM, Niagara Falls, CANADA, pp. 1366-1373.
View/Download from: Publisher's site
Musial, K, Kazienko, P & Kajdanowicz, T 1970, 'Social Recommendations within the Multimedia Sharing Systems', Lecture Notes in Computer Science LNCS 5288, 1st World Summit on the Knowledge Society (WSKS 2008), SPRINGER-VERLAG BERLIN, Athens, GREECE, pp. 364-372.
View description>>
The social recommender system that supports the creation of new relations between users in the multimedia sharing system is presented in the paper. To generate suggestions the new concept of the multirelational social network was introduced. It covers both direct as well as object-based relationships that reflect social and semantic links between users. The main goal of the new method is to create personalized suggestions that are continuously adapted to users' needs depending on the personal weights assigned to each layer of the social network. The conducted experiments confirmed the usefulness of the proposed model.
Naderpour, M, Lu, J, Zhang, G & IEEE 1970, 'A Fuzzy Dynamic Bayesian Network-Based Situation Assessment Approach', 2013 IEEE INTERNATIONAL CONFERENCE ON FUZZY SYSTEMS (FUZZ - IEEE 2013), IEEE International Conference on Fuzzy Systems, IEEE, Hyderabad, India, pp. 1-8.
View/Download from: Publisher's site
View description>>
Situation awareness (SA), a state in the mind of a human, is essential for conducting decision-making activities. It is about the perception of the elements in the environment, the comprehension of their meaning, and the projection of their status into the near future. Two decades of investigation and analysis of accidents have shown that poor SA lay behind many serious accidents in large-scale technological systems. This emphasizes the importance of developing SA support systems for complex and dynamic environments. This paper presents a fuzzy dynamic Bayesian network-based situation assessment approach to support operators in the decision-making process in hazardous situations. The approach includes a dynamic Bayesian network-based situational network to model hazardous situations, where the existence of the situations can be inferred from sensor observations through the SCADA monitoring system using a fuzzy quantizer method. In addition to generating the assessment result, a fuzzy risk estimation method is proposed to show the risk level of situations. Ultimately, a hazardous environment from U.S. Chemical Safety Board investigation reports has been used to illustrate the application of the proposed approach. © 2013 IEEE.
Nemoto, K, Devitt, SJ, Trupke, M, Stephens, AM, Everitt, MS, Buczak, K, Noebauer, T, Schmiedmayer, J & Munro, WJ 1970, 'Quantum Device and Architecture based on NV Centers for Quantum Networks', CLEO: 2013, CLEO: QELS_Fundamental Science, OSA.
View/Download from: Publisher's site
View description>>
In recent years, NV centers in diamond have attracted significant attention as candidates for quantum information devices. The negatively charged NV center, in particular, has been intensely investigated [1-3]. NV- centers host both an electron spin qubit and a nitrogen nuclear spin; in our case a nuclear spin-1/2 of 15N is embedded. The ground state of the electron spin qubit has a long coherence time and an optical transition at 637 nm. The nuclear spins may be considered a long-lived quantum memory, which has been experimentally demonstrated [2]. Because of these preferable quantum properties, NV centers have been considered a good candidate for the implementation of quantum information processing, and there have been a number of theoretically proposed designs for quantum information processing [3]. © 2013 The Optical Society.
Nemoto, K, Stephens, A, Devitt, S, Everitt, M, Schmiedmayer, J, Trupke, M, Saito, S, Matsuzaki, Y, SaiToh, A, Harrison, K & Munro, WJ 1970, 'Quantum communication utilizing cavity-based quantum devices', 2013 Conference on Lasers and Electro-Optics Pacific Rim (CLEOPR), 2013 Conference on Lasers and Electro-Optics Pacific Rim (CLEO-PR), IEEE, Kyoto, JAPAN.
View/Download from: Publisher's site
Nemoto, K, Stephens, AM, Devitt, SJ, Harrison, KA & Munro, WJ 1970, 'The role of quantum memory in quantum information processing', SPIE Proceedings, SPIE Optical Engineering + Applications, SPIE, San Diego, CA.
View/Download from: Publisher's site
Niazi, M, Mahmood, S, Alshayeb, M, Baqais, AAB & Gill, AQ 1970, 'Motivators of Adopting Social Computing in Global Software Development: Initial Results', WORLD CONGRESS ON ENGINEERING - WCE 2013, VOL I, World Congress on Engineering, IAENG, London, U.K, pp. 409-413.
View/Download from: Publisher's site
View description>>
Context: Real-time collaboration is critical for developing high quality software systems at low cost in a geographically distributed Global Software Development (GSD) environment. It is anticipated that emerging Social Computing tools can play an important role in facilitating real-time, effective collaboration among teams working in GSD. Objective: The objective of this research paper is to identify motivators for adopting social computing in GSD organizations. Method: We adopted a Systematic Literature Review (SLR) approach by applying customized search strings derived from our research questions. Results: We have identified factors such as real-time communication and coordination, information sharing, knowledge acquisition and expert feedback as key motivators for the adoption of social computing in GSD. Conclusion: Based on the SLR results, we suggest that GSD organizations should embrace social computing as a tool for real-time collaboration between distributed GSD teams. The results of this initial study also suggest the need to develop social computing strategies and policies to guide effective social computing adoption by GSD teams.
Oberst, S & Lai, JCS 1970, 'The role of pad-mode instabilities in disc brake squeal', 20th International Congress on Sound and Vibration 2013, ICSV 2013, International Congress on Sound and Vibration, Bangkok, Thailand, pp. 2861-2868.
View description>>
Automotive disc brake squeal remains an economically significant and technically challenging problem to solve, owing to the warranty costs associated with customer complaints and the many interacting parameters. While industrial practice aims at identifying unstable vibration modes using complex eigenvalue analysis, in this paper we show how to identify pad-mode instabilities using forced vibration response analysis complemented by acoustic radiation calculations for simplified brake systems in the form of a pad-on-plate model. Our recent results indicate that pad-mode instabilities might trigger so-called instantaneous mode squeal without the necessity of mode coupling. Pad-mode instabilities, which complex eigenvalue analysis fails to detect, are revealed by the dissipated energy spectrum at frequencies where the dissipated energy is negative (i.e. providing energy instead of dissipating it). Pad modes seem to radiate locally higher sound pressure, depending on the phase shift between the structural vibration and the sound pressure, while exciting the underlying plate's or disc's modes. Pad-mode instabilities are shown to be one mechanism of brake squeal. In order to identify pad-mode instabilities, it is beneficial to perform a full range of vibration analyses, including complex eigenvalue analysis, forced response and dissipated energy spectra, as well as acoustic radiation calculations, for a range of different parameters such as friction coefficient, operating pressure, temperature and contact conditions.
Oberst, SM & Lai, J 1970, 'The role of pad-modes and nonlinearity in instantaneous mode squeal', Proceedings of Meetings on Acoustics, ICA 2013 Montreal, ASA.
View/Download from: Publisher's site
View description>>
Disc brake squeal is a major source of customer dissatisfaction and related warranty costs for automobile manufacturers. Although mode coupling is recognised as a mechanism often found in squealing brakes, recent research results show that friction-induced pad-mode instabilities could be the cause of the instantaneous mode squeal reported in the literature. In this paper, the nonlinear characteristics of instantaneous mode squeal initiated by pad-mode instabilities are studied by analysing phase space plots of vibrations and sound pressure for a numerical model of a pad-on-plate system as the friction coefficient increases. Results show that as the friction coefficient increases from 0.05 to 0.65, the attractor of vibration in the phase space transits from a limit cycle to quasi-periodic, showing signs of approaching chaotic behaviour. It is shown here that correlating the sound pressure behaviour in the phase space with the structural vibration is crucial to understanding the role of pad modes and nonlinearity in instantaneous mode squeal. © 2013 Acoustical Society of America.
Paler, A, Devitt, SJ, Nemoto, K & Polian, I 1970, 'Synthesis of Topological Quantum Circuits', Proceedings of the 2012 IEEE/ACM International Symposium on Nanoscale Architectures (NANOARCH), IEEE, Amsterdam, NETHERLANDS, pp. 181-187.
View description>>
Topological quantum computing has recently proven itself to be a very powerful model when considering large-scale, fully error corrected quantum architectures. In addition to its robust nature under hardware errors, it is a software driven method of error corrected computation, with the hardware responsible for only creating a generic quantum resource (the topological lattice). Computation in this scheme is achieved by the geometric manipulation of holes (defects) within the lattice. Interactions between logical qubits (quantum gate operations) are implemented by using particular arrangements of the defects, such as braids and junctions. We demonstrate that junction-based topological quantum gates allow highly regular and structured implementation of large CNOT (controlled-not) gate networks, which ultimately form the basis of the error corrected primitives that must be used for an error corrected algorithm. We present a number of heuristics to optimise the area of the resulting structures and therefore the number of the required hardware resources.
Palmas, G, Pietroni, N, Cignoni, P & Scopigno, R 2013, 'A computer-assisted constraint-based system for assembling fragmented objects', Digital Heritage (1), IEEE, pp. 529-536.
Pan, R, Dolog, P & Xu, G 2013, 'KNN-Based Clustering for Improving Social Recommender Systems', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), International Workshop on Agents and Data Mining Interaction, Springer Berlin Heidelberg, Valencia, Spain, pp. 115-125.
View/Download from: Publisher's site
View description>>
Clustering is useful in tag-based recommenders to reduce data sparsity and thereby improve recommendation accuracy. The strategy used to select tags for clusters has an impact on that accuracy. In this paper we propose a KNN-based approach for ranking tag neighbours for tag selection. We study the approach in comparison to several baselines using two datasets from different domains, and show that in both cases it outperforms the compared approaches. © 2013 Springer-Verlag.
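The tag-neighbour ranking idea above can be sketched as plain cosine-similarity KNN over tag co-occurrence vectors; the tag vectors and function names here are the editor's illustration, not from the paper.

```python
from math import sqrt

def cosine(u, v):
    # Cosine similarity between two tag co-occurrence vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def rank_tag_neighbours(target, candidates, k):
    """Return the k candidate tags most similar to `target`.

    `candidates` maps tag name -> co-occurrence vector over a fixed
    item vocabulary; higher cosine means a closer neighbour.
    """
    scored = sorted(candidates.items(),
                    key=lambda kv: cosine(target, kv[1]),
                    reverse=True)
    return [name for name, _ in scored[:k]]

# Toy co-occurrence vectors over four items (hypothetical data).
tags = {
    "rock":  [5, 0, 1, 0],
    "indie": [4, 1, 1, 0],
    "jazz":  [0, 5, 0, 2],
}
print(rank_tag_neighbours([5, 0, 2, 0], tags, 2))  # rock-like query
```

The selected neighbours could then feed whatever cluster-construction step a recommender uses.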
Peeters, M, Megens, C, van den Hoven, E, Hummels, C & Brombacher, A 2013, 'Social Stairs: Taking the Piano Staircase towards Long-Term Behavioral Change', Lecture Notes in Computer Science, International Conference on Persuasive Technology, Springer Berlin Heidelberg, Sydney, NSW, Australia, pp. 174-179.
View/Download from: Publisher's site
View description>>
This paper addresses the development of Social Stairs, an intelligent musical staircase to change people's behavior in the long term so that they take the stairs in favor of the elevator. Through designing with the Experiential Design Landscape (EDL) method, a design opportunity was found that social engagement encouraged people to take the stairs at work in favor of the elevator. To encourage this social behavior, people who involved each other and worked together whilst using the Social Stairs were treated with more diverse orchestral chimes that echoed up the stairwell. In this paper we reflect on the differences between the persuasive system of the well-known Piano Staircase and the Social Stairs. We report on the deployment of the Social Stairs for a period of three weeks in a public space within the university community, and identify opportunities for triggering intrinsic motivation and social engagement, and for keeping people involved in the long term.
Peng, S, Wang, G & Yu, S 2013, 'Mining Mechanism of Top-k Influential Nodes Based on Voting Algorithm in Mobile Social Networks', 2013 IEEE 10th International Conference on High Performance Computing and Communications & 2013 IEEE International Conference on Embedded and Ubiquitous Computing, 2013 IEEE International Conference on High Performance Computing and Communications (HPCC) & 2013 IEEE International Conference on Embedded and Ubiquitous Computing (EUC), IEEE, Zhangjiajie, China, pp. 2194-2199.
View/Download from: Publisher's site
Peng, S, Wang, G & Yu, S 2013, 'Modeling Malware Propagation in Smartphone Social Networks', 2013 12th IEEE International Conference on Trust, Security and Privacy in Computing and Communications, 2013 12th IEEE International Conference on Trust, Security and Privacy in Computing and Communications (TrustCom), IEEE, Melbourne, Australia, pp. 196-201.
View/Download from: Publisher's site
Pileggi, SF & Fernandez-Llatas, C 2013, 'Towards Semantic Resources in the Cloud', Communications in Computer and Information Science, Springer Berlin Heidelberg, pp. 407-416.
View/Download from: Publisher's site
View description>>
During the past years, the cloud vision of distributed systems has progressively become the new trend for next-generation platforms. Advances in technology, both the availability of broadband and the explosion of mobile computing, make massive migration to cloud solutions close to a fact. The cloud model assures a new technological and business environment for services and applications in which competitiveness, scalability and sustainability converge. On the other hand, next-generation applications have to be able to pervasively meet needs and requirements that differ deeply among them. Applications involving complex virtual organizations require a higher level of flexibility. An effective approach is based on the convergence between migration and virtualization. The resource-centric model assumes file systems, databases, services and any other class of resource available in the "cloud" to be Virtual Resources. These heterogeneous resources can be managed in a unique virtual context regardless of the infrastructures on which they are deployed. Semantics play a critical role in assuring advanced and open solutions in a technological context featured by a fundamental lack of standardization. © Springer-Verlag Berlin Heidelberg 2013.
Pileggi, SF, Calvo-Gallego, J & Amor, R 2013, 'Bringing Semantic Resources Together in the Cloud: From Theory to Application', 2013 Fifth International Conference on Computational Intelligence, Modelling and Simulation, 2013 Fifth International Conference on Computational Intelligence, Modelling and Simulation (CIMSim), IEEE, pp. 113-118.
View/Download from: Publisher's site
View description>>
This paper deals with the added value provided by Semantic Technologies in cloud environments. In these contexts, semantics is understood not as a massive technology but as a resource for improving cloud platforms' capabilities in terms of interoperability, knowledge building/representation and management. The proposed approach aims to extend the common middleware functional layer in complex architectures through semantics. This added capability should enable (active and passive) heterogeneous resources to work together as in a unique ecosystem, as well as supporting innovative interaction models involving these resources. The ideal application could be the Smart City. © 2013 IEEE.
Pileggi, SF, Fernandez-Llatas, C & Traver, V 2013, 'Metropolitan Ecosystems among Heterogeneous Cognitive Networks: Issues, Solutions and Challenges', Communications in Computer and Information Science, Springer Berlin Heidelberg, pp. 323-333.
View/Download from: Publisher's site
View description>>
Cognitive networks operating on a large scale are experiencing increasing popularity. The interest, from both a scientific and a commercial perspective, in the context of different environments, applications and domains is a fact. The natural convergence point for these heterogeneous disciplines is the need for strong, advanced technological support that enables the generation of distributed observations on a large scale as well as the intelligent processing of the obtained information. Focusing mostly on cognitive networks that generate information directly through sensor networks, existing solutions at the metropolitan-area level are mainly limited by the use of obsolete/static coverage models as well as by a fundamental lack of flexibility with respect to the dynamic features of virtual organizations. Furthermore, the centralized view of these systems is a strong limitation for dynamic data processing and knowledge building. © Springer-Verlag Berlin Heidelberg 2013.
Pisan, Y, Marin, JG & Navarro, KF 2013, 'Improving Lives: Using Microsoft Kinect to Predict the Loss of Balance for Elderly Users under Cognitive Load', PROCEEDINGS OF THE 9TH AUSTRALASIAN CONFERENCE ON INTERACTIVE ENTERTAINMENT (IE 2013), Interactive Entertainment, ACM Press, Melbourne, Australia, pp. 1-4.
View/Download from: Publisher's site
View description>>
Among older adults, falling while doing everyday tasks is the leading cause of injuries and disabilities, and can even result in death. Furthermore, even when no injury has occurred, the fear of falling can result in loss of confidence and independence. The two major factors in the loss of balance are weakening of the muscles and reduced cognitive skills. While exercise programmes can reduce the risk of falling by 40%, patient compliance with these programmes is low. We present the Microsoft Kinect-based step training program system that we have developed specifically for elderly patients. The program measures physical health and cognitive abilities and incorporates an individualized adaptive program for improvement. The real-time data obtained from the program is similar to clinical evaluations typically conducted by doctors, and the game-like exercises result in increased adherence to the exercise regimes. Copyright © 2013 ACM.
Prasad, M, Lin, CT, Yang, CT & Saxena, A 2013, 'Vertical collaborative fuzzy C-means for multiple EEG data sets', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), International Conference on Intelligent Robotics and Applications, Springer, Busan, South Korea, pp. 246-257.
View/Download from: Publisher's site
View description>>
Vertical Collaborative Fuzzy C-Means (VC-FCM) is a clustering method that clusters a data set containing one set of patterns with the collaboration of knowledge obtained from other data sets that have the same number of features but different sets of patterns. Uncertain relationships lie in the data both between data sets and within a data set. In practice, data about the same group of objects are usually stored in different data sets; in each data set the data dimensions are not necessarily the same, and unreal data may exist, so fuzzy clustering of a single data set would bring about less reliable results. Moreover, these data sets often cannot be integrated, for various reasons. An interesting application of vertical clustering occurs when dealing with huge data sets: instead of clustering them in a single pass, we split them into individual data sets, cluster each of them separately, and reconcile the results through the collaborative exchange of prototypes. Vertical collaborative fuzzy C-Means is thus a useful tool for collaborative clustering problems where one feature space is described in different pattern sets. In this paper we use collaborative fuzzy clustering: first we cluster each data set individually, and then optimize in accordance with the dependencies among the data sets, so as to improve the quality of the fuzzy clustering of a single data set with the help of the others, taking personal privacy and security of data into consideration. © 2013 Springer-Verlag Berlin Heidelberg.
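The local building block of the collaborative scheme, the membership-update step of standard fuzzy C-means, can be sketched as follows; the one-dimensional toy data are the editor's illustration, not EEG data from the paper.

```python
def fcm_memberships(data, centers, m=2.0):
    """One membership-update step of fuzzy C-means: u[i][j] is the
    degree to which point j belongs to cluster i, computed from the
    distances to the cluster centers with fuzzifier m."""
    u = []
    for ci in centers:
        row = []
        for x in data:
            di = abs(x - ci) or 1e-12  # guard against exact center hits
            # Standard FCM update: inverse distance ratios raised to 2/(m-1).
            s = sum((di / (abs(x - ck) or 1e-12)) ** (2.0 / (m - 1.0))
                    for ck in centers)
            row.append(1.0 / s)
        u.append(row)
    return u

u = fcm_memberships([0.0, 1.0, 9.0, 10.0], centers=[0.5, 9.5])
# Points near 0.5 get high membership in cluster 0; points near 9.5 in cluster 1.
```

In the vertical collaborative setting, each data set would run this update locally and then adjust its prototypes using those exchanged from the other data sets.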
Prior, J 2013, 'A sense of working there', Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration, OzCHI '13: Augmentation, Application, Innovation, Collaboration, ACM, Flinders University, Adelaide, Australia, pp. 147-150.
View/Download from: Publisher's site
View description>>
This paper emphasises the importance to the Human-Computer Interaction community of understanding the landscape in which Agile software developers practice. A longitudinal ethnographic study of professional Agile software developers in Australia is drawn on to present an account of their everyday work.
Qiao, Y, Sun, X & Yu, N 2013, 'Determinantal Complexities and Field Extensions', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), International Symposium on Algorithms and Computation, Springer Berlin Heidelberg, Hong Kong, pp. 119-129.
View/Download from: Publisher's site
View description>>
Let 𝔽 be a field of characteristic ≠ 2. The determinantal complexity of a polynomial P ∈ 𝔽[x1, ..., xn] is defined as the smallest size of a matrix M whose entries are linear polynomials of the xi over 𝔽, such that P = det(M) as polynomials in 𝔽[x1, ..., xn]. Determining the determinantal complexity of the permanent polynomial is a long-standing open problem. Let 𝕂 be an extension field of 𝔽; then P can be viewed as a polynomial over 𝕂. We are interested in the comparison between the determinantal complexity of P over 𝕂 (denoted dc𝕂(P)) and that of P over 𝔽 (denoted dc𝔽(P)). It is clear that dc𝕂(P) ≤ dc𝔽(P), and the question is whether strict inequality can happen. In this note we consider polynomials defined over ℚ. For P = x1² + ⋯ + xn², there exists a constant multiplicative gap between dcℝ(P) and dcℂ(P): we prove dcℝ(P) ≥ n while ⌈n/2⌉ + 1 ≥ dcℂ(P). We also consider additive constant gaps: (1) there exists a quadratic polynomial Q ∈ ℚ[x, y] such that dcℚ(Q) = 3 while its determinantal complexity over ℚ̄ is strictly smaller; (2) there exists a cubic polynomial C ∈ ℚ[x, y] with a rational zero, such that dcℚ(C) = 4 and dcℚ̄(C) = 3. For additive constant gaps, geometric criteria are presented to decide when dcℚ = dcℚ̄. © 2013 Springer-Verlag.
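For n = 2 the bounds above can be checked by hand; the following worked example is the editor's illustration, not taken from the paper.

```latex
% Over \mathbb{R}: a size-2 determinantal representation of x_1^2 + x_2^2,
\det \begin{pmatrix} x_1 & x_2 \\ -x_2 & x_1 \end{pmatrix} = x_1^2 + x_2^2,
% so dc_{\mathbb{R}}(P) \le 2, matching the lower bound dc_{\mathbb{R}}(P) \ge n at n = 2.
% Over \mathbb{C} the polynomial splits into linear factors, giving
\det \begin{pmatrix} x_1 + i x_2 & 0 \\ 0 & x_1 - i x_2 \end{pmatrix} = x_1^2 + x_2^2,
% consistent with the upper bound dc_{\mathbb{C}}(P) \le \lceil n/2 \rceil + 1 = 2.
```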
Raduescu, C & Gill, AQ 2013, 'Towards the Development of a Complex Adaptive Project Environment Assessment Tool', ISD, International Conference on Information Systems Development, Springer, Sevilla, Spain, pp. 487-498.
View/Download from: Publisher's site
Ramezani, F, Hussain, FK & Lu, J 2013, 'A Fuzzy Predictable Load Balancing Approach in Cloud Computing', Proceedings of the International Conference on Grid & Cloud Computing and Applications GCA'13, International Conference on Grid & Cloud Computing and Applications, WorldComp, Las Vegas, Nevada, USA.
View description>>
Cloud computing is a new paradigm for hosting and delivering services on demand over the internet. It is an example of an ultimately virtualized system, and a natural evolution for data centers that employ automated systems management, workload balancing, and virtualization technologies. Live virtual machine (VM) migration is a technique to achieve load balancing in a cloud environment by transferring an active overloaded VM from one physical host to another without disrupting the VM. In this study, to eliminate whole-VM migration from the load balancing process, we propose a Fuzzy Predictable Load Balancing (FPLB) approach, which addresses the problem of an overloaded VM by assigning the extra tasks from the overloaded VM to another similar VM instead of migrating the whole VM. In addition, we propose a Fuzzy Prediction Method (FPM) to predict VM migration time. The approach also contains a multi-objective optimization model to migrate these tasks to a new VM host. In the proposed FPLB approach there is no need to pause a VM during migration. Furthermore, given that live VM migration, in contrast to task migration, takes longer to complete and needs more idle capacity on the host physical machine (PM), the proposed approach significantly reduces time, idle memory and cost consumption.
Ramezani, F, Lu, J & Hussain, F 2013, 'An Online Fuzzy Decision Support System for Resource Management in Cloud Environments', PROCEEDINGS OF THE 2013 JOINT IFSA WORLD CONGRESS AND NAFIPS ANNUAL MEETING (IFSA/NAFIPS), Joint IFSA World Congress and NAFIPS Annual Meeting, IEEE, Edmonton, Canada, pp. 754-759.
View/Download from: Publisher's site
View description>>
Cloud computing is a large-scale distributed computing paradigm driven by economies of scale, in which a pool of abstracted, virtualized, dynamically-scalable, managed computing power, storage, platforms, and services are delivered on demand to external customers over the Internet. Although a significant number of studies have been developed to optimize resource management and task scheduling in cloud computing, none of them considered the impact of task scheduling patterns on resource management and vice versa. To overcome this drawback, and considering the lack of resources in cloud environments and growing customer demands for cloud services, this paper proposes an Online Resource Management Decision Support System (ORMDSS) that addresses both task scheduling and resource management optimization in a single system. In addition, ORMDSS contains a fuzzy prediction method for predicting VM workload patterns and VM migration time by applying neural networks and fuzzy expert systems. The ORMDSS helps cloud providers to automatically allocate scarce resources to applications and services in an optimal way. It is expected that the ORMDSS not only increases cloud utilization and QoS, but also decreases cost and response time.
Ramezani, F, Lu, J & Hussain, F 2013, 'Task Scheduling Optimization in Cloud Computing Applying Multi-Objective Particle Swarm Optimization', SERVICE-ORIENTED COMPUTING, ICSOC 2013, IEEE International Conference on Service-Oriented Computing and Applications, Springer, Berlin, Germany, pp. 237-251.
View/Download from: Publisher's site
View description>>
Optimizing the scheduling of tasks in a distributed heterogeneous computing environment is a nonlinear multi-objective NP-hard problem which plays an important role in optimizing cloud utilization and Quality of Service (QoS). In this paper, we develop a comprehensive multi-objective model for optimizing task scheduling to minimize task execution time, task transferring time, and task execution cost. However, the objective functions in this model are in conflict with one another. Considering this fact and the strengths of the Particle Swarm Optimization (PSO) algorithm in speed and accuracy, we design a multi-objective algorithm based on the multi-objective PSO (MOPSO) method to provide an optimal solution for the proposed model. To implement and evaluate the proposed model, we extend the Jswarm package to a multi-objective Jswarm (MO-Jswarm) package. We also extend the Cloudsim toolkit by applying MO-Jswarm as its task scheduling algorithm. MO-Jswarm in Cloudsim determines the optimal task arrangement among VMs according to the MOPSO algorithm. The simulation results show that the proposed method has the ability to find optimal trade-off solutions for multi-objective task scheduling problems that represent the best possible compromises among the conflicting objectives, and significantly increases the QoS. © 2013 Springer-Verlag.
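The trade-off set among the three conflicting objectives that a multi-objective optimizer searches for can be illustrated with a plain Pareto-dominance filter; the schedule vectors below are hypothetical, not from the paper's experiments.

```python
def dominates(a, b):
    # a dominates b if a is no worse on every objective and strictly
    # better on at least one (all objectives are minimized here).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep the non-dominated (execution time, transfer time, cost)
    vectors - the trade-off set a multi-objective PSO converges toward."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# Hypothetical objective vectors for four candidate task arrangements.
schedules = [(10, 4, 7), (12, 3, 6), (11, 5, 9), (10, 4, 8)]
print(pareto_front(schedules))
```

A MOPSO maintains an archive of exactly such non-dominated solutions while the swarm explores the search space.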
Rehman, ZU, Hussain, OK & Hussain, FK 2013, 'Multi-Criteria IaaS Service Selection based on QoS History', 2013 IEEE 27TH INTERNATIONAL CONFERENCE ON ADVANCED INFORMATION NETWORKING AND APPLICATIONS (AINA), International Conference on Advanced Information Networking and Applications, IEEE, Barcelona, Spain, pp. 1129-1135.
View/Download from: Publisher's site
View description>>
The growing number of cloud services has made service selection a challenging decision-making problem by providing wide-ranging choices for cloud service consumers. This necessitates the use of formal decision-making methodologies to assist a decision maker in selecting the service that best fulfils the user's requirements. In this paper, we present a cloud service selection methodology that utilizes QoS history over different time periods, performs Multi-Criteria Decision Analysis to rank all cloud services in each time period in accordance with user preferences, and then aggregates the results to determine the overall rank of all available services for cloud service selection. © 2013 IEEE.
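The per-period ranking plus aggregation pipeline described above can be sketched as a weighted-sum MCDA followed by Borda-style rank aggregation; the service names, criteria and weights are hypothetical, and the paper's actual MCDA method may differ.

```python
def rank_period(scores, weights):
    """Rank services in one time period by a weighted sum of normalized
    QoS criteria (higher is better); returns service names, best first."""
    def total(name):
        return sum(w * s for w, s in zip(weights, scores[name]))
    return sorted(scores, key=total, reverse=True)

def aggregate_ranks(period_rankings):
    # Borda-style aggregation: average each service's rank position
    # across periods; lower mean position wins overall.
    services = period_rankings[0]
    mean_pos = {s: sum(r.index(s) for r in period_rankings) / len(period_rankings)
                for s in services}
    return sorted(services, key=mean_pos.get)

# Hypothetical normalized (availability, throughput, price) scores.
p1 = {"A": (0.9, 0.7, 0.5), "B": (0.6, 0.9, 0.9)}
p2 = {"A": (0.8, 0.8, 0.6), "B": (0.7, 0.6, 0.9)}
w = (0.5, 0.3, 0.2)
overall = aggregate_ranks([rank_period(p1, w), rank_period(p2, w)])
print(overall)
```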
Reitsma, L, Smith, A & van den Hoven, E 2013, 'StoryBeads: Preserving Indigenous Knowledge through Tangible Interaction Design', 2013 INTERNATIONAL CONFERENCE ON CULTURE AND COMPUTING (CULTURE AND COMPUTING 2013), International Conference on Culture and Computing, IEEE Computer Society, Kyoto, Japan, pp. 79-85.
View/Download from: Publisher's site
View description>>
This paper addresses the need to preserve culturally unique knowledge for future generations. This user-centered design-research case study focused on preserving Indigenous Knowledge (IK) of the South-African BaNtwane culture, specifically focusing on their rich beadwork and oral traditions. Our approach allows for design research in a scenario where the community is represented by a few prominent members, simultaneously making provision for the incorporation of modern technology in a society trailing in technology adoption. The study resulted in a recording device that fits the target group's oral tradition and is based on a concept in which oral stories are recorded and associated with tangible beads that can be incorporated into traditional beadwork. The device and interaction design embraces the culture's aesthetics and existing IK mechanisms.
Ren, Y, Zhu, T, Li, G & Zhou, W 2013, 'Top-N Recommendations by Learning User Preference Dynamics', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Springer Berlin Heidelberg, pp. 390-401.
View/Download from: Publisher's site
View description>>
In a recommendation system, user preference patterns and the preference dynamic effect are observed in the user × item rating matrix. However, their value has barely been exploited in previous research. In this paper, we formalize the preference pattern as a sparse matrix and propose a Preference Pattern Subspace to iteratively model the personal and the global preference patterns with an EM-like algorithm. Furthermore, we propose a PrepSVD-I algorithm by transforming Top-N recommendation into a pairwise preference learning process. Experiment results show that the proposed PrepSVD-I algorithm significantly outperforms the state-of-the-art Top-N recommendation algorithms. © Springer-Verlag 2013.
Sanders, K, Ahmadzadeh, M, Clear, T, Edwards, SH, Goldweber, M, Johnson, C, Lister, R, McCartney, R, Patitsas, E & Spacco, J 2013, 'The Canterbury QuestionBank', Proceedings of the ITiCSE working group reports conference on Innovation and technology in computer science education - working group reports, ITiCSE '13: Innovation and Technology in Computer Science Education conference 2013, ACM, Canterbury, Kent, United Kingdom, pp. 33-51.
View/Download from: Publisher's site
View description>>
In this paper, we report on an ITiCSE-13 Working Group that developed the Canterbury QuestionBank, a set of 654 multiple-choice questions on CS1 and CS2 topics. We describe the questions, the metadata we investigated, and some preliminary investigations of possible research uses of the QuestionBank. The QuestionBank is publicly available as a repository for computing education instructors and researchers.
Wu, S-L, Wu, C-W, Pal, NR, Chen, C-Y, Chen, S-A & Lin, C-T 2013, 'Common spatial pattern and linear discriminant analysis for motor imagery classification', 2013 IEEE Symposium on Computational Intelligence, Cognitive Algorithms, Mind, and Brain (CCMB), IEEE, Singapore, pp. 146-151.
View/Download from: Publisher's site
View description>>
A Brain-Computer Interface (BCI) system provides a convenient way of communication for healthy subjects and for subjects who suffer from severe diseases such as amyotrophic lateral sclerosis (ALS). Motor imagery (MI) is one of the popular ways of designing BCI systems. The architectures of many BCI systems are quite complex and involve time-consuming processing. The electroencephalography (EEG) signal is the most commonly used input for BCI applications, but EEG is often contaminated with noise. To overcome such drawbacks, in this paper we use the common spatial pattern (CSP) for feature extraction from EEG and linear discriminant analysis (LDA) for motor imagery classification. In this study, CSP and LDA have been used to reduce artifacts and classify MI-based EEG signals. We have used a two-level cross-validation scheme to determine the subject-specific best time window and number of CSP features. We have compared the performance of our system with BCI competition results, and have also experimented with MI data generated in our lab. The proposed system is found to produce good results. In particular, using our EEG data for MI movements, we obtained an average classification accuracy of 80% for two subjects using only 9 channels, without any feature selection. This proposed MI-based BCI system may be used in real-life applications. © 2013 IEEE.
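The LDA classification step can be sketched in miniature as a two-class Fisher discriminant on 2-D features (standing in for two CSP components); the toy data are the editor's illustration, not EEG, and this is not the authors' implementation.

```python
def fisher_lda(class0, class1):
    """Two-class Fisher LDA on 2-D features: w = Sw^{-1} (m1 - m0),
    computed with an explicit 2x2 matrix inverse (no external libraries)."""
    def mean(xs):
        n = len(xs)
        return [sum(x[0] for x in xs) / n, sum(x[1] for x in xs) / n]
    def scatter(xs, m):
        # Within-class scatter: sum of outer products of centered points.
        s = [[0.0, 0.0], [0.0, 0.0]]
        for x in xs:
            d = [x[0] - m[0], x[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    s[i][j] += d[i] * d[j]
        return s
    m0, m1 = mean(class0), mean(class1)
    s0, s1 = scatter(class0, m0), scatter(class1, m1)
    sw = [[s0[i][j] + s1[i][j] for j in range(2)] for i in range(2)]
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    inv = [[sw[1][1] / det, -sw[0][1] / det],
           [-sw[1][0] / det, sw[0][0] / det]]
    diff = [m1[0] - m0[0], m1[1] - m0[1]]
    return [inv[0][0] * diff[0] + inv[0][1] * diff[1],
            inv[1][0] * diff[0] + inv[1][1] * diff[1]]

# Toy feature vectors for two imagery classes (hypothetical data).
w = fisher_lda([(1.0, 2.0), (1.2, 1.8), (0.8, 2.2)],
               [(3.0, 0.5), (3.2, 0.7), (2.8, 0.3)])
# Projecting a trial onto w separates the two classes by a threshold.
```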
Sharma, N, Chanda, S, Pal, U & Blumenstein, M 2013, 'Word-Wise Script Identification from Video Frames', 2013 12th International Conference on Document Analysis and Recognition, 2013 12th International Conference on Document Analysis and Recognition (ICDAR), IEEE, Washington, DC, USA, pp. 867-871.
View/Download from: Publisher's site
View description>>
Script identification is an essential step for the efficient use of the appropriate OCR in multilingual document images. There are various techniques available for script identification from printed and handwritten document images, but script identification from video frames has not been explored much. This paper presents a study of some pre-processing techniques and features for word-wise script identification from video frames. Traditional features, namely Zernike moments, Gabor and gradient, have performed well for handwritten and printed documents having simple backgrounds and adequate resolution for OCR. Video frames are mostly coloured and suffer from low resolution, blur and background noise, to mention a few issues. In this paper, an attempt has been made to explore whether the traditional script identification techniques can be useful for video frames. Three feature extraction techniques, namely Zernike moments, Gabor and gradient features, and SVM classifiers were considered for analyzing three popular scripts, namely English, Bengali and Hindi. Some pre-processing techniques such as super resolution and skeletonization of the original word images were used in order to overcome the inherent problems with video. Experiments show that the super resolution technique with gradient features performed well, and an accuracy of 87.5% was achieved when testing on 896 words from three different scripts. The study also reveals that the use of proper pre-processing approaches can be helpful in applying traditional script identification techniques to video frames. © 2013 IEEE.
Sharma, N, Shivakumara, P, Pal, U, Blumenstein, M & Tan, CL 2013, 'A New Method for Character Segmentation from Multi-oriented Video Words', 2013 12th International Conference on Document Analysis and Recognition, 2013 12th International Conference on Document Analysis and Recognition (ICDAR), IEEE, USA, pp. 413-417.
View/Download from: Publisher's site
View description>>
This paper presents a two-stage method for multi-oriented video character segmentation. Words segmented from video text lines are considered for character segmentation in the present work. Words can contain isolated or non-touching characters, as well as touching characters. Therefore, the character segmentation problem can be viewed as a two stage problem. In the first stage, text cluster is identified and isolated (non-touching) characters are segmented. The orientation of each word is computed and the segmentation paths are found in the direction perpendicular to the orientation. Candidate segmentation points computed using the top distance profile are used to find the segmentation path between the characters considering the background cluster. In the second stage, the segmentation results are verified and a check is performed to ascertain whether the word component contains touching characters or not. The average width of the components is used to find the touching character components. For segmentation of the touching characters, segmentation points are then found using average stroke width information, along with the top and bottom distance profiles. The proposed method was tested on a large dataset and was evaluated in terms of precision, recall and f-measure. A comparative study with existing methods reveals the superiority of the proposed method. © 2013 IEEE.
Yu, S, Zhou, W, Guo, S & Guo, M 2013, 'A dynamical Deterministic Packet Marking scheme for DDoS traceback', 2013 IEEE Global Communications Conference (GLOBECOM), 2013 IEEE Global Communications Conference (GLOBECOM 2013), IEEE, Atlanta, GA, pp. 729-734.
View/Download from: Publisher's site
Smith, S, Gill, AQ, Hasan, H & Ghobadi, S 2013, 'An Enterprise Architecture Driven Approach to Virtualisation', PACIS, Pacific Asia Conference on Information Systems, AIS, Seogwipo City, Jeju-do, Korea, p. 50.
View description>>
Organisations have shown a significant interest in the adoption of virtualisation technology for improving the efficiency of their Data Centres (DCs) from both the resource performance and cost efficiency viewpoints. By improving the efficiency of data centres we can sustainably manage their impact on the environment by controlling their energy consumption. The intentions are clear, but how best to approach Data Centre virtualisation is not. This paper proposes an integrated Enterprise Architecture and Information Infrastructure (EAII) driven approach to guide Data Centre virtualisation. The EAII approach has been developed based on a review and analysis of the well-known The Open Group Architecture Framework (TOGAF) and the Information Infrastructure (II) model. The proposed integrated EAII approach seems appropriate to guide and align business strategy and virtualisation implementation for data centres of any size in any industry vertical.
Sriyanyong, P & Lu, H 2013, 'Implementation and comparison of PSO-based algorithms for multi-modal optimization problems', AIP Conference Proceedings, 2013 INTERNATIONAL SYMPOSIUM ON COMPUTATIONAL MODELS FOR LIFE SCIENCES, AIP, Sydney, Australia, pp. 165-174.
View/Download from: Publisher's site
View description>>
This paper aims to compare the global search capability and overall performance of a number of Particle Swarm Optimization (PSO) based algorithms in the context of solving the Dynamic Economic Dispatch (DED) problem, which takes into account the operational limitations of generation units such as the valve-point loading effect as well as ramp rate limits. The comparative study uses six PSO-based algorithms, including the basic PSO and hybrid PSO algorithms, on a popular IEEE benchmark power system: a 10-unit, 24-hour system with non-smooth cost functions. The experimental results show that one of the hybrid algorithms, which combines the PSO with both inertia weight and constriction factor and the Gaussian mutation operator (CBPSO-GM), is promising in achieving the near global optimum of a non-linear multi-modal optimization problem such as the DED problem under consideration.
Tafavogh, S, Navarro, KF, Catchpoole, DR & Kennedy, PJ 2013, 'Segmenting Neuroblastoma Tumor Images and Splitting Overlapping Cells Using Shortest Paths between Cell Contour Convex Regions', AIME, Artificial Intelligence in Medicine in Europe, Springer, Murcia, Spain, pp. 171-175.
View/Download from: Publisher's site
View description>>
Neuroblastoma is one of the most fatal paediatric cancers. One of the major prognostic factors for neuroblastoma tumours is the total number of neuroblastic cells. In this paper, we develop a fully automated system for counting the total number of neuroblastic cells within images derived from Hematoxylin and Eosin stained histological slides, taking overlapping cells into account. We propose a novel multi-stage cell counting algorithm, in which cellular regions are extracted using an adaptive thresholding technique and overlapping and single cells are discriminated using morphological differences. We then propose a novel cell splitting algorithm to split overlapping cells into single cells using the shortest path between the contours of convex regions. © 2013 Springer-Verlag.
Tao, Y & Yu, S 2013, 'DDoS Attack Detection at Local Area Networks Using Information Theoretical Metrics', 2013 12th IEEE International Conference on Trust, Security and Privacy in Computing and Communications, 2013 12th IEEE International Conference on Trust, Security and Privacy in Computing and Communications (TrustCom), IEEE, Melbourne, AUSTRALIA, pp. 233-240.
View/Download from: Publisher's site
Teague, D, Comey, M, Ahadi, A & Lister, R 2013, 'A qualitative think aloud study of the early Neo-Piagetian stages of reasoning in novice programmers', Conferences in Research and Practice in Information Technology Series, Australasian Computing Education Conference, Australian Computer Society Inc, Adelaide, Australia, pp. 87-95.
View description>>
Recent research indicates that some of the difficulties faced by novice programmers are manifested very early in their learning. In this paper, we present data from think aloud studies that demonstrate the nature of those difficulties. In the think alouds, novices were required to complete short programming tasks which involved either hand executing ('tracing') a short piece of code, or writing a single sentence describing the purpose of the code. We interpret our think aloud data within a neo-Piagetian framework, demonstrating that some novices reason at the sensorimotor and preoperational stages, not at the higher concrete operational stage at which most instruction is implicitly targeted.
Thommandram, A, Eklund, JM & McGregor, C 2013, 'Detection of apnoea from respiratory time series data using clinically recognizable features and kNN classification', 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), IEEE, Osaka, JAPAN, pp. 5013-5016.
View/Download from: Publisher's site
Thommandram, A, Pugh, JE, Eklund, JM, McGregor, C & James, AG 2013, 'Classifying neonatal spells using real-time temporal analysis of physiological data streams: Algorithm development', 2013 IEEE Point-of-Care Healthcare Technologies (PHT), 2013 IEEE Point-of-Care Healthcare Technologies (PHT), IEEE, Bangalore, INDIA, pp. 240-243.
View/Download from: Publisher's site
Vermeulen, J, Luyten, K, van den Hoven, E & Coninx, K 2013, 'Crossing the bridge over Norman's Gulf of Execution', Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '13: CHI Conference on Human Factors in Computing Systems, ACM, Paris, France, pp. 1931-1940.
View/Download from: Publisher's site
View description>>
Feedback and affordances are two of the most well-known principles in interaction design. Unfortunately, the related and equally important notion of feedforward has not been given as much consideration. Nevertheless, feedforward is a powerful design principle for bridging Norman's Gulf of Execution. We reframe feedforward by disambiguating it from related design principles such as feedback and perceived affordances, and identify new classes of feedforward. In addition, we present a reference framework that provides a means for designers to explore and recognize different opportunities for feedforward. Copyright © 2013 ACM.
Wan, L, Chen, L & Zhang, C 2013, 'Mining Dependent Frequent Serial Episodes from Uncertain Sequence Data', 2013 IEEE 13th International Conference on Data Mining, 2013 IEEE International Conference on Data Mining (ICDM), IEEE, Dallas, TX, USA, pp. 1211-1216.
View/Download from: Publisher's site
View description>>
In this paper, we focus on the problem of mining Probabilistic Dependent Frequent Serial Episodes (P-DFSEs) from uncertain sequence data. By observing that the frequentness probability of an episode in an uncertain sequence is a Markov Chain imbeddable variable, we first propose an Embedded Markov Chain-based algorithm that efficiently computes the frequentness probability of an episode by projecting the probability space into a set of limited partitions. To further improve the computation efficiency, we devise an optimized approach that prunes candidate episodes early by estimating the upper bound of their frequentness probabilities. © 2013 IEEE.
Wan, L, Chen, L & Zhang, C 2013, 'Mining frequent serial episodes over uncertain sequence data', Proceedings of the 16th International Conference on Extending Database Technology, EDBT/ICDT '13: Joint 2013 EDBT/ICDT Conferences, ACM, Genoa, Italy, pp. 215-226.
View/Download from: Publisher's site
View description>>
Data uncertainty has posed many unique challenges to nearly all types of data mining tasks, creating a need for uncertain data mining. In this paper, we focus on the particular task of mining probabilistic frequent serial episodes (P-FSEs) from uncertain sequence data, which applies to many real applications including sensor readings as well as customer purchase sequences. We first define the notion of P-FSEs, based on the frequentness probabilities of serial episodes under possible world semantics. To discover P-FSEs over an uncertain sequence, we propose: 1) an exact approach that computes the accurate frequentness probabilities of episodes; 2) an approximate approach that approximates the frequency of episodes using probability models; 3) an optimized approach that efficiently prunes a candidate episode by estimating an upper bound of its frequentness probability using approximation techniques. We conduct extensive experiments to evaluate the performance of the developed data mining algorithms. Our experimental results show that: 1) while existing research demonstrates that approximate approaches are orders of magnitude faster than exact approaches, for P-FSE mining, the efficiency improvement of the approximate approach over the exact approach is marginal; 2) although it has been recognized that the normal distribution based approximation approach is fairly accurate when the data set is large enough, for P-FSE mining, the binomial distribution based approximation achieves higher accuracy when the number of episode occurrences is limited; 3) the optimized approach clearly outperforms the other two approaches in terms of the runtime, and achieves very high accuracy. © 2013 ACM.
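The possible-world semantics underlying both of these episode-mining papers can be made concrete with a brute-force sketch: the frequentness probability of an episode is the total probability of the worlds in which its support reaches the minimum threshold. The existential-uncertainty model and the greedy non-overlapping occurrence count below are illustrative simplifications; the papers' algorithms avoid this exponential enumeration.

```python
from itertools import product

def greedy_support(seq, episode):
    """Maximum number of non-overlapping left-to-right occurrences of a
    serial episode (a symbol subsequence) in a deterministic sequence."""
    count, j = 0, 0
    for s in seq:
        if s == episode[j]:
            j += 1
            if j == len(episode):
                count, j = count + 1, 0
    return count

def frequentness_probability(useq, episode, minsup):
    """Exact P(support >= minsup) by possible-world enumeration.
    `useq` is a list of (symbol, existence_probability) pairs with
    independent existential uncertainty -- a simple uncertain-sequence model."""
    total = 0.0
    for world in product([0, 1], repeat=len(useq)):
        p, seq = 1.0, []
        for (sym, pr), keep in zip(useq, world):
            p *= pr if keep else (1.0 - pr)
            if keep:
                seq.append(sym)
        if greedy_support(seq, episode) >= minsup:
            total += p
    return total

# Four uncertain events; the episode ('a', 'b') must occur at least once.
useq = [('a', 0.9), ('b', 0.8), ('a', 0.5), ('b', 0.4)]
p = frequentness_probability(useq, ('a', 'b'), minsup=1)  # → 0.812
```

Enumeration costs 2^n worlds, which is exactly why the Embedded-Markov-Chain and upper-bound-pruning approaches in the papers matter.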
Wang, X, Wang, Z & Xu, X 2013, 'An Improved Artificial Bee Colony Approach to QoS-Aware Service Selection', 2013 IEEE 20th International Conference on Web Services, 2013 IEEE International Conference on Web Services (ICWS), IEEE, pp. 395-402.
View/Download from: Publisher's site
View description>>
As available services accumulate on the Internet, the QoS-aware service selection problem (SSP) becomes increasingly difficult. Since the Artificial Bee Colony algorithm (ABC) has been successful in solving many problems as a simpler implementation of swarm intelligence, its application to SSP is promising. However, ABC was initially designed for numerical optimization, and its effectiveness highly depends on what we call the optimality continuity property of the solution space, i.e., similar variable values (or neighboring solutions) result in similar objective values (or evaluation results). We show that SSP does not possess such a property. We further propose an approximation approach based on greedy search strategies for ABC to overcome this problem. In this approach, neighboring solutions are generated for a composition greedily based on the neighboring services of its component services. Two algorithms with different neighborhood measures are presented based on this approach. The resulting neighborhood structure of the proposed algorithms is analogous to that of continuous functions, so that the advantages of ABC can be fully leveraged in solving SSP. Also, they are pure online algorithms which are as simple as canonical ABC. The rationale of the proposed approach is discussed and the complexity of the algorithms is analyzed. Experiments conducted against canonical ABC indicate that the proposed algorithms can achieve better optimality within limited time. © 2013 IEEE.
Wang, Y-K, Jung, T-P, Chen, S-A, Huang, C-S & Lin, C-T 2013, 'Tracking Attention Based on EEG Spectrum', Communications in Computer and Information Science, International Conference on Human-Computer Interaction, Springer Berlin Heidelberg, Las Vegas, NV, USA, pp. 450-454.
View/Download from: Publisher's site
View description>>
Distraction while driving is a serious problem that can have many catastrophic consequences. Developing a countermeasure to detect the drivers' distraction is imperative. This study measured Electroencephalography (EEG) signals from six healthy participants while they were asked to pay their full attention to a lane-keeping driving task or a math problem-solving task. The time courses of six distinct brain networks (Frontal, Central, Parietal, Occipital, Left Motor, and Right Motor) separated by Independent Component Analysis were used to build the distraction-detection model. EEG data were segmented into 400-ms epochs. Across subjects, 80% of the EEG epochs were used to train various classifiers that were tested against the remaining 20% of the data. The classification performance based on support vector machines (SVM) with a radial basis function (RBF) kernel achieved accuracy of 84.7±2.7% or 85.8±1.3% for detecting subjects' focuses of attention to the math-solving or lane-deviation task, respectively. The high attention-detection accuracy demonstrated the feasibility of accurately detecting drivers' attention based on the brain activities. This demonstration may lead to a practical real-time distraction-detection system for improving road safety. © Springer-Verlag Berlin Heidelberg 2013.
Wang, Z, Luo, T, Xu, G & Wang, X 2013, 'A New Indexing Technique for Supporting By-attribute Membership Query of Multidimensional Data', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), International Conference on Web-Age Information Management, Springer Berlin Heidelberg, China, pp. 266-277.
View/Download from: Publisher's site
View description>>
Multidimensional data indexing and lookup have been widely used in online data-intensive applications involving data with multiple attributes. However, high-performance multi-attribute data representation and lookup remain a challenge: the performance of an index drops as the number of dimensions increases. In this paper, we present a novel data structure called the Bloom Filter Matrix (BFM) to support multidimensional data indexing and by-attribute search. The proposed matrix is based on the Cartesian product of different Bloom filters, each representing one attribute of the original data. The structure and parameters of each Bloom filter are designed to fit the actual data characteristics and system demands, enabling fast object indexing and lookup, especially by-attribute search of multidimensional data. Experiments show that the Bloom Filter Matrix is a fast and accurate data structure for multi-attribute data indexing and by-attribute search with highly correlated queries. © 2013 Springer-Verlag.
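The notion of by-attribute membership testing can be illustrated with one plain Bloom filter per attribute. This sketch is not the paper's BFM (it omits the Cartesian-product structure entirely); the class names, sizes, and the two-attribute record are invented for illustration.

```python
import hashlib

class BloomFilter:
    """Plain Bloom filter: k hash positions over an m-bit array.
    Membership tests may yield false positives, never false negatives."""
    def __init__(self, m=1024, k=4):
        self.m, self.k, self.bits = m, k, bytearray(m)

    def _positions(self, item):
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos] = 1

    def __contains__(self, item):
        return all(self.bits[pos] for pos in self._positions(item))

class ByAttributeIndex:
    """Illustrative index (not the paper's BFM): one Bloom filter per
    attribute, supporting approximate by-attribute membership queries."""
    def __init__(self, attributes, m=1024, k=4):
        self.filters = {a: BloomFilter(m, k) for a in attributes}

    def insert(self, record):
        for attr, value in record.items():
            self.filters[attr].add(value)

    def maybe_contains(self, attr, value):
        return value in self.filters[attr]

idx = ByAttributeIndex(["city", "device"])
idx.insert({"city": "Beijing", "device": "mobile"})
```

A query such as `idx.maybe_contains("city", "Beijing")` touches only the filter for that attribute, which is the per-attribute access pattern the BFM is built around.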
Wang, Z, Xu, X & Wang, X 2013, 'Mass Customization Oriented and Cost-Effective Service Network', ENTERPRISE INTEROPERABILITY, IWEI 2013, 5th International IFIP Working Conference on Enterprise Interoperability (IWEI), Springer Berlin Heidelberg, Univ Twente, Enschede, NETHERLANDS, pp. 172-185.
View/Download from: Publisher's site
Wu, D, Zhang, G & Lu, J 2013, 'A Fuzzy Tree Similarity Based Recommendation Approach for Telecom Products', PROCEEDINGS OF THE 2013 JOINT IFSA WORLD CONGRESS AND NAFIPS ANNUAL MEETING (IFSA/NAFIPS), Joint IFSA World Congress NAFIPS Annual Meeting, IEEE, Edmonton, Canada, pp. 813-818.
View/Download from: Publisher's site
View description>>
Due to the huge product assortments and complex descriptions of telecom products, it is a great challenge for customers to select appropriate products. A fuzzy tree similarity based hybrid recommendation approach is proposed to solve this issue. In this study, fuzzy techniques are used to deal with the various uncertainties existing within the product and customer data. A fuzzy tree similarity measure is developed to evaluate the semantic similarity between tree structured products or user profiles. The similarity measures for items and users both integrate the collaborative filtering (CF) and semantic similarities. The final recommendation hybridizes item-based and user-based CF recommendation techniques. A telecom product recommendation case study is given to show the effectiveness of the proposed approach. © 2013 IEEE.
Wu, D, Zhang, G & Lu, J 2013, 'A Fuzzy Tree Similarity Measure and Its Application in Telecom Product Recommendation', 2013 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC 2013), IEEE International Conference on Systems, Man and Cybernetics, IEEE, Manchester, UK, pp. 3483-3488.
View/Download from: Publisher's site
View description>>
The recommender systems field has been well developed in the last few years to provide item recommendations to related users. Existing recommendation approaches, however, assume that an item is described by a single value or a vector. Unfortunately, some items in real world applications, such as telecom products, could have a tree structure. This paper aims to handle this issue by developing a comprehensive fuzzy tree similarity measure. The fuzzy tree similarity measure compares both the concepts and values in two trees of items. The focus of this study is primarily on the fuzzy value similarity between two trees. In the similarity measure, each attribute is associated with a set of linguistic terms to express the value granularly. The node values are first transformed to membership vectors related to the linguistic terms, and the values of the conceptual corresponding nodes are then compared. These local similarities are aggregated into the final fuzzy value similarity between the two trees. A telecom product recommendation case study shows the effectiveness of the proposed fuzzy tree similarity measure and its applicability for telecom product recommendations. © 2013 IEEE.
Wu, L, Chin, A, Xu, G, Du, L, Wang, X, Meng, K, Guo, Y & Zhou, Y 2013, 'Who Will Follow Your Shop? Exploiting Multiple Information Sources in Finding Followers', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Database Systems for Advanced Applications, Springer Berlin Heidelberg, Wuhan, pp. 401-415.
View/Download from: Publisher's site
View description>>
WuXianGouXiang is an O2O (offline to online and vice versa) based mobile application that recommends nearby coupons and deals to users, through which users can also follow the shops they are interested in. If the potential followers of a shop can be discovered, the merchant's targeted advertising can be more effective and the recommendations for users will also be improved. In this paper, we propose to predict the link relations between users and shops based on the following behavior. In order to better model the characteristics of the shops, we first adopt Topic Modeling to analyze the semantics of their descriptions and then propose a novel approach, named INtent Induced Topic Search (INITS), to update the hidden topics of the shops with and without a description. In addition, we leverage the user logs and search engine results to get the similarity between users and shops. Then we adopt the latent factor model to calculate the similarity between users and shops, in which we use the multiple information sources to regularize the factorization. The experimental results demonstrate that the proposed approach is effective for detecting followers of the shops and the INITS model is useful for shop topic inference. © Springer-Verlag 2013.
Wu, Z, Yin, W, Cao, J, Xu, G & Cuzzocrea, A 2013, 'Community Detection in Multi-relational Social Networks', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), International Conference on Web Information Systems Engineering, Springer Berlin Heidelberg, Nanjing, pp. 43-56.
View/Download from: Publisher's site
View description>>
Multi-relational networks are ubiquitous in many fields such as bibliography, Twitter, and healthcare. There have been many studies in the literature aimed at discovering communities from social networks. However, most of them have focused on single-relational networks. A handful of methods detect communities from multi-relational networks by first converting them to single-relational networks. Nevertheless, they commonly assume that different relations are independent from each other, which is unrealistic in real-life cases. In this paper, we address this challenge by introducing a novel co-ranking framework, named MutuRank. It makes full use of the mutual influence between relations and actors to transform the multi-relational network into a single-relational network. We then present GMM-NK (Gaussian Mixture Model with Neighbor Knowledge), based on the local consistency principle, to enhance the performance of the spectral clustering process in discovering overlapping communities. Experimental results on both synthetic and real-world data demonstrate the effectiveness of the proposed method. © 2013 Springer-Verlag.
Wang, Y, Liu, B & Gui, L 2013, 'Adaptive Scan-based Asynchronous Neighbor Discovery in wireless networks using directional antennas', 2013 International Conference on Wireless Communications and Signal Processing, 2013 International Conference on Wireless Communications and Signal Processing (WCSP), IEEE, Hangzhou, PEOPLES R CHINA.
View/Download from: Publisher's site
Ye, D & Zhang, M 2013, 'A Study on the Evolution of Cooperation in Networks', WEB INFORMATION SYSTEMS ENGINEERING - WISE 2013, PT II, 14th International Conference on Web Information Systems Engineering (WISE), Springer Berlin Heidelberg, Nanjing, PEOPLES R CHINA, pp. 285-298.
View/Download from: Publisher's site
Yi, X, Paulet, R, Bertino, E & Xu, G 2013, 'Private data warehouse queries', Proceedings of the 18th ACM symposium on Access control models and technologies, SACMAT '13: 18th ACM Symposium on Access Control Models and Technologies, ACM, Amsterdam, pp. 25-35.
View/Download from: Publisher's site
View description>>
Publicly accessible data warehouses are an indispensable resource for data analysis. But they also pose a significant risk to the privacy of the clients, since a data warehouse operator may follow the client's queries and infer what the client is interested in. Private Information Retrieval (PIR) techniques allow the client to retrieve a cell from a data warehouse without revealing to the operator which cell is retrieved. However, PIR cannot be used to hide OLAP operations performed by the client, which may disclose the client's interest. This paper presents a solution for private data warehouse queries on the basis of the Boneh-Goh-Nissim cryptosystem, which allows one to evaluate any multi-variate polynomial of total degree 2 on ciphertexts. With our solution, the client can perform OLAP operations on the data warehouse and retrieve one (or more) cell without revealing any information about which cell is selected. Furthermore, our solution supports some types of statistical analysis on the data warehouse, such as regression and variance analysis, without revealing the client's interest. Our solution ensures both the server's security and the client's security. Copyright 2013 ACM.
Yin, H, Sun, Y, Cui, B, Hu, Z & Chen, L 2013, 'LCARS: A Location-Content-Aware Recommender System', Proceedings of the 19th ACM SIGKDD international conference on Knowledge discovery and data mining, KDD' 13: The 19th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, ACM, Chicago, Illinois USA, pp. 221-229.
View/Download from: Publisher's site
View description>>
Newly emerging location-based and event-based social network services provide us with a new platform to understand users' preferences based on their activity history. A user can only visit a limited number of venues/events and most of them are within a limited distance range, so the user-item matrix is very sparse, which creates a big challenge for traditional collaborative filtering-based recommender systems. The problem becomes more challenging when people travel to a new city where they have no activity history. In this paper, we propose LCARS, a location-content-aware recommender system that offers a particular user a set of venues (e.g., restaurants) or events (e.g., concerts and exhibitions) by giving consideration to both personal interest and local preference. This recommender system can facilitate people's travel not only near the area in which they live, but also in a city that is new to them. Specifically, LCARS consists of two components: offline modeling and online recommendation. The offline modeling part, called LCA-LDA, is designed to learn the interest of each individual user and the local preference of each individual city by capturing item co-occurrence patterns and exploiting item contents. The online recommendation part automatically combines the learnt interest of the querying user and the local preference of the querying city to produce the top-k recommendations. To speed up this online process, a scalable query processing technique is developed by extending the classic Threshold Algorithm (TA). We evaluate the performance of our recommender system on two large-scale real data sets, Douban-Event and Foursquare. The results show the superiority of LCARS in recommending spatial items for users, especially when traveling to new cities, in terms of both effectiveness and efficiency.
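The classic Threshold Algorithm that LCARS extends can be sketched for two score sources, say user interest and local preference: scan the sorted lists in parallel, fetch each newly seen item's full score by random access, and stop once the running threshold can no longer beat the current k-th best. The score values below are hypothetical, and sum is assumed as the aggregation function.

```python
import heapq

def threshold_algorithm(score_lists, k):
    """Fagin's Threshold Algorithm for top-k aggregation (sum), assuming
    every item is scored in every list (dicts with identical keys)."""
    sorted_lists = [sorted(d.items(), key=lambda kv: -kv[1]) for d in score_lists]
    seen, topk = set(), []                      # topk: min-heap of (score, item)
    for depth in range(len(sorted_lists[0])):
        # Threshold = sum of the scores at the current scan depth.
        threshold = sum(slist[depth][1] for slist in sorted_lists)
        for slist in sorted_lists:
            item = slist[depth][0]
            if item not in seen:
                seen.add(item)
                total = sum(d[item] for d in score_lists)   # random access
                if len(topk) < k:
                    heapq.heappush(topk, (total, item))
                elif total > topk[0][0]:
                    heapq.heapreplace(topk, (total, item))
        if len(topk) == k and topk[0][0] >= threshold:
            break               # no unseen item can beat the current top-k
    return sorted(topk, reverse=True)

interest = {"a": 0.9, "b": 0.5, "c": 0.1}   # hypothetical user-interest scores
local = {"a": 0.2, "b": 0.6, "c": 0.8}      # hypothetical local-preference scores
top2 = threshold_algorithm([interest, local], k=2)
```

In this toy run the scan stops after depth 1: items a and b both aggregate to 1.1, which already matches the threshold, so c (0.9) is never a contender.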
You, Y, Xu, G, Cao, J, Zhang, Y & Huang, G 2013, 'Leveraging Visual Features and Hierarchical Dependencies for Conference Information Extraction', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Joint International Conference on Asia-Pacific Web Conference (APWeb)/Web-Age Information Management (WAIM), Springer Berlin Heidelberg, Sydney, pp. 404-416.
View/Download from: Publisher's site
View description>>
Traditional information extraction methods mainly rely on visual feature-assisted techniques, but without considering the hierarchical dependencies within the paragraph structure, some important information is missed. This paper proposes an integrated approach for extracting academic information from conference Web pages. Firstly, Web pages are segmented into text blocks by applying a new hybrid page segmentation algorithm which combines visual features and DOM structure together. Then, these text blocks are labeled by a Tree-structured Random Fields model, and the block functions are differentiated using various features such as visual features, semantic features and hierarchical dependencies. Finally, an additional post-processing step is introduced to tune the initial annotation results. Our experimental results on real-world data sets demonstrate that the proposed method is able to effectively and accurately extract the needed academic information from conference Web pages. © 2013 Springer-Verlag.
Yu, S, Doss, R, Zhou, W & Guo, S 2013, 'A general cloud firewall framework with dynamic resource allocation', 2013 IEEE International Conference on Communications (ICC), ICC 2013 - 2013 IEEE International Conference on Communications, IEEE, pp. 1941-1945.
View/Download from: Publisher's site
View description>>
Cloud is becoming a dominant computing platform. However, there has been little work on how to protect cloud data centers. As a cloud usually hosts many different types of applications, the traditional packet-level firewall mechanism is not suitable for cloud platforms in the case of complex attacks. It is necessary to perform anomaly detection at the event level. Moreover, the objects to protect are more diverse than with a traditional firewall. Motivated by this, we propose a general framework for a cloud firewall, which features an event-level detection chain with dynamic resource allocation. We establish a mathematical model for the proposed framework. Moreover, a linear resource investment function is proposed for economical dynamic resource allocation in cloud firewalls. Several conclusions are drawn for the reference of cloud service providers and designers. © 2013 IEEE.
Yu, S, Wang, H, Lin, X & Ruj, S 2013, 'NFSP 2013: Message from the Chairs', 2013 IEEE 33rd International Conference on Distributed Computing Systems Workshops, 2013 IEEE 33rd International Conference on Distributed Computing Systems Workshops (ICDCSW), IEEE.
View/Download from: Publisher's site
Hu, Z, Zhang, S, Zhang, F & Lu, H 2013, 'SCUC with battery energy storage system for peak-load shaving and reserve support', 2013 IEEE Power & Energy Society General Meeting, 2013 IEEE Power & Energy Society General Meeting, IEEE, Vancouver, BC, Canada.
View/Download from: Publisher's site
View description>>
This paper aims to investigate the benefit of deploying battery energy storage system (BESS) in a power system for reducing production cost, shaving peak-load and providing reserve support. A BESS model is built which takes into account the charging and discharging efficiencies, charging/discharging power limits, and reserve capacity limits. This model is incorporated into the security-constrained unit commitment (SCUC) problem. The new SCUC problem is solved to optimally allocate the charging and discharging power and reserve capacity of each BESS. Tests are carried out on the IEEE 24-bus system and simulation results show that lower operational cost can be achieved by using BESS for both peak-load shaving/shifting and reserve support. © 2013 IEEE.
Zhang, Z, Oberst, S & Lai, JCS 2013, 'Application of polynomial chaos expansions to analytical models of friction oscillators', Annual Conference of the Australian Acoustical Society 2013, Acoustics 2013: Science, Technology and Amenity, Annual Conference of the Australian Acoustical Society, Victor Harbor, Australia, pp. 408-414.
View description>>
Despite past substantial research efforts, the prediction of brake squeal propensity remains a largely unresolved problem. The standard practice to predict the brake squeal propensity is to analyse dynamic instabilities using the complex eigenvalue analysis. However, it is well known that not every predicted unstable vibration mode will lead to squeal and vice-versa. Owing to nonlinearity and problem complexity (e.g. operating conditions), treating brake squeal with uncertainty seems appealing. Another indicator of brake squeal propensity, not often used, is based on negative dissipated energy. In this study, uncertainty analysis based on polynomial chaos expansions is applied to 1-dof and 4-dof friction models. Results are compared with dissipated energy calculations and standard complex eigenvalue analysis. The potential of this approach for the prediction of brake squeal propensity is discussed. © (2013) by the Australian Acoustical Society.
Zhang, Z, Zhang, G, Lu, J & Guo, C 2013, 'A fuzzy tri-level decision making algorithm and its application in supply chain', PROCEEDINGS OF THE 8TH CONFERENCE OF THE EUROPEAN SOCIETY FOR FUZZY LOGIC AND TECHNOLOGY (EUSFLAT-13), World Congress of the International-Fuzzy-Systems-Association (IFSA) / Conference of the European-Society-for-Fuzzy-Logic-and-Technology (EUSFLAT), Atlantis Press, Milano, Italy, pp. 154-160.
View/Download from: Publisher's site
View description>>
In this paper, we develop a fuzzy tri-level decision making (FTLDM) model to deal with decentralized decision making problems with three levels of decision makers. Based on the α-cut of a fuzzy set, we transform an FTLDM problem into a multi-objective tri-level decision making problem. Based on the linear tri-level Kth-best algorithm, the global optimal solution can be obtained. A case study of third-party logistics decision making in a supply chain illustrates the effectiveness of the proposed algorithm. © 2013. The authors. Published by Atlantis Press.
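The α-cut transformation at the heart of the algorithm is easy to state for triangular fuzzy numbers: at level α, a fuzzy coefficient collapses to a crisp interval, which is what turns the fuzzy problem into a family of crisp multi-objective problems. The triangular membership function below is an illustrative assumption; the paper's model may use other fuzzy numbers.

```python
def alpha_cut_triangular(tfn, alpha):
    """alpha-cut of a triangular fuzzy number (a, b, c): the crisp interval
    [a + alpha*(b - a), c - alpha*(c - b)], 0 <= alpha <= 1.
    At alpha = 1 the interval collapses to the peak b."""
    a, b, c = tfn
    return (a + alpha * (b - a), c - alpha * (c - b))

# Hypothetical fuzzy cost coefficient "about 5", support [2, 9].
lo, hi = alpha_cut_triangular((2.0, 5.0, 9.0), alpha=0.5)  # → (3.5, 7.0)
```

Sweeping α from 0 to 1 yields nested intervals, and solving the crisp tri-level problem at each level is what the Kth-best-based procedure operates on.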
Zhu, T, Li, G, Ren, Y, Zhou, W & Xiong, P 2013, 'Differential privacy for neighborhood-based collaborative filtering', Proceedings of the 2013 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining, ASONAM '13: Advances in Social Networks Analysis and Mining 2013, ACM, pp. 752-759.
View/Download from: Publisher's site
View description>>
As a popular technique in recommender systems, Collaborative Filtering (CF) has received extensive attention in recent years. However, its privacy-related issues, especially for neighborhood-based CF methods, cannot be overlooked. The aim of this study is to address the privacy issues in the context of neighborhood-based CF methods by proposing a Private Neighbor Collaborative Filtering (PNCF) algorithm. The algorithm includes two privacy-preserving operations: Private Neighbor Selection and Recommendation-Aware Sensitivity. Private Neighbor Selection is constructed on the basis of the notion of differential privacy to privately choose neighbors. Recommendation-Aware Sensitivity is introduced to enhance the performance of recommendations. Theoretical and experimental analyses are provided to show that the proposed algorithm can preserve differential privacy while retaining the accuracy of recommendations. Copyright 2013 ACM.
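One standard way to "privately choose neighbors", as Private Neighbor Selection does, is the exponential mechanism: candidate neighbors are sampled with probability exponential in their similarity score, with the privacy budget split across the draws. The sketch below is a generic illustration under assumed parameters (sensitivity of 1, equal budget per draw, invented similarity scores), not the PNCF algorithm itself.

```python
import math
import random

def private_neighbor_selection(similarities, k, epsilon, sensitivity=1.0, seed=7):
    """Exponential-mechanism sketch: each of k neighbors is drawn without
    replacement with probability proportional to
    exp(eps_i * sim / (2 * sensitivity)), spending epsilon/k per draw."""
    rng = random.Random(seed)
    eps_per_pick = epsilon / k
    candidates = dict(similarities)
    chosen = []
    for _ in range(k):
        items = list(candidates)
        weights = [math.exp(eps_per_pick * candidates[u] / (2.0 * sensitivity))
                   for u in items]
        pick = rng.choices(items, weights=weights, k=1)[0]
        chosen.append(pick)
        del candidates[pick]        # sample without replacement
    return chosen

sims = {"u1": 0.9, "u2": 0.7, "u3": 0.2, "u4": 0.1}   # hypothetical similarities
neighbors = private_neighbor_selection(sims, k=2, epsilon=1.0)
```

High-similarity users remain the most likely neighbors, but every candidate has non-zero probability, which is what prevents an observer from inferring exact similarity rankings.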
Zhu, T, Li, G, Ren, Y, Zhou, W & Xiong, P 2013, 'Privacy Preserving for Tagging Recommender Systems', 2013 IEEE/WIC/ACM International Joint Conferences on Web Intelligence (WI) and Intelligent Agent Technologies (IAT), 2013 IEEE/WIC/ACM International Joint Conferences on Web Intelligence (WI) and Intelligent Agent Technologies (IAT), IEEE, pp. 81-88.
View/Download from: Publisher's site
View description>>
Tagging recommender systems allow Internet users to annotate resources with personalized tags. The connection among users, resources and these annotations, often called a folksonomy, permits users the freedom to explore tags, and to obtain recommendations. Releasing these tagging datasets accelerates both commercial and research work on recommender systems. However, adversaries may re-identify a user and her/his sensitive information from the tagging dataset using a little background information. Recently, several private techniques have been proposed to address the problem, but most of them lack a strict privacy notion, and can hardly resist the range of possible attacks. This paper proposes a private releasing algorithm that perturbs users' profiles under a strict privacy notion, differential privacy, with the goal of preserving a user's identity in a tagging dataset. The algorithm includes three privacy-preserving operations: Private Tag Clustering is used to shrink the randomized domain, and Private Tag Selection is then applied to find the most suitable replacement tags for the original tags. To hide the numbers of tags, the third operation, Weight Perturbation, finally adds Laplace noise to the weights of tags. We present extensive experimental results on two real-world datasets, Delicious and Bibsonomy, and the personalization algorithm is successful in both cases. © 2013 IEEE.
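The Weight Perturbation step can be illustrated with the standard Laplace mechanism: noise with scale sensitivity/epsilon is added to each tag weight before release. The tag weights, sensitivity of 1, and the clipping of negative noisy weights to zero are all assumptions made for this sketch, not necessarily the paper's exact choices.

```python
import random

def laplace_noise(scale, rng):
    # The difference of two i.i.d. exponentials with mean `scale`
    # is distributed as Laplace(0, scale).
    return rng.expovariate(1.0 / scale) - rng.expovariate(1.0 / scale)

def perturb_tag_weights(tag_weights, epsilon, sensitivity=1.0, seed=11):
    """Laplace-mechanism sketch of Weight Perturbation: add noise of scale
    sensitivity/epsilon to each tag weight, clipping negatives to zero."""
    rng = random.Random(seed)
    scale = sensitivity / epsilon
    return {tag: max(0.0, w + laplace_noise(scale, rng))
            for tag, w in tag_weights.items()}

profile = {"jazz": 5.0, "python": 3.0, "travel": 1.0}  # hypothetical tag weights
released = perturb_tag_weights(profile, epsilon=1.0)
```

Smaller epsilon means larger noise scale and stronger privacy; the released weights then reveal correspondingly less about the true tag counts.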
Zhu, T, Xiong, P, Xiang, Y & Zhou, W 2013, 'An Effective Deferentially Private Data Releasing Algorithm for Decision Tree', 2013 12th IEEE International Conference on Trust, Security and Privacy in Computing and Communications, 2013 12th IEEE International Conference on Trust, Security and Privacy in Computing and Communications (TrustCom), IEEE, pp. 388-395.
View/Download from: Publisher's site
View description>>
Differential privacy is a strong definition for protecting individual privacy in data releasing and mining. However, it is a rigid definition that introduces a large amount of noise into the original dataset, which significantly decreases the quality of data mining results. Recently, how to design a suitable data releasing algorithm for data mining purposes has become a hot research area. In this paper, we propose a differentially private data releasing algorithm for decision tree construction. The proposed algorithm provides a non-interactive data releasing method through which the miner can obtain the complete dataset for data mining purposes. With a given privacy budget, the proposed algorithm generalizes the original dataset, and then specializes it under a differential privacy constraint to construct decision trees. As the designed novel scheme selection operation can fully utilize the allocated privacy budget, the dataset released by the proposed algorithm can yield better decision tree models than other methods. Experimental results demonstrate that the proposed algorithm outperforms existing methods for private decision tree construction. © 2013 IEEE.