Ahad, MT, Dyson, LE & Gay, VC 2012, 'An Empirical Study of Factors Influencing the SME's Intention to Adopt m-Banking in Rural Bangladesh', Journal of Mobile Technologies, Knowledge and Society, vol. 2012, pp. 1-16.
This research empirically studies the factors that influence the intention of SME owners and managers to adopt m-banking in rural Bangladesh. The study specifically focuses on business-oriented m-banking, such as paying suppliers or receiving payments from customers, and on person-to-person use of m-banking. Although over the last ten years a wide spectrum of m-banking frameworks has emerged in various countries, very little research has focused on SMEs' adoption and acceptance of the service. Another rationale for undertaking such a study is that m-banking has not yet been extended to rural Bangladesh. To fill this gap, the research surveyed 550 SME owners/managers in four rural villages. The survey indicates that poor banking facilities, cost, credibility, gender, education and SME business type are the main factors that significantly influence the intention to adopt m-banking. The analysis focuses on three factors that have been largely overlooked in prior literature: banking satisfaction, m-banking advantages for SMEs, and SME business type. The study broadens our understanding of m-banking and provides insights into developing m-banking strategies in Bangladesh. This research will be of potential value in accelerating the development of m-banking in Bangladesh.
Ahmed, A, Hussain, W & Kamran, A 2012, 'Logic Formulation and Evaluation of Academic Constraints', Australian Journal of Basic and Applied Sciences, vol. 1, no. 1, pp. 26-39.
Arefin, AS, Mathieson, L, Johnstone, D, Berretta, R & Moscato, P 2012, 'Unveiling Clusters of RNA Transcript Pairs Associated with Markers of Alzheimer’s Disease Progression', PLoS ONE, vol. 7, no. 9, pp. e45535-e45535.
Background: One primary goal of transcriptomic studies is identifying gene expression patterns correlating with disease progression. This is usually achieved by considering transcripts that independently pass an arbitrary threshold (e.g. p<0.05). In diseases involving severe perturbations of multiple molecular systems, such as Alzheimer's disease (AD), this univariate approach often results in a large list of seemingly unrelated transcripts. We utilised a powerful multivariate clustering approach to identify clusters of RNA biomarkers strongly associated with markers of AD progression. We discuss the value of considering pairs of transcripts which, in contrast to individual transcripts, helps avoid natural human transcriptome variation that can overshadow disease-related changes. Methodology/Principal Findings: We re-analysed a dataset of hippocampal transcript levels in nine controls and 22 patients with varying degrees of AD. A large-scale clustering approach determined groups of transcript probe sets that correlate strongly with measures of AD progression, including both clinical and neuropathological measures and quantifiers of the characteristic transcriptome shift from control to severe AD. This enabled identification of restricted groups of highly correlated probe sets from an initial list of 1,372 previously published by our group. We repeated this analysis on an expanded dataset that included all pair-wise combinations of the 1,372 probe sets. As clustering of this massive dataset is infeasible using standard computational tools, we adapted and re-implemented a clustering algorithm that uses an external-memory algorithmic approach. This identified various pairs that strongly correlated with markers of AD progression and highlighted important biological pathways potentially involved in AD pathogenesis. Conclusions/Significance: Our analyses demonstrate that, although there exists a relatively large molecular signature of AD progression, only a smal...
Arif, M, Saqib, M, Basalamah, S & Naeem, A 2012, 'Counting of moving people in the video using neural network system', Life Science Journal, vol. 9, no. 3, pp. 1384-1392.
Automatic counting of people in a crowd using a surveillance camera is very useful in effective crowd management, security surveillance, and many other applications. In this paper, we propose an intelligent framework to automate the process of people counting in surveillance video. Foreground (moving people) segmentation from the video is done by a combination of different foreground estimation techniques. Texture analysis and foreground pixel area for different segmentation techniques are used to extract useful features. A neural network is trained on these features, and a people-counting accuracy of more than 96% is achieved on a benchmark video.
Ariffin, S, Dyson, L & Hoskins-McKenzie, D 2012, 'Content is King: Malaysian Industry Experts' Point of View on Local Content for Mobile Phones', Journal of Mobile Technologies, Knowledge and Society, vol. 2012, no. 2012, pp. 1-9.
Content is the most prominent aspect of the medium for communication. Trends in content, especially in Malaysia, show a lack of local content in, for example, television, radio, films, the Internet and, in more recent years, mobile phones. In order to understand the situation in Malaysia with regard to local content for mobile phones, meetings were carried out with Malaysian mobile technology specialists. Given their extensive experience in this industry, they are knowledgeable regarding mobile users' needs. The findings show that there is limited local content. On the other hand, the Malaysian government has adopted a pro-active attitude, launching activities to motivate university students to produce more local mobile content. Problems that need to be addressed include: low numbers of mobile content developers; lack of standardization of mobile phones; a limited market for Malaysian mobile content; and limited bandwidth coverage in rural areas. Content on the mobile phone is also crucially important for the positioning of Malaysian local culture on the world map. One area identified as important for understanding the use of mobile content is the local institutions of higher learning. Thus, it is suggested that mobile content for students and educators, and how it could benefit them, be investigated further.
Assis‐Dorr, H, Palacios‐Marques, D & Merigó, JM 2012, 'Social networking as an enabler of change in entrepreneurial Brazilian firms', Journal of Organizational Change Management, vol. 25, no. 5, pp. 699-708.
Purpose: This paper aims to research the effects of market orientation on the use of social networking and its relationship with organisational learning. Design/methodology/approach: The empirical study was carried out in 132 recently created Brazilian biotechnology companies. Structural equation models were used to test the hypotheses. Findings: The findings suggest that market orientation is positively related to social networking and organisational learning. The study also examines businesses that employ social networks and generate learning procedures within the organisation. Practical implications: Statistically speaking, the use of social networking platforms such as Facebook and Twitter has significant effects on the internal variables of the organisation, which is why businesses should develop new profiles that better reflect the company's corporate strategy. Originality/value: Studies carried out on technologically based social networks are currently a new feature of the field of management. This article brings together classic management constructs, such as organisational learning and market orientation, with the incorporation of technological social networks.
Azadeh, A, Asadzadeh, SM, Saberi, M & Khoshmagham, S 2012, 'An integrated genetic algorithm-principal component analysis for improvement and estimation of gas consumption in Finland, Hungary, Ireland, Japan and Malaysia', International Journal of Operational Research, vol. 13, no. 2, pp. 147-147.
This paper presents a genetic algorithm (GA) - principal component analysis (PCA) approach for long-term natural gas (NG) consumption prediction and improvement. Six models are proposed to forecast annual gas demand. Around 27 GAs have been constructed and tested in order to find the best GA for gas consumption. The proposed models consist of input variables such as gross domestic product (GDP) and population (POP). All trained GAs are then compared with each other with respect to the mean absolute percentage error (MAPE). The GA model is capable of dealing with both complexity and uncertainty in the data set. To show the applicability and superiority of the GA, actual gas consumption in Finland, Hungary, Ireland, Japan and Malaysia from 1980 to 2007 is considered. With the aid of an autoregressive model, GDP and population are projected to 2015, and then, with the projected GDP and population as inputs to the best GA model, gas consumption is predicted to 2015. Finally, we use the multivariate method of PCA in behaviour analysis of gas consumption in the selected countries. This method normalises gas consumption by both population and GDP, and the PCA procedure is then run for efficiency assessment of the selected countries. PCA is used to examine the behaviour of gas consumption in the past and also to gain insights for the forthcoming years. © 2012 Inderscience Enterprises Ltd.
Azadeh, A, Neshat, N, Kazemi, A & Saberi, M 2012, 'Predictive control of drying process using an adaptive neuro-fuzzy and partial least squares approach', The International Journal of Advanced Manufacturing Technology, vol. 58, no. 5-8, pp. 585-596.
Azadeh, A, Saberi, M & Jiryaei, Z 2012, 'An intelligent decision support system for forecasting and optimization of complex personnel attributes in a large bank', Expert Systems with Applications, vol. 39, no. 16, pp. 12358-12370.
Azadeh, A, Saberi, M, Noorossana, R, Mehrabad, MS, Anvari, M & Izadbakhsh, H 2012, 'Estimating efficient value of controllable variable using an adaptive neural network algorithm: Case of a railway system', Journal of Scientific and Industrial Research, vol. 71, no. 1, pp. 45-50.
This study proposes a method using an adaptive neural network (ANN) to predict, estimate and evaluate performance variables without requiring any restrictive assumptions, taking the case of a railway system. By means of this method, it is also possible to compare actual performance data with estimated values and trace their assignable causes in future periods. The energy consumption norm of vehicles and real energy consumption data from the Iranian railway are considered.
Azadeh, A, Seraj, O, Asadzadeh, SM & Saberi, M 2012, 'An integrated fuzzy regression-data envelopment analysis algorithm for optimum oil consumption estimation with ambiguous data', Applied Soft Computing, vol. 12, no. 8, pp. 2614-2630.
Bakker, S, Antle, AN & van den Hoven, E 2012, 'Embodied metaphors in tangible interaction design', PERSONAL AND UBIQUITOUS COMPUTING, vol. 16, no. 4, pp. 433-449.
For centuries, learning and development has been supported by physical activity and manipulating physical objects. With the introduction of embedded technologies, opportunities for employing tangible or embodied interaction for learning and development have emerged. As a result of previous research, we have seen that interaction models based on embodied knowledge (through embodied metaphors) can support children's learning in abstract domains. Although metaphorical mappings are promoted in tangible and embodied interaction research, little is known about how to identify embodied metaphors, or how to implement them effectively into interaction models. In this paper, we introduce a people-centered, iterative approach to the design of tangible learning systems with embodied metaphor-based mappings. As a design case, we implemented our approach in the design of Moving Sounds (MoSo) Tangibles: a tangible system for learning abstract sound concepts. The system consists of a set of interactive tangibles with which children can manipulate the pitch, volume, and tempo of ongoing tones. In a user study with 39 participants, we found that all children were able to reproduce sound samples with MoSo Tangibles.
Bakker, S, van den Hoven, E & Eggen, B 2012, 'Acting by hand: Informing interaction design for the periphery of people's attention', INTERACTING WITH COMPUTERS, vol. 24, no. 3, pp. 119-130.
Interactions in and with the physical world have enabled us to perform everyday activities in the periphery of our attention. Even though digital technologies are becoming increasingly present in the everyday environment, interaction with these technologies usually requires people's focused attention. In the realm of the vision of calm technology, we think that designing interactions with the digital world inspired by our peripheral interaction with the physical world, will enable digital technologies to better blend into our everyday lives. However, for such interaction design to be effective, a detailed understanding of the everyday periphery is required. In this paper, we therefore present a qualitative study on everyday activities that may take place in the periphery of the attention. We provide a broad range of examples of such everyday activities and cluster them to present the conditions under which they may be performed peripherally. Furthermore, we discuss how our findings may be relevant for the design of peripheral interactions with digital technologies, and present two conceptual designs that are based on our findings.
Bakker, S, van den Hoven, E & Eggen, B 2012, 'Knowing by ear: leveraging human attention abilities in interaction design', JOURNAL ON MULTIMODAL USER INTERFACES, vol. 5, no. 3-4, pp. 197-209.
In a world in which intelligent technologies are integrated in everyday objects and environments, users are at risk of being overburdened with information and interaction possibilities. Calm technology therefore aims at designing interactions that may reside in the periphery of the user's attention and only shift to the center of the attention when required. However, for such designs to be effective, a detailed understanding of human attention abilities is needed. In this paper, we therefore present a qualitative study on the everyday periphery of the attention. As we expected, we found that sound plays a major role in this, which supports our approach to use interactive sonification as an interaction style for peripheral interaction. We present a range of rich examples of everyday situations that lay out the design space for peripheral interaction and support these findings by describing three initial designs that use interactive sonification for peripheral interaction.
Bogdanov, A & Qiao, Y 2012, 'On the security of Goldreich’s one-way function', computational complexity, vol. 21, no. 1, pp. 83-127.
Goldreich (ECCC 2000) suggested a simple construction of a candidate one-way function f: {0,1}^n → {0,1}^m where each bit of output is a fixed predicate P of a constant number d of (random) input bits. We investigate the security of this construction in the regime m = Dn, where D(d) is a sufficiently large constant. We prove that for any predicate P that correlates with either one or two of its inputs, f can be inverted with high probability. We also prove an amplification claim regarding Goldreich's construction. Suppose we are given an assignment x′ ∈ {0,1}^n that has correlation ε > 0 with the hidden assignment x ∈ {0,1}^n. Then, given access to x′, it is possible to invert f on x with high probability, provided D = D(d,ε) is sufficiently large. © 2012 Springer Basel AG.
Bressan, N, McGregor, C, Blount, M, Ebling, M, Sow, D & James, A 2012, '1618 Identification of Noxious Events for Newborn Infants with a Neural Network', Archives of Disease in Childhood, vol. 97, no. Suppl 2, pp. A458-A458.
Bródka, P, Kazienko, P, Musiał, K & Skibicki, K 2012, 'Analysis of Neighbourhoods in Multi-layered Dynamic Social Networks', International Journal of Computational Intelligence Systems, vol. 5, no. 3, pp. 582-596.
Social networks existing among employees, customers or users of various IT systems have become one of the research areas of growing importance. A social network consists of nodes - social entities - and edges linking pairs of nodes. In regular, one-layered social networks, two nodes - i.e. people - are connected with a single edge, whereas in multi-layered social networks there may be many links of different types for a pair of nodes. Nowadays, data about people and their interactions, which exists in all social media, provides information about many different types of relationships within one network. Analysing this data, one can obtain knowledge not only about the structure and characteristics of the network but also gain understanding of the semantics of human relations. Are they direct or not? Do people tend to sustain single or multiple relations with a given person? What type of communication is the most important for them? Answers to these and more questions enable us to draw conclusions about the semantics of human interactions. Unfortunately, most of the methods used for social network analysis (SNA) may be applied only to one-layered social networks. Thus, some new structural measures for multi-layered social networks are proposed in the paper, in particular: cross-layer clustering coefficient, cross-layer degree centrality and various versions of multi-layered degree centralities. The authors also investigated the dynamics of the multi-layered neighbourhood for five different layers within the social network. An evaluation of the presented concepts on a real-world dataset is presented. The measures proposed in the paper may be directly used in various methods for collective classification, in which nodes are assigned labels according to their structural input features.
Casanovas, M & Merigó, JM 2012, 'Fuzzy aggregation operators in decision making with Dempster–Shafer belief structure', Expert Systems with Applications, vol. 39, no. 8, pp. 7138-7149.
Çetindamar, D & Günsel, A 2012, 'Measuring the Creativity of a City: A Proposal and an Application', European Planning Studies, vol. 20, no. 8, pp. 1301-1318.
Cetindamar, D & Pretorius, MW 2012, 'Unveiling TM Practices in Developing Countries', International Journal of Innovation and Technology Management, vol. 9, no. 5, pp. 1-8.
Cetindamar, D, Gupta, VK, Karadeniz, EE & Egrican, N 2012, 'What the numbers tell: The impact of human, family and financial capital on women and men's entry into entrepreneurship in Turkey', Entrepreneurship & Regional Development, vol. 24, no. 1-2, pp. 29-51.
Entrepreneurship contributes to economic development in countries worldwide. Entrepreneurial activity is beneficial for both men and women, including those in developing countries. However, men and women may not engage in entrepreneurship to the same extent because of differential access to (various forms of) capital. This study examines the relative importance of three types of capital -- human, family and financial -- in pursuing entrepreneurship. Using data collected in Turkey, we find that regardless of sex, all three forms of capital influence the likelihood of becoming an entrepreneur in varying degrees. Contrary to expectations, the impact of human capital on the likelihood of becoming an entrepreneur is higher for women than men. Data also revealed that family capital facilitates women's entry into entrepreneurship only when family size is very large (i.e. seven or more). No gender differences are observed in the impact of financial capital on the likelihood of becoming an entrepreneur. Findings suggest that to encourage entrepreneurship in Turkey, policy-makers should emphasize access to human and financial capital. Furthermore, findings suggest that women's likelihood of becoming an entrepreneur will be especially encouraged if they have increased access to education, as well as the skills necessary to take advantage of their family capital.
Cetindamar, D, Wasti, NS & Beyhan, B 2012, 'Technology Management Tools and Techniques: Factors Affecting Their Usage and Their Impact on Performance', International Journal of Innovation and Technology Management, vol. 9, no. 5, pp. 1-17.
This study investigates which technology management (TM) tools are used in practice, what determines their usage, and whether they affect the user firms' performance. Based on a survey of 52 electronics and machinery firms in Turkey, the study shows there are significant relationships between the number of TM tools and techniques that a firm uses and (i) the hierarchical level of the chief technology officer (CTO) or most senior manager responsible for technology, (ii) his/her field of education, and (iii) the size of the firm. The findings indicate a significant and linear relationship between the extent to which the firms have reached their growth targets and the number of TM tools and techniques used. This relationship is, however, not observed between firm profitability and the number of TM tools and techniques. The findings have important implications for the practice of TM.
Chiu, T, Gramann, K, Ko, L, Duann, J, Jung, T & Lin, C 2012, 'Alpha modulation in parietal and retrosplenial cortex correlates with navigation performance', Psychophysiology, vol. 49, no. 1, pp. 43-55.
The present study investigated the brain dynamics accompanying spatial navigation based on distinct reference frames. Participants preferentially using an allocentric or an egocentric reference frame navigated through virtual tunnels and reported their homing direction at the end of each trial based on their spatial representation of the passage. Task‐related electroencephalographic (EEG) dynamics were analyzed based on independent component analysis (ICA) and subsequent clustering of independent components. Parietal alpha desynchronization during encoding of spatial information predicted homing performance for participants using an egocentric reference frame. In contrast, retrosplenial and occipital alpha desynchronization during retrieval covaried with homing performance of participants using an allocentric reference frame. These results support the assumption of distinct neural networks underlying the computation of distinct reference frames and reveal a direct relationship of alpha modulation in parietal and retrosplenial areas with encoding and retrieval of spatial information for homing behavior.
Chuang, S-W, Ko, L-W, Lin, Y-P, Huang, R-S, Jung, T-P & Lin, C-T 2012, 'Co-modulatory spectral changes in independent brain processes are correlated with task performance', NeuroImage, vol. 62, no. 3, pp. 1469-1477.
This study investigates the independent modulators that mediate the power spectra of electrophysiological processes, measured by electroencephalogram (EEG), in a sustained-attention experiment. EEG and behavioral data were collected during 1-2 hour virtual-reality based driving experiments in which subjects were instructed to maintain their cruising position and compensate for randomly induced drift using the steering wheel. Independent component analysis (ICA) applied to 30-channel EEG data separated the recorded EEG signals into a sum of maximally temporally independent components (ICs) for each of 30 subjects. Logarithmic spectra of resultant IC activities were then decomposed by principal component analysis, followed by ICA, to find spectrally fixed and temporally independent modulators (IM). Across subjects, the spectral ICA consistently found four performance-related independent modulators: delta, delta-theta, alpha, and beta modulators that multiplicatively affected the spectra of spatially distinct IC processes when the participants experienced waves of alternating alertness and drowsiness during long-hour simulated driving. The activation of the delta-theta modulator increased monotonically as subjects' task performances decreased. Furthermore, the time courses of the theta-beta modulator were highly correlated with concurrent changes in driving errors across subjects (r = 0.77 ± 0.13). © 2012 Elsevier Inc.
De La Poype, A-L & Sood, S 2012, 'Public Sphere Dialogue in Online Newspapers and Social Spaces: The Nuclear Debate in Post Fukushima France', Public Communication Review, vol. 2, no. 2, pp. 30-47.
The meltdown at the Japanese Fukushima Daiichi nuclear power plant (March 2011) provided the trigger for this research into the ways in which French newspapers facilitate (or not) a public dialogue on the issue of nuclear energy. Nuclear power not only generates over 75% of the electricity in France but also sustains a healthy domestic job creation program and drives nuclear technology exports. Hence, the long-term absence of public debate on nuclear energy amongst the French is not surprising. Against this backdrop of French nuclear interests, and post Fukushima, this paper presents a French-language computer-mediated discourse analysis of nuclear debates and discussions taking place online in the hybrid public sphere. This space straddles user-generated content in the public comment spaces of newspapers embracing the spectrum of political persuasions (Le Figaro, Le Monde and Liberation) and social media.
Qualitative and quantitative research methods uncover citizen interactions within the online public sphere comprising newspapers. Findings illuminate the progress of deliberations on nuclear power in online newspapers following a process of agenda setting through news stories, providing space for public dialogue and the digital curating of social media commentary. Furthermore, the research reveals the relevance of the Habermasian public sphere concept within the context of online newspapers. Key learning for the role of the media in fostering the democratic process using social media, and insights for the political communications landscape within the context of the nuclear debate, complement the research.
Doss, R, Zhou, W, Sundaresan, S, Yu, S & Gao, L 2012, 'A minimum disclosure approach to authentication and privacy in RFID systems', Computer Networks, vol. 56, no. 15, pp. 3401-3416.
Dovey, K & Mooney, G 2012, 'Leadership practices in the generation and deployment of intangible capital resources for innovation', International Journal of Learning and Intellectual Capital, vol. 9, no. 3, pp. 295-295.
This paper explores the practices underpinning an enterprise's ability to generate and deploy intangible capital in support of its strategic intent to innovate. Drawing on two research projects, we focus upon enterprises that are able to innovatively leverage the intangible capital resources that are potentially available to them. Using a phenomenological methodology, one project explores at a high level the social dynamics within 25 medium-sized enterprises noted for their innovative capabilities in Sydney, Australia. The other project explores in finer detail, through an action research methodology, the transformation of stakeholder relationships within another Sydney-based medium-sized enterprise that has become highly innovative over the past five years. Our findings show that the most important forms of intangible capital for innovation are relationship-based and are leveraged through stakeholder collaboration. Copyright © 2012 Inderscience Enterprises Ltd.
Ellis, J, Goodswen, S, Kennedy, PJ & Bush, S 2012, 'The Core Mouse Response to Infection by Neospora Caninum Defined by Gene Set Enrichment Analyses', Bioinformatics and Biology Insights, vol. 6, pp. BBI.S9954-BBI.S9954.
In this study, the BALB/c and Qs mouse responses to infection by the parasite Neospora caninum were investigated in order to identify host response mechanisms. The investigation used gene set (enrichment) analyses of microarray data. GSEA, MANOVA, Romer, subGSE and SAM-GS were used to study the contrasts of Neospora strain type, mouse type (BALB/c and Qs) and time post infection (6 hours post infection and 10 days post infection). The analyses show that the major signal in the core mouse response to infection is from time post infection and can be defined by the gene ontology terms Protein Kinase Activity, Cell Proliferation and Transcription Initiation. Several terms linked to signaling, morphogenesis, response and fat metabolism were also identified. At 10 days post infection, genes associated with fatty acid metabolism were identified as upregulated in expression. The value of gene set (enrichment) analyses in the analysis of microarray data is discussed.
Erfani, SZ, Akhgar, B, Taghvaie, SM & Estiri, F 2012, 'Presenting a Model for Increasing Productivity in Hydropower Plants by Identifying Organizational Complications', Applied Mechanics and Materials, vol. 197, pp. 734-739.
Present economic conditions require a specific point of view and policy making in business agencies. In this competitive world, organizations need to increase their competitive power, and thereby achieve competitive advantage and better governance, by increasing productivity. One fundamental approach to enhancing productivity is first identifying organizational complications, then finding and implementing solutions. To shed light on recognizing firms' complications and recoverable areas in business agencies, the authors drew on the concept of critical success factors and the effect of social capital on inter-firm relationships, and presented an empirical model based on the Deming continuous improvement model. In order to verify and validate the research, the model was applied in Iranian hydropower plants. Positive and acceptable results were obtained: hydropower complications were identified and removed, and the organizations' total factor productivity improved.
Faber, JP & van den Hoven, E 2012, 'MARBOWL: increasing the fun experience of shooting marbles', PERSONAL AND UBIQUITOUS COMPUTING, vol. 16, no. 4, pp. 391-404.
This paper focuses on the old-school game of shooting marbles. We investigate which aspects of this tangible game make it popular and show how the experienced fun can be increased by elaborating such aspects through an iterative design process. A questionnaire and field study, tailored to the user group of primary school children aged 9-12 years old, revealed that aspects within the areas of physical control, surface of the playground, opponent, and stakes of the game had the biggest influence on the fun experience of shooting marbles. A gameflow model and fun toolkit were used to improve the game in these respective areas. This resulted in a moving marble hole entitled Marbowl: a tangible marble game that augments existing game aspects such as timing, distance, surface, and other physical and environmental influences. A working prototype was field-tested with 24 children at a primary school. Results show that different gameflow areas, such as the concentration needed, playability, difficulty of winning the game, and amount of challenge, increased in a positive way. Together, these findings show that, compared with the original marble game, children experienced a higher level of fun while playing with Marbowl.
Gao, L, Li, M, Bonti, A, Zhou, W & Yu, S 2012, 'M-Dimension: Multi-characteristics based routing protocol in human associated delay-tolerant networks with improved performance over one dimensional classic models', Journal of Network and Computer Applications, vol. 35, no. 4, pp. 1285-1296.
View/Download from: Publisher's site
Garcia, JA, Navarro, KF, Schoene, D, Smith, ST & Pisan, Y 2012, 'Exergames for the elderly: Towards an embedded Kinect-based clinical test of falls risk.', HIC, vol. 178, no. 1, pp. 51-57.
View/Download from: Publisher's site
View description>>
Falls are the leading cause of disability, injuries or even death among older adults. Exercise programmes that include a balance component reduce the risk of falling by 40%. However, such interventions are often perceived as boring and drop-out rates are high. The characteristics of videogames may overcome this weakness and increase exercise adherence. The use of modern input devices, such as the Microsoft Kinect, enables quantification of player performance in terms of motor function while engaging with games. This capability has only just started to be explored. The work presented in this paper focuses on the development of a Kinect-based system to deliver step training while simultaneously measuring parameters of stepping performance that have been shown to predict falls in older people. © 2012 The authors and IOS Press. All rights reserved.
Golsteijn, C & van den Hoven, E 2012, 'Cueb', interactions, vol. Mar/Apr, pp. 9-9.
View description>>
Cueb is a set of interactive photo cubes that aims to encourage parents and teenagers to explore digital photos of their individual and shared experiences, reminisce, and exchange stories. Family members each have their own cube with photos of their individual experiences. Shaking a cube will randomly display photos on six sides. Connecting cubes by holding them together will display photos of the family members' shared experiences. Photos can be transferred between cubes and locked for use as a selection filter to find related photos. This generates surprising photo results and allows parents and teenagers to compare their experiences.
Goodswen, SJ, Kennedy, PJ & Ellis, JT 2012, 'Evaluating High-Throughput Ab Initio Gene Finders to Discover Proteins Encoded in Eukaryotic Pathogen Genomes Missed by Laboratory Techniques', PLOS ONE, vol. 7, no. 11.
View/Download from: Publisher's site
Guo, Y, Zhu, J, Lu, H, Lin, Z & Li, Y 2012, 'Core Loss Calculation for Soft Magnetic Composite Electrical Machines', IEEE Transactions on Magnetics, vol. 48, no. 11, pp. 3112-3115.
View/Download from: Publisher's site
View description>>
Soft magnetic composite (SMC) materials are especially suitable for developing electrical machines with complex structure and three-dimensional (3-D) magnetic flux path. In these SMC machines, the magnetic field is in general 3-D and rotational, so the mechanism and calculation of core loss may be quite different from that in traditional electrical machines with laminated steels in which the magnetic field is restrained. This paper investigates the calculation of core loss in a permanent magnet claw pole motor with SMC stator core. First, core loss models are developed based on the experimental data on SMC samples by using a 3-D magnetic property tester. Then, 3-D magnetic time-stepping field finite element analysis (FEA) is conducted to find the flux density locus in each element when the rotor rotates. The core loss is computed based on the magnetic field FEA results by using the developed core loss models. The calculations agree well with the experimental measurements on the SMC motor prototype.
Guo, Z, Zhao, W, Lu, H & Wang, J 2012, 'Multi-step forecasting for wind speed using a modified EMD-based artificial neural network model', Renewable Energy, vol. 37, no. 1, pp. 241-249.
View/Download from: Publisher's site
View description>>
In this paper, a modified EMD-FNN model (an empirical mode decomposition (EMD) based feed-forward neural network (FNN) ensemble learning paradigm) is proposed for wind speed forecasting. The nonlinear and non-stationary original wind speed series is first decomposed into a finite and often small number of intrinsic mode functions (IMFs) and one residual series using the EMD technique, providing deeper insight into the data structure. These sub-series, except the high-frequency one, are then forecast individually by FNNs whose input variables are selected using the partial autocorrelation function (PACF). Finally, the prediction results of the modeled IMFs and residual series are summed to formulate an ensemble forecast for the original wind speed series. Furthermore, the developed model shows the best accuracy compared with the basic FNN and the unmodified EMD-based FNN in multi-step forecasting of the mean monthly and daily wind speed in Zhangye, China.
Huang, L, Milne, D, Frank, E & Witten, IH 2012, 'Learning a concept‐based document similarity measure', Journal of the American Society for Information Science and Technology, vol. 63, no. 8, pp. 1593-1608.
View/Download from: Publisher's site
View description>>
Document similarity measures are crucial components of many text‐analysis tasks, including information retrieval, document classification, and document clustering. Conventional measures are brittle: They estimate the surface overlap between documents based on the words they mention and ignore deeper semantic connections. We propose a new measure that assesses similarity at both the lexical and semantic levels, and learns from human judgments how to combine them by using machine‐learning techniques. Experiments show that the new measure produces values for documents that are more consistent with people's judgments than people are with each other. We also use it to classify and cluster large document sets covering different genres and topics, and find that it improves both classification and clustering performance.
Hussain, OK, Dillon, T, Hussain, FK & Chang, E 2012, 'Probabilistic assessment of loss in revenue generation in demand-driven production', JOURNAL OF INTELLIGENT MANUFACTURING, vol. 23, no. 6, pp. 2069-2084.
View/Download from: Publisher's site
View description>>
In demand-driven production with just-in-time inputs, there are several sources of uncertainty which impact on the manufacturer's ability to meet the required customer demand within the given time frame. This can result in a loss of revenue and customers, which will have undesirable impacts on the financial aspects and on the viability of the manufacturer. Hence, a key concern for manufacturers in just-in-time production is to determine whether they can meet a specific level of demand within a given time frame, to meet the customers' orders and also to achieve the required revenue target for that period of time. In this paper, we propose a methodology by which a manufacturer can ascertain the probability of not meeting the required demand within a given period by considering the uncertainties in the availability of production units and raw materials, and the loss of financial revenue that it would experience as a result.
Hussain, W, Sohaib, O, Ahmed, A & Khan, MQ 2012, 'GEOGRAPHICAL INFORMATION SYSTEM BASED MODEL OF LAND SUITABILITY FOR GOOD YIELD OF RICE IN PRACHUAP KHIRI KHAN PROVINCE, THAILAND', Science, Technology and Development, vol. 31, no. 1, pp. 1-9.
Janjua, NK & Hussain, FK 2012, 'Web@IDSS - Argumentation-enabled Web-based IDSS for reasoning over incomplete and conflicting information', KNOWLEDGE-BASED SYSTEMS, vol. 32, no. 1, pp. 9-27.
View/Download from: Publisher's site
View description>>
Over the past few decades, there has been a resurgence of interest in using high-level software intelligence for business intelligence (BI). The objective is to produce actionable information that is delivered at the right time, easily comprehensible and exportable to other software to assist business decision-making processes. Although the design and development of decision support systems (DSS) has been carried out for over 40 years, DSS still suffer from many limitations such as poor maintainability, poor flexibility and low reusability. The development of the Internet and WWW has helped information systems to overcome those limitations, and Web DSS is now an active area of research in business intelligence, impacting significantly on the way information is exchanged and businesses are conducted. To remain competitive, companies rely on BI to continually monitor and analyze the operating environment (both internal and external), to identify potential risks, and to devise competitive business strategies. However, current Web DSS applications are not able to reason over information present across organizational boundaries, which could be incomplete and conflicting. The use of an argumentation-based mechanism has not been explored to address such shortcomings in Web DSS. Argumentation is a kind of commonsense reasoning used by human beings to reach a justifiable conclusion when available information is incomplete and/or inconsistent among participants. In this paper, we propose and elaborate in detail a conceptual framework and formal argumentation-based semantics for Web-enabled Intelligent DSS (Web@IDSS). We evaluate the use of argumentative reasoning in Web DSS with the help of a case study, prototype development and future directions. Applications built according to the proposed framework will provide more practical, understandable results to decision makers.
Jiang, JJ, Zhanga, HB & Yu, S 2012, 'An interior point trust region method for nonnegative matrix factorization', Neurocomputing, vol. 97, pp. 309-316.
View/Download from: Publisher's site
Kamaleswaran, R & McGregor, C 2012, 'Integrating complex business processes for knowledge-driven clinical decision support systems', 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 1306-1309.
View/Download from: Publisher's site
Karimi, F, Poo, DCC & Tan, YM 2012, 'Clinicians’ Satisfaction with Clinical Information Systems: A Disconfirmation Paradigm Perspective', Academy of Management Proceedings, vol. 2012, no. 1, pp. 18240-18240.
View/Download from: Publisher's site
Krueger, T, Page, T, Smith, L & Voinov, A 2012, 'A guide to expert opinion in environmental modelling and management', Environmental Modelling & Software, vol. 36, pp. 1-3.
View/Download from: Publisher's site
Lai, H-Y, Liao, L-D, Lin, C-T, Hsu, J-H, He, X, Chen, Y-Y, Chang, J-Y, Chen, H-F, Tsang, S & Shih, Y-YI 2012, 'Design, simulation and experimental validation of a novel flexible neural probe for deep brain stimulation and multichannel recording', Journal of Neural Engineering, vol. 9, no. 3, pp. 036001-036001.
View/Download from: Publisher's site
View description>>
An implantable micromachined neural probe with multichannel electrode arrays for both neural signal recording and electrical stimulation was designed, simulated and experimentally validated for deep brain stimulation (DBS) applications. The developed probe has a rough three-dimensional microstructure on the electrode surface to maximize the electrode-tissue contact area. The flexible, polyimide-based microelectrode arrays were each composed of a long shaft (14.9 mm in length) and 16 electrodes (5 μm thick and with a diameter of 16 μm). The ability of these arrays to record and stimulate specific areas in a rat brain was evaluated. Moreover, we have developed a finite element model (FEM) applied to an electric field to evaluate the volume of tissue activated (VTA) by DBS as a function of the stimulation parameters. The signal-to-noise ratio ranged from 4.4 to 5 over a 50 day recording period, indicating that the laboratory-designed neural probe is reliable and may be used successfully for long-term recordings. The somatosensory evoked potential (SSEP) obtained by thalamic stimulations and the in vivo electrode-electrolyte interface impedance measurements were stable for 50 days, demonstrating that the neural probe is feasible for long-term stimulation. A strongly linear (positive correlation) relationship was observed among the simulated VTA, the absolute value of the SSEP during the 200 ms post-stimulus period (ΣSSEP) and c-Fos expression, indicating that the simulated VTA has perfect sensitivity to predict the evoked responses (c-Fos expression). This laboratory-designed neural probe and its FEM simulation represent a simple, functionally effective technique for studying DBS and neural recordings in animal models. © 2012 IOP Publishing Ltd.
Lambert, M & Kennedy, P 2012, 'Using Artificial Intelligence to Build with Unprocessed Rock', Key Engineering Materials, vol. 517, pp. 939-945.
View/Download from: Publisher's site
View description>>
Unprocessed rock is a massive resource of very cheap building material with very low embodied energy. However, it is highly underutilised due to the difficulty of dealing with irregular shaped blocks. We have developed a novel software application using the artificial intelligence methods of search and optimisation to simulate building three-dimensional structures in a virtual world. The aim of our software is to help builders solve the 3-dimensional jigsaw puzzle of building with rock rubble with an emphasis on its potential use for building sustainable housing and infrastructure. This paper describes our approach and the design of our software including an overview of the rock digitising, optimisation software and building methods. We present simulation results of building and testing several small drystone structures using the prototype software.
Lee, CY, Lin, CT, Hong, CT & Su, MT 2012, 'Smoke detection using spatial and temporal analyses', International Journal of Innovative Computing, Information and Control, vol. 8, no. 7 A, pp. 4749-4770.
View description>>
Video-based fire detection is currently a fairly common application with the growth in the number of installed surveillance video systems. Moreover, the related processing units are becoming more powerful. Smoke is an early sign of most fires; therefore, selecting an appropriate smoke-detection method is essential. However, detecting smoke without creating a false alarm remains a challenging problem for open or large spaces with the disturbances of common moving objects, such as pedestrians and vehicles. This study proposes a novel video-based smoke-detection method that can be incorporated into a surveillance system to provide early alerts. In this study, the process of extracting smoke features from candidate regions was accomplished by analyzing the spatial and temporal characteristics of video sequences for three important features: edge blurring, gradual energy changes, and gradual chromatic configuration changes. The proposed spatial-temporal analysis technique improves the feature extraction of gradual energy changes. In order to make the video smoke-detection results more reliable, these three features were combined using a support vector machine (SVM) technique and a temporal-based alarm decision unit (ADU) was also introduced. The effectiveness of the proposed algorithm was evaluated on a PC with an Intel® Core™ 2 Duo CPU (2.2 GHz) and 2 GB RAM. The average processing time was 32.27 ms per frame; i.e., the proposed algorithm can process 30.98 frames per second. Experimental results showed that the proposed system can detect smoke effectively with a low false-alarm rate and a short reaction time in many real-world scenarios. © ICIC International 2012.
Li, C-H, Kuo, B-C, Lin, C-T & Huang, C-S 2012, 'A Spatial–Contextual Support Vector Machine for Remotely Sensed Image Classification', IEEE Transactions on Geoscience and Remote Sensing, vol. 50, no. 3, pp. 784-799.
View/Download from: Publisher's site
View description>>
Recent studies show that hyperspectral image classification techniques that use both spectral and spatial information are more suitable, effective, and robust than those that use only spectral information. Using a spatial-contextual term, this study modifies the decision function and constraints of a support vector machine (SVM) and proposes two kinds of spatial-contextual SVMs for hyperspectral image classification. One machine, which is based on the concept of Markov random fields (MRFs), uses the spatial information in the original space (SCSVM). The other machine uses the spatial information in the feature space (SCSVMF), i.e., the nearest neighbors in the feature space. The SCSVM is better able to classify pixels of different class labels with similar spectral values and deal with data that have no clear numerical interpretation. To evaluate the effectiveness of SCSVM, the experiments in this study compare its performance with those of other classifiers: an SVM, a context-sensitive semisupervised SVM, a maximum likelihood (ML) classifier, a Bayesian contextual classifier based on MRFs (ML-MRF), and a k-nearest-neighbor classifier. Experimental results show that the proposed method achieves good classification performance on well-known hyperspectral images (the Indian Pine site (IPS) and the Washington, DC mall data sets). The overall classification accuracy of the hyperspectral image of the IPS data set with 16 classes is 95.5%. The kappa accuracy is up to 94.9%, and the average accuracy of each class is up to 94.2%. © 2011 IEEE.
Li, D-L, Prasad, M, Hsu, S-C, Hong, C-T & Lin, C-T 2012, 'Face recognition using nonparametric-weighted Fisherfaces', EURASIP Journal on Advances in Signal Processing, vol. 2012, no. 1.
View/Download from: Publisher's site
View description>>
This study presents an appearance-based face recognition scheme called the nonparametric-weighted Fisherfaces (NW-Fisherfaces). Pixels in a facial image are considered as coordinates in a high-dimensional space and are transformed into a face subspace for analysis by using nonparametric-weighted feature extraction (NWFE). According to previous studies of hyperspectral image classification, NWFE is a powerful tool for extracting hyperspectral image features. The Fisherfaces method maximizes the ratio of between-class scatter to that of within-class scatter. In this study, the proposed NW-Fisherfaces weights the between-class scatter to emphasize the boundary structure of the transformed face subspace and, therefore, enhances the separability of different persons' faces. The proposed NW-Fisherfaces was compared with Orthogonal Laplacianfaces, Eigenfaces, Fisherfaces, direct linear discriminant analysis, and null space linear discriminant analysis methods in tests on five facial databases. Experimental results showed that the proposed approach outperforms other feature extraction methods for most databases.
Li, J & Tao, D 2012, 'On Preserving Original Variables in Bayesian PCA With Application to Image Analysis.', IEEE Trans. Image Process., vol. 21, no. 12, pp. 4830-4843.
View/Download from: Publisher's site
View description>>
Principal component analysis (PCA) computes a succinct data representation by converting the data to a few new variables while retaining maximum variation. However, the new variables are difficult to interpret, because each one is combined with all of the original input variables and has obscure semantics. Under the umbrella of Bayesian data analysis, this paper presents a new prior to explicitly regularize combinations of input variables. In particular, the prior penalizes pair-wise products of the coefficients of PCA and encourages a sparse model. Compared to the commonly used ℓ1-regularizer, the proposed prior encourages the sparsity pattern in the resultant coefficients to be consistent with the intrinsic groups in the original input variables. Moreover, the proposed prior can be explained as recovering a robust estimation of the covariance matrix for PCA. The proposed model is suited for analyzing visual data, where it encourages the output variables to correspond to meaningful parts in the data. We demonstrate the characteristics and effectiveness of the proposed technique through experiments on both synthetic and real data. © 1992-2012 IEEE.
Li, J, Tao, D & Li, X 2012, 'A probabilistic model for image representation via multiple patterns.', Pattern Recognit., vol. 45, no. 11, pp. 4044-4053.
View/Download from: Publisher's site
View description>>
For image analysis, an important extension to principal component analysis (PCA) is to treat an image as multiple samples, which helps alleviate the small sample size problem. Various schemes of transforming an image to multiple samples have been proposed. Although having been shown effective in practice, the schemes are mainly based on heuristics and experience. In this paper, we propose a probabilistic PCA model, in which we explicitly represent the transformation scheme and incorporate the scheme as a stochastic component of the model. Therefore fitting the model automatically learns the transformation. Moreover, the learned model allows us to distinguish regions that can be well described by the PCA model from those that need further treatment. Experiments on synthetic images and face data sets demonstrate the properties and utility of the proposed model. © 2012 Elsevier Ltd. All rights reserved.
Li, L, Zhong, L, Xu, G & Kitsuregawa, M 2012, 'A feature-free search query classification approach using semantic distance', Expert Systems with Applications, vol. 39, no. 12, pp. 10739-10748.
View/Download from: Publisher's site
View description>>
When classifying search queries into a set of target categories, machine learning based conventional approaches usually make use of external sources of information to obtain additional features for search queries and training data for target categories. Unfortunately, these approaches rely on a large amount of training data for high classification precision. Moreover, they are known to suffer from an inability to adapt to different target categories, which may be caused by the dynamic changes observed in both Web topic taxonomy and Web content. In this paper, we propose a feature-free classification approach using semantic distance. We analyze queries and categories themselves and utilize the number of Web pages containing both a query and a category as a semantic distance to determine their similarity. The most attractive feature of our approach is that it only utilizes the Web page counts estimated by a search engine to provide the search query classification with respectable accuracy. In addition, it can easily adapt to changes in the target categories, whereas machine learning based approaches require an extensive updating process, e.g., re-labeling outdated training data and re-training classifiers, which is time-consuming and high-cost. We conduct an experimental study on the effectiveness of our approach using a set of rank measures and show that our approach performs competitively with some popular state-of-the-art solutions which, however, frequently use external sources and are inherently insufficient in flexibility. © 2012 Elsevier Ltd. All rights reserved.
Liao, L-D, Chang, Y-J, Lai, H-Y, Lin, C-T, Lin, Z-M, Tsang, S & Chen, Y-Y 2012, 'A Novel Light-Addressable Multi-Electrode Array Chip for Neural Signal Recording Based on VCSEL Diode Arrays', Journal of Neuroscience and Neuroengineering, vol. 1, no. 1, pp. 4-12.
View/Download from: Publisher's site
Lin, C-L, Shaw, F-Z, Young, K-Y, Lin, C-T & Jung, T-P 2012, 'EEG correlates of haptic feedback in a visuomotor tracking task', NeuroImage, vol. 60, no. 4, pp. 2258-2273.
View/Download from: Publisher's site
View description>>
This study investigates the temporal brain dynamics associated with haptic feedback in a visuomotor tracking task. Haptic feedback with deviation-related forces was used throughout tracking experiments in which subjects' behavioral responses and electroencephalogram (EEG) data were simultaneously measured. Independent component analysis was employed to decompose the acquired EEG signals into temporally independent time courses arising from distinct brain sources. Clustering analysis was used to extract independent components that were comparable across participants. The resultant independent brain processes were further analyzed via time-frequency analysis (event-related spectral perturbation) and event-related coherence (ERCOH) to contrast brain activity during tracking experiments with or without haptic feedback. Across subjects, in epochs with haptic feedback, components with equivalent dipoles in or near the right motor region exhibited greater alpha band power suppression. Components with equivalent dipoles in or near the left frontal, central, left motor, right motor, and parietal regions exhibited greater beta-band power suppression, while components with equivalent dipoles in or near the left frontal, left motor, and right motor regions showed greater gamma-band power suppression relative to non-haptic conditions. In contrast, the right occipital component cluster exhibited less beta-band power suppression in epochs with haptic feedback compared to non-haptic conditions. The results of ERCOH analysis of the six component clusters showed that there were significant increases in coherence between different brain networks in response to haptic feedback relative to the coherence observed when haptic feedback was not present. The results of this study provide novel insight into the effects of haptic feedback on the brain and may aid the development of new tools to facilitate the learning of motor skills. © 2012 Elsevier Inc.
Lin, C-T, Chuang, C-H, Wang, Y-K, Tsai, S-F, Chiu, T-C & Ko, L-W 2012, 'Neurocognitive Characteristics of the Driver: A Review on Drowsiness, Distraction, Navigation, and Motion Sickness', Journal of Neuroscience and Neuroengineering, vol. 1, no. 1, pp. 61-81.
View/Download from: Publisher's site
View description>>
Within the past few decades, neuroscientists have designed various experimental paradigms and driving environments. Using well-established neurotechnologies, such as functional magnetic resonance imaging (fMRI), positron emission tomography (PET), and electroencephalography (EEG), they have gained insight into the brain activity involved in the processing of driving cognition and behaviors. Moreover, neuroengineers have developed computational intelligent technologies to model these brain-behavioral relationships for real-life applications. With the advance of neurotechnology and the understanding of driving cognition, it is thought that an in-vehicle brain-computer interface will be implemented in the near future. In this review, we discuss four major issues prominent in driving cognitive research, including drowsiness, distraction, navigation, and motion sickness. We provide four summary tables that list nearly 60 references from the fields of neuroscience and neuroengineering to briefly present experimental materials, brain imaging modalities, and major findings of the brain in response to specific driving cognitive states. In addition, driving experiments conducted in a virtual-realistic driving environment and studies examining the power spectral characteristics of brain dynamics using independent component analysis, which eliminates artifacts and extracts the independent component processes, are also described.
Lin, C-T, Huang, T-Y, Lin, W-J, Chang, S-Y, Lin, Y-H, Ko, L-W, Hung, DL & Chang, EC 2012, 'Gender differences in wayfinding in virtual environments with global or local landmarks', Journal of Environmental Psychology, vol. 32, no. 2, pp. 89-96.
View/Download from: Publisher's site
View description>>
This study assesses gender differences in wayfinding in environments with global or local landmarks by analyzing both overall and fine-grained measures of performance. Both female and male participants were required to locate targets in grid-like virtual environments with local or global landmarks. Interestingly, the results of the two overall measures did not converge: although females spent more time than males in locating targets, both genders were generally equivalent in terms of corrected travel path. Fine-grained measures account for different aspects of wayfinding behavior and provide additional information that explains the divergence in overall measures; females spent less time traveling away from the target location, a higher proportion of time not traversing, and made more rotations when stopping than males did. Rather than unequivocally supporting male superiority in wayfinding tasks, both the overall and fine-grained measures partially indicate that males and females are differentially superior when using global and local landmark information, respectively. To summarize, males moved faster than females but did not necessarily navigate the spatial surroundings more efficiently. Each gender showed different strengths related to wayfinding; these differences require the application of both overall and fine-grained measures for accurate assessment. © 2012 Elsevier Ltd.
Lister, R 2012, 'A variation on Kvale's one thousand page question', ACM Inroads, vol. 3, no. 3, pp. 24-25.
View/Download from: Publisher's site
View description>>
This is a regular invited column I write for this journal.
Lister, R 2012, 'Rare research', ACM Inroads, vol. 3, no. 4, pp. 16-17.
View/Download from: Publisher's site
View description>>
This is a regular invited column I write for this journal.
Lister, R 2012, 'The CC2013 Strawman and Bloom's taxonomy', ACM Inroads, vol. 3, no. 2, pp. 12-13.
View/Download from: Publisher's site
View description>>
This is a regular column that I write for this journal.
Liu, X, Pan, Y, Xu, Y & Yu, S 2012, 'Least square completion and inconsistency repair methods for additively consistent fuzzy preference relations', Fuzzy Sets and Systems, vol. 198, pp. 1-19.
View/Download from: Publisher's site
Lu, J, Zhang, G, Montero, J & Garmendia, L 2012, 'Multifollower Trilevel Decision Making Models and System', IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, vol. 8, no. 4, pp. 974-985.
View/Download from: Publisher's site
View description>>
In a trilevel hierarchical decision problem, the objectives and variables of each decision entity at one level are controlled, in part, by the decision entities at other levels. The choice of values for the decision variables at each level may influence
Liao, L-D, Lin, C-T, McDowell, K, Wickenden, AE, Gramann, K, Jung, T-P, Ko, L-W & Chang, J-Y 2012, 'Biosensor Technologies for Augmented Brain–Computer Interfaces in the Next Decades', Proceedings of the IEEE, vol. 100, no. Special Centennial Issue, pp. 1553-1566.
View/Download from: Publisher's site
View description>>
The study of brain-computer interfaces (BCIs) has undergone 30 years of intense development and has grown into a rich and diverse field. BCIs are technologies that enable direct communication between the human brain and external devices. Conventionally, wet electrodes have been employed to obtain unprecedented sensitivity to high-temporal-resolution brain activity; recently, the growing availability of various sensors that can be used to detect high-quality brain signals in a wide range of clinical and everyday environments is being exploited. This development of biosensing neurotechnologies and the desire to implement them in real-world applications have led to the opportunity to develop augmented BCIs (ABCIs) in the upcoming decades. An ABCI is similar to a BCI in that it relies on biosensors that record signals from the brain in everyday environments; the signals are then processed in real time to monitor the behavior of the human. To use an ABCI as a mobile brain imaging technique for everyday, real-life applications, the sensors and the corresponding device must be lightweight and the equipment response time must be short. This study presents an overview of the wide range of biosensor approaches currently being applied to ABCIs, from their use in the laboratory to their application in clinical and everyday use. The basic principles of each technique are described along with examples of current applications of cutting-edge neuroscience research. In summary, we show that ABCI techniques continue to grow and evolve, incorporating new technologies and advances to address ever more complex and important neuroscience issues, with advancements that are envisioned to lead to a wide range of real-life applications. © 2012 IEEE.
Ma, J, Zhang, G & Lu, J 2012, 'A Method for Multiple Periodic Factor Prediction Problems Using Complex Fuzzy Sets', IEEE TRANSACTIONS ON FUZZY SYSTEMS, vol. 20, no. 1, pp. 32-45.
View/Download from: Publisher's site
View description>>
Multiple periodic factor prediction (MPFP) problems exist widely in multisensor data fusion applications. Development of an effective prediction method should integrate information for multiple periodically changing factors. Because the uncertainty and periodicity coexist in the information used, the prediction method should be able to handle them simultaneously. In this study, complex fuzzy sets are used to represent the information with uncertainty and periodicity. A product-sum aggregation operator (PSAO) is developed for a set of complex fuzzy sets, which is used to integrate information with uncertainty and periodicity, and a PSAO-based prediction (PSAOP) method is then proposed to generate a solution of MPFP problems. This study illustrates the details of the PSAOP method through two real applications in annual sunspot number prediction and bushfire danger rating prediction. Experiments indicate that the proposed PSAOP method effectively handles the uncertainty and periodicity in the information of multiple periodic factors simultaneously and can generate accurate predictions for MPFP problems. © 2012 IEEE.
Manzoor, M & Hussain, W 2012, 'A Web Usability Evaluation Model for Higher Education Providing Universities of Asia', Science, Technology and Development, vol. 31, no. 2, pp. 183-192.
Mathieson, L & Szeider, S 2012, 'Editing graphs to satisfy degree constraints: A parameterized approach', Journal of Computer and System Sciences, vol. 78, no. 1, pp. 179-191.
View/Download from: Publisher's site
View description>>
We study a wide class of graph editing problems that ask whether a given graph can be modified to satisfy certain degree constraints, using a limited number of vertex deletions, edge deletions, or edge additions. The problems generalize several well-studied problems such as the General Factor Problem and the Regular Subgraph Problem. We classify the parameterized complexity of the considered problems taking upper bounds on the number of editing steps and the maximum degree of the resulting graph as parameters. © 2011 Elsevier Inc. All rights reserved.
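The bounded-search-tree idea behind such parameterized algorithms can be sketched for one concrete variant (edge deletions only, with a maximum-degree constraint); the function name and graph representation below are illustrative, not taken from the paper:

```python
def can_bound_degree(edges, d, k):
    """Can at most k edge deletions make every vertex degree <= d?
    A bounded-search-tree sketch: branch on the edges incident to an
    over-degree vertex, with recursion depth capped by the budget k."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    bad = [x for x, dx in deg.items() if dx > d]
    if not bad:
        return True   # all degree constraints already satisfied
    if k == 0:
        return False  # constraints violated but budget exhausted
    x = bad[0]
    for e in [e for e in edges if x in e]:
        rest = [f for f in edges if f != e]
        if can_bound_degree(rest, d, k - 1):
            return True
    return False
```

Because each branch reduces the budget, the search tree has depth at most k, which is the essence of taking the number of editing steps as a parameter.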
Merigó, J & Gil-Lafuente, A 2012, 'A method for decision making with the OWA operator', Computer Science and Information Systems, vol. 9, no. 1, pp. 357-380.
View/Download from: Publisher's site
View description>>
A new method for decision making that uses the ordered weighted averaging (OWA) operator in the aggregation of the information is presented. It uses a concept known in the literature as the index of maximum and minimum level (IMAM). This index is based on distance measures and other techniques that are useful for decision making. By using the OWA operator in the IMAM, we form a new aggregation operator that we call the ordered weighted averaging index of maximum and minimum level (OWAIMAM) operator. The main advantage is that it provides a parameterized family of aggregation operators between the minimum and the maximum and a wide range of special cases. The decision maker may then make decisions according to his or her degree of optimism, considering ideals in the decision process. A further extension of this approach is presented by using hybrid averages and Choquet integrals. We also develop an application of the new approach in a multi-person decision-making problem regarding the selection of strategies.
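For reference, the basic OWA aggregation underlying the OWAIMAM construction is simple to state; a minimal sketch (variable names are mine):

```python
def owa(values, weights):
    """Ordered weighted average: weights attach to rank positions
    (largest first), not to particular arguments."""
    ordered = sorted(values, reverse=True)  # b_1 >= ... >= b_n
    return sum(w * b for w, b in zip(weights, ordered))

# The weighting vector moves the result between the extremes:
# (1, 0, ..., 0) yields the maximum, (0, ..., 0, 1) the minimum,
# and (1/n, ..., 1/n) the arithmetic mean.
```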
Merigó, JM 2012, 'OWA operators in the weighted average and their application in decision making', Control and Cybernetics, vol. 41, no. 3, pp. 605-643.
View description>>
We introduce a new aggregation operator that unifies the weighted average (WA) and the ordered weighted averaging (OWA) operator in a single formulation. We call it the ordered weighted averaging - weighted average (OWAWA) operator. This aggregation operator provides a more complete representation of the weighted average and the OWA operator because it considers the degree of importance that each concept has in the aggregation and includes them as particular cases of a more general context. We study different properties and families of the OWAWA operator. The applicability of this method is very broad because any study that uses the weighted average or the OWA can be revised and extended with our approach. We focus on a multi-person decision-making application in the selection of financial strategies.
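One common way to read the unification described above is as a convex combination of the OWA operator and the weighted average; the sketch below follows that reading (parameter names are assumptions, not the paper's notation):

```python
def owawa(values, owa_weights, wa_weights, beta):
    """OWAWA read as a convex combination: beta = 1 recovers the pure
    OWA operator, beta = 0 the plain weighted average (WA)."""
    ordered = sorted(values, reverse=True)
    owa_part = sum(w * b for w, b in zip(owa_weights, ordered))
    wa_part = sum(v * a for v, a in zip(wa_weights, values))
    return beta * owa_part + (1 - beta) * wa_part
```

Intermediate values of beta mix the importance of the arguments themselves (WA) with the importance of their rank positions (OWA), which is exactly how both operators become particular cases of the general formulation.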
Merigó, JM 2012, 'Probabilities in the OWA operator', Expert Systems with Applications, vol. 39, no. 13, pp. 11456-11467.
View/Download from: Publisher's site
Merigó, JM 2012, 'The probabilistic weighted average and its application in multiperson decision making', International Journal of Intelligent Systems, vol. 27, no. 5, pp. 457-476.
View/Download from: Publisher's site
Merigó, JM & Casanovas, M 2012, 'Decision-making with uncertain aggregation operators using the Dempster-Shafer belief structure', International Journal of Innovative Computing, Information and Control, vol. 8, no. 2, pp. 1037-1061.
View description>>
We develop a new decision-making model using the Dempster-Shafer (D-S) belief structure when available information is uncertain and can be assessed with interval numbers. We use a wide range of aggregation operators involving interval numbers such as the uncertain weighted average (UWA), the uncertain ordered weighted average (UOWA), the uncertain generalized weighted average (UGWA) and the uncertain generalized ordered weighted average (UGOWA). We present a new approach to using interval weights in these uncertain aggregation operators. By using these aggregation operators within a D-S framework, we obtain various belief structures (BS), including the UWA (BS-UWA), the BS-UOWA, the BS-UGWA and the BS-UGOWA. We also use more complete formulations by using induced, hybrid and quasi-arithmetic aggregation operators. We end the paper by applying these operators to a decision-making problem regarding strategic management. © 2012 ICIC International.
Merigó, JM & Gil-Lafuente, AM 2012, 'Decision-making techniques with similarity measures and OWA operators', SORT, vol. 36, no. 1, pp. 81-102.
View description>>
We analyse the use of the ordered weighted average (OWA) in decision-making giving special attention to business and economic decision-making problems. We present several aggregation techniques that are very useful for decision-making such as the Hamming distance, the adequacy coefficient and the index of maximum and minimum level. We suggest a new approach by using immediate weights, that is, by using the weighted average and the OWA operator in the same formulation. We further generalize them by using generalized and quasi-arithmetic means. We also analyse the applicability of the OWA operator in business and economics and we see that we can use it instead of the weighted average. We end the paper with an application in a business multi-person decision-making problem regarding production management.
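The weighted Hamming distance mentioned above can be sketched directly as a ranking criterion; all numbers and names here are illustrative:

```python
def weighted_hamming(x, ideal, weights):
    """Normalized weighted Hamming distance between an alternative and
    an ideal profile; a smaller value means a closer match."""
    return sum(w * abs(a - b) for w, a, b in zip(weights, x, ideal))

# Rank alternatives against an ideal profile (illustrative data):
ideal = [1.0, 0.9, 0.8]
weights = [0.5, 0.3, 0.2]
alternatives = {"A": [0.9, 0.7, 0.8], "B": [0.6, 0.9, 0.9]}
best = min(alternatives,
           key=lambda k: weighted_hamming(alternatives[k], ideal, weights))
```

Replacing the fixed weights with position-based OWA weights over the sorted deviations gives the "immediate weights" variants the abstract refers to.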
Merigó, JM, Carral, CL & Castillo, AC 2012, 'Decision making in the European Union under risk and uncertainty', European J. of International Management, vol. 6, no. 5, pp. 590-590.
View/Download from: Publisher's site
Merigó, JM, Casanovas, M & Engemann, KJ 2012, 'Group decision-making with generalized and probabilistic aggregation operators', International Journal of Innovative Computing, Information and Control, vol. 8, no. 7 A, pp. 4823-4835.
View description>>
The aim of this paper is to introduce a unified model between the generalized ordered weighted averaging (GOWA) operator and the generalized probabilistic aggregation. We present the generalized probabilistic OWA (GPOWA) operator. It is a new aggregation operator that unifies the probability with the OWA operator considering the degree of importance that each concept has in the analysis. It includes a wide range of particular cases including the GOWA operator and the probabilistic OWA (POWA) operator. We also study the applicability of this new approach and we see that it is very broad because all the previous studies that use the probability or the OWA operator can be revised with this new approach. We develop an application in multi-person decision making concerning the selection of the optimal strategies. © ICIC International 2012.
Merigó, JM, Gil-Lafuente, AM & Martorell, O 2012, 'Uncertain induced aggregation operators and its application in tourism management', Expert Systems with Applications, vol. 39, no. 1, pp. 869-880.
View/Download from: Publisher's site
Merigó, JM, Gil-Lafuente, AM, Zhou, L-G & Chen, H-Y 2012, 'Induced and Linguistic Generalized Aggregation Operators and Their Application in Linguistic Group Decision Making', Group Decision and Negotiation, vol. 21, no. 4, pp. 531-549.
View/Download from: Publisher's site
Merigó-Lindahl, JM 2012, 'Bibliometric Analysis of Business and Economics in the Web of Science', Studies in Fuzziness and Soft Computing, vol. 287, pp. 3-17.
View/Download from: Publisher's site
View description>>
We present a general overview of the most influential results found in the Web of Science in the subject area of Business & Economics that includes the categories of Business, Economics, Business Finance and Management. We analyse the most cited papers in the history and rank the most influential institutions by number of papers published. We analyse the most relevant journals, the temporal evolution and the countries with the highest number of publications. We also develop a similar analysis to the Spanish case studying the most cited papers, the most influential institutions and the temporal evolution. Note that this study is only based on the results found on the Web of Science with the objective of giving a general overview of the research done in Business & Economics especially over the last half century. However, many exceptions and particularities may be found throughout the results. © 2012 Springer-Verlag Berlin Heidelberg.
Mirtalaei, MS, Saberi, M, Hussain, OK, Ashjari, B & Hussain, FK 2012, 'A trust-based bio-inspired approach for credit lending decisions', COMPUTING, vol. 94, no. 7, pp. 541-577.
View/Download from: Publisher's site
View description>>
Credit scoring computation essentially involves taking into account various financial factors and the previous behavior of the credit requesting person. There is a strong degree of correlation between the compliance level and the credit score of a given entity. The concept of trust has been widely used and applied in the existing literature to determine the compliance level of an entity. However it has not been studied in the context of credit scoring literature. In order to address this shortcoming, in this paper we propose a six-step bio-inspired methodology for trust-based credit lending decisions by credit institutions. The proposed methodology makes use of an artificial neural network-based model to classify the (potential) customers into various categories. To show the applicability and superiority of the proposed algorithm, it is applied to a credit-card dataset obtained from the UCI repository. Due to the varying spectrum of trust levels, we are able to solve the problem of binary credit lending decisions. A trust-based credit scoring approach allows the financial institutions to grant credit based on the level of trust in potential customers. © Springer-Verlag 2012.
Moroder, T, Curty, M, Lim, CCW, Thinh, LP, Zbinden, H & Gisin, N 2012, 'Security of distributed-phase-reference quantum key distribution', Phys. Rev. Lett., vol. 109, p. 260501.
View description>>
Distributed-phase-reference quantum key distribution stands out for its easy implementation with present-day technology. For many years, a full security proof of these schemes in a realistic setting has been elusive. For the first time, we solve this long-standing problem and present a generic method to prove the security of such protocols against general attacks. To illustrate our result we provide lower bounds on the key generation rate of a variant of the coherent-one-way quantum key distribution protocol. In contrast to standard predictions, it appears to scale quadratically with the system transmittance.
Musau, F, Wang, G, Yu, S & Abdullahi, MB 2012, 'Securing Recommendations in Grouped P2P E-Commerce Trust Model', IEEE Transactions on Network and Service Management, vol. 9, no. 4, pp. 407-420.
View/Download from: Publisher's site
View description>>
In dynamic peer to peer (P2P) e-commerce, it is an important and difficult problem to promote online businesses without sacrificing the desired trust to secure transactions. In this paper, we address malicious threats in order to guarantee secrecy and integrity of recommendations exchanged among peers in P2P e-commerce. In addition to trust, secret keys are required to be established between each peer and its neighbors. Further, we propose a key management approach, gkeying, to generate six types of keys. Our work mainly focuses on key generation for securing recommendations, and ensuring the integrity of recommendations. The proposed approach, presented with a security and performance analysis, is more secure and more efficient in terms of communication cost, computation cost, storage cost, and feasibility. © 2012 IEEE.
Naimi, B & Voinov, A 2012, 'StellaR: A software to translate Stella models into R open-source environment', Environmental Modelling & Software, vol. 38, pp. 117-118.
View/Download from: Publisher's site
Nguyen, TTS, Lu, HY, Tran, TP & Lu, J 2012, 'Investigation of sequential pattern mining techniques for web recommendation', International Journal of Information and Decision Sciences, vol. 4, no. 4, pp. 293-293.
View/Download from: Publisher's site
View description>>
Increased application of sequence mining in web recommender systems (WRS) requires a better understanding of the performance and a clear identification of the strengths and weaknesses of existing algorithms. Among the commonly used sequence mining methods, the tree-based approach, such as pre-order linked WAP-tree mining algorithm (PLWAP-Mine) and conditional sequence mining algorithm (CS-Mine), has demonstrated high performance in web mining applications. However, its advantages over other mining methods are not well explained and understood in the context of WRS. This paper firstly reviews the existing sequence mining algorithms, and then studies the performance of two outstanding algorithms, i.e., the PLWAP-Mine and CS-Mine algorithms, with respect to their sensitivity to the dataset variability, and their practicality for web recommendation. The results show that CS-Mine performs faster than PLWAP-Mine, but the frequent patterns generated by PLWAP-Mine are more effective than CS-Mine when applied in web recommendations. These results are useful to WRS developers for the selection of appropriate sequence mining algorithms. © 2012 Inderscience Enterprises Ltd.
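The support computation at the heart of these sequence miners reduces to an order-preserving containment test; a minimal sketch (not either paper's algorithm, which use WAP-tree structures precisely to avoid this brute-force scan):

```python
def is_subsequence(pattern, session):
    """True if pattern occurs within session in order (gaps allowed)."""
    it = iter(session)
    return all(page in it for page in pattern)

def support(pattern, sessions):
    """Fraction of sessions containing the pattern as a subsequence."""
    return sum(is_subsequence(pattern, s) for s in sessions) / len(sessions)
```

Patterns whose support exceeds a chosen threshold become the frequent sequences that a web recommender matches against a visitor's current navigation trail.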
O'Hara, K, Helmes, J, Sellen, A, Harper, R, ten Bhomer, M & van den Hoven, E 2012, 'Food for Talk: Phototalk in the Context of Sharing a Meal', HUMAN-COMPUTER INTERACTION, vol. 27, no. 1-2, pp. 124-150.
View/Download from: Publisher's site
View description>>
Photographic mementos are important signifiers of our personal memories. Rather than simply passive representations of memories to "preserve" the past, these photos are actively displayed and consumed in the context of everyday behavior and social practices. Within the context of these settings, these mementos are invoked in particular ways to mobilize particular social relations in the present. Taking this perspective, we explore how photo mementos come to be used in the everyday social setting of sharing a meal. Rather than a simple concern with nutritional consumption, the shared meal is a social event and important cultural site in the organization of family and social life with culturally specific rhythms, norms, rights, and responsibilities. We present a system, 4 Photos, that situates photo mementos within the social concerns of these settings. The system collates photo mementos from those attending the meal and displays them at the dining table to be interacted with by all. Through a real-world deployment of the system, we explore the social work performed by invoking these personal memory resources in the context of real-world settings of shared eating. We highlight particular features of the system that enable this social work to be achieved.
Pal, U, Jayadevan, R & Sharma, N 2012, 'Handwriting Recognition in Indian Regional Scripts', ACM Transactions on Asian Language Information Processing, vol. 11, no. 1, pp. 1-35.
View/Download from: Publisher's site
View description>>
Offline handwriting recognition in Indian regional scripts is an interesting area of research as almost 460 million people in India use regional scripts. The nine major Indian regional scripts are Bangla (for Bengali and Assamese languages), Gujarati, Kannada, Malayalam, Oriya, Gurumukhi (for Punjabi language), Tamil, Telugu, and Nastaliq (for Urdu language). A state-of-the-art survey about the techniques available in the area of offline handwriting recognition (OHR) in Indian regional scripts will be of a great aid to the researchers in the subcontinent and hence a sincere attempt is made in this article to discuss the advancements reported in this regard during the last few decades. The survey is organized into different sections. A brief introduction is given initially about automatic recognition of handwriting and official regional scripts in India. The nine regional scripts are then categorized into four subgroups based on their similarity and evolution information. The first group contains Bangla, Oriya, Gujarati and Gurumukhi scripts. The second group contains Kannada and Telugu scripts and the third group contains Tamil and Malayalam scripts. The fourth group contains only Nastaliq script (Perso-Arabic script for Urdu), which is not an Indo-Aryan script. Various feature extraction and classification techniques associated with the offline handwriting recognition of the regional scripts are discussed in this survey. As it is important to identify the script before the recognition step, a section is dedicated to handwritten script identification techniques. A benchmarking database is very important for any pattern recognition related research. The details of the datasets available in different Indian regional scripts are also mentioned in the article. A separate section is dedicated to the observations made, future scope, and existing difficulties related to handwriting recognition in Indian regional scripts. We hope that this survey will s...
Parvin, S, Hussain, FK, Hussain, OK & Faruque, AA 2012, 'Trust-based Throughput in Cognitive Radio Networks', Procedia Computer Science, vol. 10, pp. 713-720.
View/Download from: Publisher's site
View description>>
Cognitive Radio Networks (CRNs) deal with opportunistic spectrum access in order to fully utilize scarce spectrum resources, with the development of cognitive radio technologies enabling greater utilization of the spectrum. Nowadays Cognitive Radio (CR) is a promising concept for improving the utilization of limited radio spectrum resources for future wireless communications and mobile computing. In this paper, we propose two approaches. First, we propose a trust-aware model to authenticate the secondary users (SUs) in CRNs, which provides a reliable technique to establish trust for CRNs. Secondly, we propose a trust-based throughput mechanism to measure throughput in CRNs.
Parvin, S, Hussain, FK, Hussain, OK, Han, S, Tian, B & Chang, E 2012, 'Cognitive radio network security: A survey', JOURNAL OF NETWORK AND COMPUTER APPLICATIONS, vol. 35, no. 6, pp. 1691-1708.
View/Download from: Publisher's site
View description>>
Recent advancements in wireless communication are creating a spectrum shortage problem on a daily basis. Recently, Cognitive Radio (CR), a novel technology, has attempted to minimize this problem by dynamically using the free spectrum in wireless communications and mobile computing. Cognitive radio networks (CRNs) can be formed using cognitive radios by extending the radio link features to network layer functions. The objective of CRN architecture is to improve the whole network operation to fulfil the user's demands anytime and anywhere, through accessing CRNs in a more efficient way, rather than by just linking spectral efficiency. CRNs are more flexible and exposed to wireless networks compared with other traditional radio networks. Hence, there are many security threats to CRNs, more so than other traditional radio environments. The unique characteristics of CRNs make security more challenging. Several crucial issues have not yet been investigated in the area of security for CRNs. A typical public key infrastructure (PKI) scheme which achieves secure routing and other purposes in typical ad hoc networks is not enough to guarantee the security of CRNs under limited communication and computation resources. However, there has been increasing research attention on security threats caused specifically by CR techniques and special characteristics of CR in CRNs. Therefore, in this research, a survey of CRNs and their architectures and security issues has been carried out in a broad way in this paper.
Parvin, S, Hussain, FK, Park, JS & Kim, DS 2012, 'A survivability model in wireless sensor networks', COMPUTERS & MATHEMATICS WITH APPLICATIONS, vol. 64, no. 12, pp. 3666-3682.
View/Download from: Publisher's site
View description>>
In this paper, we present a survivability evaluation model and analyze the performance of Wireless Sensor Networks (WSNs) under attack and key compromise. First, we present a survivability evaluation model of WSNs by representing the states of WSNs under
Purba, JH, Lu, J, Zhang, G & Ruan, D 2012, 'An Area Defuzzification Technique to Assess Nuclear Event Reliability Data from Failure Possibilities', International Journal of Computational Intelligence and Applications, vol. 11, no. 04, pp. 1250022-1250022.
View/Download from: Publisher's site
View description>>
Reliability data is essential for a nuclear power plant probabilistic safety assessment by fault tree analysis to assess the performance of the safety-related systems. The limitation of conventional reliability data arises from insufficient historical data for probabilistic calculation. This study describes a new approach to calculate nuclear event reliability data by utilizing the concept of failure possibilities, which are expressed in qualitative natural languages, mathematically represented by membership functions of fuzzy numbers, and subjectively justified by a group of experts based on their working experience and expertise. We also propose an area defuzzification technique to convert the membership function into nuclear event reliability data. The actual event reliability data, which are collected from the operational experiences of the reactor protection system in Babcock & Wilcox pressurized water reactor between 1984 and 1998, are then compared with the reliability data calculated from the new approach. The results show that fuzzy failure rates can be used as alternatives for probabilistic failure rates when nuclear event historical data are insufficient or unavailable for probabilistic calculation. This study also confirms that our proposed area defuzzification technique is a suitable technique to defuzzify failure possibilities into nuclear event reliability data.
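The paper's specific area defuzzification technique is not reproduced here, but for orientation, the standard centroid defuzzification of a triangular fuzzy number (a common baseline such techniques are compared against) is a one-liner:

```python
def centroid_triangular(a, b, c):
    """Centroid (center-of-gravity) defuzzification of a triangular
    fuzzy number (a, b, c): the x-coordinate of the centroid of the
    area under the membership function."""
    return (a + b + c) / 3.0
```

In this setting the crisp output would play the role of a failure rate estimate derived from the expert-justified membership function when historical event data are unavailable.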
Qiao, Y-M, M.N., JS & Tang, B-S 2012, 'On Isomorphism Testing of Groups with Normal Hall Subgroups', Journal of Computer Science and Technology, vol. 27, no. 4, pp. 687-701.
View/Download from: Publisher's site
View description>>
A normal Hall subgroup N of a group G is a normal subgroup with its order coprime with its index. Schur-Zassenhaus theorem states that every normal Hall subgroup has a complement subgroup, that is a set of coset representatives H which also forms a subgroup of G. In this paper, we present a framework to test isomorphism of groups with at least one normal Hall subgroup, when groups are given as multiplication tables. To establish the framework, we first observe that a proof of Schur-Zassenhaus theorem is constructive, and formulate a necessary and sufficient condition for testing isomorphism in terms of the associated actions of the semidirect products, and isomorphisms of the normal parts and complement parts. We then focus on the case when the normal subgroup is abelian. Utilizing basic facts of representation theory of finite groups and a technique by Le Gall (STACS 2009), we first get an efficient isomorphism testing algorithm when the complement has bounded number of generators. For the case when the complement subgroup is elementary abelian, which does not necessarily have bounded number of generators, we obtain a polynomial time isomorphism testing algorithm by reducing to generalized code isomorphism problem, which asks whether two linear subspaces are the same up to permutation of coordinates. A solution to the latter can be obtained by a mild extension of the singly exponential (in the number of coordinates) time algorithm for code isomorphism problem developed recently by Babai et al. (SODA 2011). Enroute to obtaining the above reduction, we study the following computational problem in representation theory of finite groups: given two representations ρ and τ of a group H over Z_p^d, p a prime, determine if there exists an automorphism φ : H → H, such that the induced representation ρ^φ = ρ ∘ φ and τ are equivalent, in time poly(|H|, p^d). © 2012 Springer Science+Business Media, LLC & Science Press, China.
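For contrast with the structured algorithms above, the naive baseline on multiplication tables is a search over all bijections, feasible only for tiny groups; a sketch:

```python
from itertools import permutations

def are_isomorphic(t1, t2):
    """Naive isomorphism test for groups given as n x n multiplication
    tables over elements 0..n-1: try every bijection p and check that
    p(xy) = p(x)p(y) throughout. Exponential in n; the paper's
    algorithms exploit a normal Hall subgroup to do far better."""
    n = len(t1)
    if n != len(t2):
        return False
    for p in permutations(range(n)):
        if all(p[t1[i][j]] == t2[p[i]][p[j]]
               for i in range(n) for j in range(n)):
            return True
    return False
```

For example, this distinguishes the cyclic group Z4 from the Klein four-group, both of order 4.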
Raza, M, Hussain, FK & Hussain, OK 2012, 'Neural Network-Based Approach for Predicting Trust Values Based on Non-uniform Input in Mobile Applications', COMPUTER JOURNAL, vol. 55, no. 3, pp. 347-378.
View/Download from: Publisher's site
View description>>
Recently, there has been much research focus on trust and reputation modelling as one of the key strategies for the formation of successful business intelligence strategies, particularly for service in mobile applications. One of the key trust modelling activities is trust prediction. During this process, the accuracy and reliability of the predicted trust values play an important role in the making of informed business decisions. Key factors to be considered at this stage are the variability and the high levels of distortion in the input series that have to be captured when predicting the trust values at a point in time in the future. In this paper, we propose a Multi-layer Feed Forward Artificial Neural Network to predict the future trust values of entities (services, agents, products etc.) for a future point in time based on data series input. We use four different 'non-uniform' data input series and measure the accuracy of the predicted values under different experimental scenarios for benchmarking and comparison with existing approaches. Results indicate that the model is reliable in predicting trust values even in scenarios where there are only limited data available on training the neural network and a high level of distortion is present in the input series. © 2011 The Author. Published by Oxford University Press on behalf of The British Computer Society. All rights reserved.
Somma, RD, Nagaj, D & Kieferová, M 2012, 'Quantum Speedup by Quantum Annealing', Physical Review Letters, vol. 109, no. 5, pp. 050501-050501.
View/Download from: Publisher's site
View description>>
We study the glued-trees problem from A. M. Childs, R. Cleve, E. Deotto, E. Farhi, S. Gutmann, and D. Spielman, in Proceedings of the 35th Annual ACM Symposium on Theory of Computing (ACM, San Diego, CA, 2003), p. 59. in the adiabatic model of quantum computing and provide an annealing schedule to solve an oracular problem exponentially faster than classically possible. The Hamiltonians involved in the quantum annealing do not suffer from the so-called sign problem. Unlike the typical scenario, our schedule is efficient even though the minimum energy gap of the Hamiltonians is exponentially small in the problem size. We discuss generalizations based on initial-state randomization to avoid some slowdowns in adiabatic quantum computing due to small gaps.
Sood, S 2012, 'The Death of Social Media in Start-Up Companies and the Rise of S-Commerce', Journal of Electronic Commerce in Organizations, vol. 10, no. 2, pp. 1-15.
View/Download from: Publisher's site
View description>>
Startup employees led by the entrepreneur are masters of embracing complexity. This means the startup team understands cause and effect follow a non-linear relationship, with the subtlest of changes potentially producing chaotic behavior and surprise. For the startup, this means counterintuitive thinking wins the day. In light of this, small expenditures can have a greater impact on developing new business. The startup employee prefers not to be constrained by desktop or the old broadcast model of email; instead exploiting social technologies anywhere. A startup is a learning organization improving processes and results on an ongoing basis, mirroring entrepreneurship as a learning process. Startup employees realize success goes beyond consideration of product functionality or a track record built on an existing base of customers. With major technology disruptions during 2012-2014, the potential to launch a "startup-in-a-box" integrating social, mobile, and wearable computing technology is a reality and essential. Only through a combination of social technologies can startups and founding employees maintain pace with the changing business landscape and generate a rapid amount of knowledge to sustain sufficient advantage in the market. Furthermore, the forthcoming death of social media and rise of S-commerce as convergence with E-commerce progresses to help generate revenues from newfound knowledge perfectly complements startup employees.
Sood, SC & Pattinson, HM 2012, '21st Century applicability of the interaction model: Does pervasiveness of social media in B2B marketing increase business dependency on the interaction model?', Journal of Customer Behaviour, vol. 11, no. 2, pp. 117-128.
View/Download from: Publisher's site
View description>>
The IMP interaction model (Håkansson, 1982, p. 24) has survived academic and managerial scrutiny for three decades. Simultaneously, a techno-economic revolution has emerged reshaping B2B communication and interaction through digitising the global economy. In the 21st century, mobile devices directly connect with social interactions of people and businesses through the exemplary social media of Facebook, Twitter, Google Plus, LinkedIn and YouTube. The pervasiveness of social media technologies and applications enables not just the generation of online conversations but enhances B2B collaboration activities atop the B2B and intra business conversations. On this basis, consideration of social media within the context of the IMP interaction model (ibid.) is essential when undertaking any worthwhile contemporary study of B2B marketing.
Tang, F, Tang, C, Guo, M, Guo, S & Yu, S 2012, 'Service-Oriented Wireless Sensor Networks and An Energy-Aware Mesh Routing Algorithm', AD HOC & SENSOR WIRELESS NETWORKS, vol. 15, no. 1, pp. 21-46.
Tao, W & Zhang, G 2012, 'Trusted interaction approach for dynamic service selection using multi-criteria decision making technique', Knowledge-Based Systems, vol. 32, no. 1, pp. 116-122.
View/Download from: Publisher's site
View description>>
Recent developments in information technology have shifted the computing paradigm in a more dynamic direction, and this has raised new challenges. In a dynamic computing environment, (1) the number of transacting entities is not fixed; (2) the relationships between these entities are very dynamic; (3) transacting entities may not necessarily have previous knowledge of each other; (4) the surrounding context may constantly change; and (5) the transaction may be conducted in a fully automatic manner. Based on these unique features of a dynamic environment, we claim that two important challenges need to be resolved: transacting entities should be able to establish trusted interactions with each other, and transacting entities should be able to select the most suitable transacting partners according to pre-programmed business rules and the current context. Based on our previous research work on MobiPass, this paper proposes a technique which solves these two important research issues by using Multi Criteria Decision Making (MCDM) on top of the MobiPass framework, helping transacting entities select the most suitable transacting partners under a trusted interaction in dynamic environments in real time.
Thinh, LP, Sheridan, L & Scarani, V 2012, 'Tomographic quantum cryptography protocols are reference frame independent', International Journal of Quantum Information, vol. 10, no. 03, pp. 1250035-1250035.
View/Download from: Publisher's site
View description>>
We consider the class of reference frame independent protocols in d dimensions for quantum key distribution, in which Alice and Bob have one natural basis that is aligned and the rest of their measurement bases are unaligned. We relate existing approaches to tomographically complete protocols. We comment on two different approaches to finite key bounds in this setting, one direct and one using the entropic uncertainty relation and suggest that the existing finite key bounds can still be improved.
van den Hoven, E, Sas, C & Whittaker, S 2012, 'Introduction to this Special Issue on Designing for Personal Memories: Past, Present, and Future', Human-Computer Interaction, vol. 27, no. 1-2, pp. 1-12.
View/Download from: Publisher's site
View description>>
This special issue focuses on new uses of digital media to help people remember in everyday situations. We begin this introduction by describing the field's origins (personal memories past), using this to contextualise the papers presented here (personal memories present). We conclude by identifying a number of important research challenges that we feel must be addressed by future work in this area (personal memories future).
Verma, P, Singh, R & Kumar Singh, A 2012, 'A Framework for the Next Generation Screen Readers for Visually Impaired', International Journal of Computer Applications, vol. 47, no. 10, pp. 31-38.
View/Download from: Publisher's site
View description>>
Despite their shortcomings, Screen Readers have been the primary tool for internet use by the visually impaired. In this paper, we present a framework for an advanced Screen Reader that aims at eliminating the drawbacks associated with existing systems. The proposed framework makes use of an informed search technique to enhance usability and navigability. Some of its features, such as background music to convey the layout structure of a web page and mouse hovering to speak out glimpses of the underlying text, make use of image processing techniques. These features are implemented independently of the rest of the development and can therefore also be used to enhance any existing Screen Reader.
Wang, J, Lu, H, Dong, Y & Chi, D 2012, 'The model of chaotic sequences based on adaptive particle swarm optimization arithmetic combined with seasonal term', Applied Mathematical Modelling, vol. 36, no. 3, pp. 1184-1196.
View/Download from: Publisher's site
View description>>
Within a competitive electric power market, electricity price is one of the core elements and is crucial to all market participants. Accurate forecasting of electricity price is therefore highly desirable. This paper proposes a forecasting model of el…
Wang, X, Wang, Z & Xu, X 2012, 'Effective Service Composition in Large Scale Service Market', International Journal of Web Services Research, vol. 9, no. 1, pp. 74-94.
View/Download from: Publisher's site
View description>>
The web has undergone a tremendous shift from information repository to the provisioning capacity of services. As an effective means of constructing coarse-grained solutions by dynamically aggregating a set of services to satisfy complex requirements, traditional service composition suffers from a dramatic decrease in the efficiency of determining the optimal solution when large-scale services are available in the Internet-based service market. Most current approaches look for the optimal composition solution by real-time computation, and the composition efficiency greatly depends on the adopted algorithms. To eliminate this deficiency, this paper proposes a semi-empirical composition approach which incorporates the extraction of empirical evidence from historical experiences to guide solution-space reduction for real-time service selection. Service communities and historical requirements are further organized into clusters based on similarity measurement, and the probabilistic correspondences between the two types of clusters are then identified by statistical analysis. For each new request, its hosting requirement cluster is identified and the corresponding service clusters are determined by leveraging Bayesian inference. Concrete services are then selected from the reduced solution space to constitute the final composition. Timing strategies for re-clustering and consideration of special cases in clustering ensure continual adaptation of the approach to a changing environment. Instead of relying solely on pure real-time computation, the approach is distinguished from traditional methods by combining the two perspectives.
Wei, GW & Merigó, JM 2012, 'Methods for strategic decision-making problems with immediate probabilities in intuitionistic fuzzy setting', Scientia Iranica, vol. 19, no. 6, pp. 1936-1946.
View/Download from: Publisher's site
Woodside, AG, Megehee, CM & Sood, S 2012, 'Conversations with(in) the collective unconscious by consumers, brands, and relevant others', Journal of Business Research, vol. 65, no. 5, pp. 594-602.
View/Download from: Publisher's site
View description>>
Jung's (2009) paintings of his dreams, created to enable conscious interpretation of his conversations within the collective unconscious, inform a call for creating visual narrative art to reveal the meanings of the personal and collective unconscious relating to stories consumers tell about buying and using brands. This study describes 13 conversations relevant to the study of the conscious and the collective unconscious for consumer-brand relationships/communications. The 13 conversations' paradigm is useful for complementing the dominant logic by which scholars ask questions and rely on consumers' conscious interpretations in their responses. The article advocates the use of multiple methods for both collecting and interpreting consumer-brand relationships, and illustrates the usage of storyboard art of consumer-brand relationships in natural contexts. Brand strategy implications focus on the value of identifying how brands enable consumers to enact primal forces (archetypes).
Wu, Z, Xu, G, Yu, Z, Yi, X, Chen, E & Zhang, Y 2012, 'Executing SQL queries over encrypted character strings in the Database-As-Service model', Knowledge-Based Systems, vol. 35, pp. 332-348.
View/Download from: Publisher's site
View description>>
Rapid advances in networking technologies have prompted the emergence of the 'software as service' model for enterprise computing, which, moreover, is quickly becoming a key industry. The 'database as service' model gives users the power to store, modify and retrieve data from anywhere in the world, as long as they have access to the Internet, and is thus increasingly popular in current enterprise data management systems. However, this model introduces several challenges, an essential issue being how to execute SQL queries over encrypted data efficiently. To ensure data security, this model generally encrypts sensitive data at the trusted client's site before storing it at the non-trusted database service provider's site, which, unfortunately, means that SQL queries cannot be executed directly over the encrypted data at the database service provider. In this paper we focus only on how to query encrypted character strings efficiently. Our strategy is that when storing character strings at the database service provider, we not only store the encrypted character strings themselves, but also generate characteristic index values for these character strings and store them in an additional field; when querying the encrypted character strings, we first execute a coarse query over the characteristic index fields at the database service provider, in order to filter out most of the tuples unrelated to the query conditions, and then decrypt the remaining tuples and execute a refined query over them at the client site. In our strategy, we define an n-phase reachability matrix for a character string and use it as the characteristic index values, and based on this definition, we present some theorems to split a SQL query into its server-side and client-side representations, partitioning the computation of a query across the client and the server and thus improving query performance. Finally, experimental resul...
Wu, Z, Xu, G, Zhang, Y, Cao, Z, Li, G & Hu, Z 2012, 'GMQL: A graphical multimedia query language', Knowledge-Based Systems, vol. 26, pp. 135-143.
View/Download from: Publisher's site
View description>>
The rapid increase of multimedia data makes multimedia query more and more important. To better satisfy users' query requirements, developing a functional multimedia query language is becoming a promising and interesting task. In this paper, we propose a graphical multimedia query language called GMQL, which is developed based on a semi-structured data organization model. In GMQL, we combine the advantages of graphs and texts, making the query language clear, easy to use and highly expressive. In this paper, we first present the notations and basic capabilities of GMQL through query examples. Second, we discuss the GMQL query processing techniques. Last, we evaluate and analyze our multimedia query language through comparison with other existing multimedia query languages. The evaluation results show that GMQL has powerful expressiveness and is thus well suited for multimedia information retrieval. © 2011 Elsevier B.V. All rights reserved.
Wu, Z, Xu, G, Zhang, Y, Dolog, P & Lu, C 2012, 'An Improved Contextual Advertising Matching Approach based on Wikipedia Knowledge', The Computer Journal, vol. 55, no. 3, pp. 277-292.
View/Download from: Publisher's site
View description>>
The current boom of the Web is associated with the revenues originated from Web advertising. As one prevalent type of Web advertising, contextual advertising refers to the placement of the most relevant commercial textual ads within the content of a Web page, so as to provide a better user experience and thereby increase the revenues of Web site owners and an advertising platform. Therefore, in contextual advertising, the relevance of selected ads to a Web page is essential. However, some problems, such as homonymy and polysemy, low intersection of keywords and context mismatch, can lead to the selection of irrelevant textual ads for a Web page, meaning that a simple keyword matching technique generally gives poor accuracy. To overcome these problems and thus to improve the relevance of contextual ads, in this paper we propose a novel Wikipedia-based matching technique which, using selective matching strategies, selects a certain number of relevant articles from Wikipedia as an intermediate semantic reference model for matching Web pages and textual ads. We call this technique SIWI: Selective Wikipedia Matching, which, instead of using the whole set of Wikipedia articles, matches only the most relevant articles for a page (or a textual ad), resulting in an effective improvement of the overall matching performance. An experimental evaluation is conducted, which runs over a set of real textual ads, a set of Web pages from the Internet and a dataset of more than 260 000 articles from Wikipedia. The experimental results show that our method performs better than existing matching strategies, dealing with matching over the large dataset of Wikipedia articles efficiently and achieving a satisfactory contextual advertising effect. © 2011 The Author. Published by Oxford University Press on behalf of The British Computer Society. All rights reserved.
Liu, X & Yu, S 2012, 'On the Stress Function-Based OWA Determination Method With Optimization Criteria', IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), vol. 42, no. 1, pp. 246-257.
View/Download from: Publisher's site
Xu, Y, Merigó, JM & Wang, H 2012, 'Linguistic power aggregation operators and their application to multiple attribute group decision making', Applied Mathematical Modelling, vol. 36, no. 11, pp. 5427-5444.
View/Download from: Publisher's site
Ye, D, Zhang, M & Sutanto, D 2012, 'Self-organization in an agent network: A mechanism and a potential application', Decision Support Systems, vol. 53, no. 3, pp. 406-417.
View/Download from: Publisher's site
Yu, S 2012, 'Editorial', International Journal of Security and Networks, vol. 7, no. 4, p. 195.
Yu, S, Zhao, G, Dou, W & James, S 2012, 'Predicted Packet Padding for Anonymous Web Browsing Against Traffic Analysis Attacks', IEEE Transactions on Information Forensics and Security, vol. 7, no. 4, pp. 1381-1393.
View/Download from: Publisher's site
Yu, S, Zhou, W, Jia, W & Hu, J 2012, 'Attacking Anonymous Web Browsing at Local Area Networks Through Browsing Dynamics', The Computer Journal, vol. 55, no. 4, pp. 410-421.
View/Download from: Publisher's site
Yu, S, Zhou, W, Jia, W, Guo, S, Xiang, Y & Tang, F 2012, 'Discriminating DDoS Attacks from Flash Crowds Using Flow Correlation Coefficient', IEEE Transactions on Parallel and Distributed Systems, vol. 23, no. 6, pp. 1073-1080.
View/Download from: Publisher's site
Zhang, G, Xu, Y & Li, T 2012, 'A special issue on new trends in Intelligent Decision Support Systems', Knowledge-Based Systems, vol. 32, pp. 1-2.
View/Download from: Publisher's site
Zhang, G, Yang, Y & Chen, J 2012, 'A historical probability based noise generation strategy for privacy protection in cloud computing', Journal of Computer and System Sciences, vol. 78, no. 5, pp. 1374-1381.
View/Download from: Publisher's site
View description>>
Cloud computing promises an open environment where customers can deploy IT services in pay-as-you-go fashion while saving huge capital investment in their own IT infrastructure. Due to the openness, various malicious service providers can exist. Such ser…
Zhang, T, Zhang, G, Lu, J, Feng, X & Yang, W 2012, 'A New Index and Classification Approach for Load Pattern Analysis of Large Electricity Customers', IEEE Transactions on Power Systems, vol. 27, no. 1, pp. 153-160.
View/Download from: Publisher's site
View description>>
Conducting load pattern analysis is an important task in obtaining typical load profiles (TLPs) of customers and grouping them into classes according to their load characteristics. When using clustering techniques to obtain the load patterns of electrici…
Zhang, Z, Cheng, J, Li, J, Bian, W & Tao, D 2012, 'Segment-Based Features for Time Series Classification', The Computer Journal, vol. 55, no. 9, pp. 1088-1102.
View/Download from: Publisher's site
View description>>
In this paper, we propose an approach termed segment-based features (SBFs) to classify time series. The approach is inspired by the success of the component- or part-based methods of object recognition in computer vision, in which a visual object is described as a number of characteristic parts and the relations among the parts. Utilizing this idea in the problem of time series classification, a time series is represented as a set of segments and the corresponding temporal relations. First, a number of interest segments are extracted by interest point detection with automatic scale selection. Then, a number of feature prototypes are collected by random sampling from the segment set, where each feature prototype may include a single segment or multiple ordered segments. Subsequently, each time series is transformed into a standard feature vector, i.e. an SBF, where each entry in the SBF is calculated as the maximum response (maximum similarity) of the corresponding feature prototype to the segment set of the time series. Based on the original SBF, an incremental feature selection algorithm is conducted to form a compact and discriminative feature representation. Finally, a multi-class support vector machine is trained to classify the test time series. Extensive experiments on different time series datasets, including one synthetic control dataset, two sign language datasets and one gait dynamics dataset, have been performed to evaluate the proposed SBF method. Compared with other state-of-the-art methods, our approach achieves superior classification performance, which clearly validates the advantages of the proposed method. © 2011 The Author. Published by Oxford University Press on behalf of The British Computer Society. All rights reserved.
Zhou, L-G, Chen, H-Y, Merigó, JM & Gil-Lafuente, AM 2012, 'Uncertain generalized aggregation operators', Expert Systems with Applications, vol. 39, no. 1, pp. 1105-1117.
View/Download from: Publisher's site
Ahad, MT, Dyson, L & Gay, V 1970, 'Towards an M-banking framework for rural SMEs in Bangladesh', INNOVATION VISION 2020: SUSTAINABLE GROWTH, ENTREPRENEURSHIP, AND ECONOMIC DEVELOPMENT, VOLS 1-4, International Business Information Management, International Business Information Management Association, Barcelona, Spain, pp. 1153-1164.
View description>>
This research aims at discovering factors which impact on the intention of rural SME owners and managers to adopt m-banking in Bangladesh. Over the last ten years, a wide spectrum of m-banking frameworks has emerged that offers new insights into the adoption and acceptance of m-banking. However, m-banking has still not been extended to rural Bangladesh. To fill the gap this research surveyed 550 SME owners/managers in four rural villages. The result indicates that poor banking facilities, cost, credibility, gender, education and SME category are the main factors that significantly influence the intention to adopt m-banking. The analysis introduces three factors which have been largely overlooked in prior literature. The study broadens our understanding of m-banking and provides insights into developing m-banking strategies in Bangladesh. This research will be of potential value in accelerating the development of m-banking in Bangladesh.
Ashamalla, A, Beydoun, G, Low, G & Yan, J 1970, 'Towards Modelling Real Time Constraints.', ICSOFT, International Conference on Software Paradigm Trends, SciTePress, Rome, Italy, pp. 158-164.
View description>>
Software agents are highly autonomous, situated and interactive software components. They autonomously sense their environment and respond accordingly. Agents' behaviours are often constrained by real-time constraints, such as the time within which the agent is expected to respond, i.e. the time needed for a task to complete. Failing to meet such a constraint can result in a task not being achieved. This may cause an agent or a system to fail, depending on how critical the task is to the agent or system as a whole. Our research aims at identifying and modelling real-time constraints in the early phase of analysis, which helps in creating a more reliable and robust system.
Atif, A & Richards, D 1970, 'A technology acceptance model for unit guide information systems', Proceedings - Pacific Asia Conference on Information Systems, PACIS 2012.
View description>>
Curriculum mapping is an important task in implementing, embedding and monitoring the knowledge, skills and attributes that graduates must acquire in their program of study. Curriculum mapping ensures correspondence between learning outcomes, learning activities and assessments. To aid in performing this complex task, many higher education institutions are using unit/study guide tools or curriculum mapping tools. These tools may be known under different names in different institutions but we will refer to these tools as unit guide information systems. To evaluate the utilisation and acceptance of these tools, this research-in-progress paper draws on an extensive body of literature related to technology acceptance that includes social cognitive theory and model of PC utilization to explain the influence of perceived usefulness and perceived ease of use. Our research extends the technology acceptance model by incorporating the external variables of self-efficacy, anxiety and social influence. The results are expected to indicate which of the external factors are most important in predicting and explaining attitude and intention to use unit guide information systems.
Atif, A, Busch, P & Richards, D 1970, 'Towards an Ontology-Based Approach to Knowledge Management of Graduate Attributes in Higher Education', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Springer Berlin Heidelberg, pp. 229-243.
View/Download from: Publisher's site
View description>>
© Springer-Verlag Berlin Heidelberg 2012. Knowledge around graduate attributes (GAs) is an area in need of knowledge management strategies. GAs are the qualities and skills that a university agrees its students should develop during their time with the institution. The importance of GAs and of ensuring they are embedded and assessed is widely accepted across higher education. This research paper uses Grounded Theory and Network Maps to gain insights into the similarities and differences in the discourse across our sample universities. To cover these two perspectives, we had two researchers involved in data analysis, one with the goal of distilling key ideas and identifying similarities and the other with the goal of untangling and drawing out differences. There is no unified taxonomy for managing GAs. The motivation to create such an ontology is to push the standardization process that will enable connection among academic systems and improve educational workflows, communication, and collaboration between universities.
Atif, A, Richards, D & Bilgin, A 1970, 'Estimating non-response bias in a web-based survey of technology acceptance: A case study of unit guide information systems', ACIS 2012 : Proceedings of the 23rd Australasian Conference on Information Systems.
View description>>
Survey research is frequently challenged by response rates. Among the various types of survey research, web-based (internet-based/electronic/online) surveys are commonly used for data collection from a geographically diverse population. In surveys with low response rates, non-response bias can be a major concern. While it is not always possible to measure the actual bias due to non-response, there are different approaches and techniques that help to identify reasons for non-response bias. The aims of this paper are twofold. (1) To provide an appropriate, interesting and important non-response bias case study for future web-based surveys that will provide guidance to other Information Systems researchers. The case study concerns an online survey to evaluate a technology acceptance model for Unit Guide Information Systems (UGIS). (2) To discuss how non-response bias in a web-based technology acceptance study of an information system (UGIS in this case) can be contained and managed. Atif, Richards and Bilgin © 2012.
Atif, A, Richards, D & Bilgin, A 1970, 'Predicting the acceptance of unit guide information systems', ACIS 2012 : Proceedings of the 23rd Australasian Conference on Information Systems.
View description>>
Information Systems can play an important role in ensuring and improving the quality of education provided. However, lack of acceptance of these information systems and resistance of technology innovations by the end users limit the expected benefits of the system. This research attempts to identify the key determinants for the acceptance of the Unit Guide Information Systems (UGIS) in the Australian higher education sector. The technology acceptance model (TAM), social cognitive theory (SCT) and model of PC utilization (MPCU) are combined to provide a new framework for this analysis. Results of the study are consistent with the technology acceptance factors for explaining the behavioural intention of the academics. The study also shows the effects of application specific self-efficacy, application specific anxiety and social influence on the acceptance of UGIS. Implications of the results are discussed within the context of unit guides and curriculum mapping. Atif, Richards and Bilgin © 2012.
Babai, L & Qiao, Y 1970, 'Polynomial-time isomorphism test for groups with abelian Sylow towers', Leibniz International Proceedings in Informatics, LIPIcs, Symposium on Theoretical Aspects of Computer Science, Dagstuhl Publishing, Paris, France, pp. 453-464.
View/Download from: Publisher's site
View description>>
We consider the problem of testing isomorphism of groups of order n given by Cayley tables. The trivial n^(log n) bound on the time complexity for the general case has not been improved over the past four decades. Recently, Babai et al. (following Babai et al. in SODA 2011) presented a polynomial-time algorithm for groups without abelian normal subgroups, which suggests solvable groups as the hard case for the group isomorphism problem. Extending recent work by Le Gall (STACS 2009) and Qiao et al. (STACS 2011), in this paper we design a polynomial-time algorithm to test isomorphism for the largest class of solvable groups yet, namely groups with abelian Sylow towers, defined as follows. A group G is said to possess a Sylow tower if there exists a normal series where each quotient is isomorphic to a Sylow subgroup of G. A group has an abelian Sylow tower if it has a Sylow tower and all its Sylow subgroups are abelian. In fact, we are able to compute the coset of isomorphisms of groups formed as coprime extensions of an abelian group by a group whose automorphism group is known. The mathematical tools required include representation theory, Wedderburn's theorem on semisimple algebras, and M. E. Harris's 1980 work on p'-automorphisms of abelian p-groups. We use tools from the theory of permutation group algorithms, and develop an algorithm for a parameterized version of the graph-isomorphism-hard setwise stabilizer problem, which may be of independent interest. © László Babai and Youming Qiao.
Babai, L, Codenotti, P & Qiao, Y 1970, 'Polynomial-Time Isomorphism Test for Groups with No Abelian Normal Subgroups', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), International Colloquium on Automata Languages and Programming, Springer Berlin Heidelberg, Warwick, UK, pp. 51-62.
View/Download from: Publisher's site
View description>>
We consider the problem of testing isomorphism of groups of order n given by Cayley tables. The trivial n^(log n) bound on the time complexity for the general case has not been improved upon over the past four decades. We demonstrate that the obstacle to efficient algorithms is the presence of abelian normal subgroups; we show this by giving a polynomial-time isomorphism test for groups without nontrivial abelian normal subgroups. This concludes a project started by the authors and J. A. Grochow (SODA 2011). Two key new ingredients are: (a) an algorithm to test permutational isomorphism of permutation groups in time polynomial in the order and simply exponential in the degree; (b) the introduction of the 'twisted code equivalence problem', a generalization of the classical code equivalence problem obtained by admitting a group action on the alphabet. Both of these problems are of independent interest. © 2012 Springer-Verlag.
Bakker, S, van den Hoven, E & Eggen, B 1970, 'FireFlies', Proceedings of the 24th Australian Computer-Human Interaction Conference, OzCHI '12: The 24th Australian Computer-Human Interaction Conference, ACM, Melbourne, pp. 26-29.
View/Download from: Publisher's site
View description>>
Primary school teachers usually perform several tasks simultaneously. Many secondary tasks, such as giving turns or encouraging children to work silently, could be supported by interactive systems, which may lighten the teacher's busy everyday routine. Such systems, however, should afford being interacted with while performing another primary task. We call this type of design peripheral interaction design. In this paper we present FireFlies, an open-ended peripheral interaction design developed for primary schools. Preliminary results of a six-week deployment of FireFlies in four classrooms reveal that teachers used FireFlies to perform secondary tasks and saw it as a valuable addition to the classroom. Though different interactions with FireFlies required different levels of effort, teachers could successfully interact with FireFlies during or in between other tasks.
Bakker, S, van den Hoven, E, Eggen, B & Overbeeke, K 1970, 'Exploring peripheral interaction design for primary school teachers', Proceedings of the Sixth International Conference on Tangible, Embedded and Embodied Interaction, TEI'12: Sixth International Conference on Tangible, Embedded, and Embodied Interaction, ACM, Kingston, Canada, pp. 245-252.
View/Download from: Publisher's site
View description>>
This paper explores the concept of peripheral interactions; interactions with technology that take place in the background or periphery of the attention. We present two designs for a classroom setting. CawClock makes selected time frames audible in order to provide teachers with awareness of time. NoteLet is designed to support the teacher in observing children's behavior, by enabling him or her to take pictures of the classroom through straightforward interactions on a bracelet. A qualitative, two-week exploration of both systems in a classroom revealed that the soundscapes of CawClock indeed shifted to the periphery of the attention and supported the teacher's time awareness. The actions with NoteLet did not shift to the periphery. However, the tangible aspects of NoteLet seemed to facilitate the interaction to be quick and simple, which may indicate that it could shift to the periphery with more practice. Tangible interaction therefore seems a promising interaction style for this purpose.
Biesecker, M, Erion, R, Hay, CH, Henebry, GM, Johnston, CA, Kjaersgaard, JH, Shmagin, BA, Van Der Sluis, E, Capehart, W, Kirilenko, AE, Krakauer, NY, Sweeney, M & Voinov, AA 1970, 'Uncertainty of Hydrologic Events Under South Dakota's Changing Conditions: A Research Agenda', Proceedings of the South Dakota Academy of Science, Vol 91, 97th Annual Meeting of the South Dakota Academy of Science, South Dakota Academy of Science, Univ S Dakota, Muenster Univ Ctr, Vermillion, SD, pp. 257-259.
Rong, B, Xu, Y, Wu, Y, Gagnon, G, Liu, B, Gui, L & Zhang, W 1970, 'Exploring controllable deterministic bits for LDPC iterative decoding in WiMAX networks', 2012 IEEE Global Communications Conference (GLOBECOM), GLOBECOM 2012 - 2012 IEEE Global Communications Conference, IEEE, Anaheim, CA, pp. 4018-4023.
View/Download from: Publisher's site
Bressan, N, James, A & McGregor, C 2012, 'Trends and opportunities for integrated real time neonatal clinical decision support', Proceedings of 2012 IEEE-EMBS International Conference on Biomedical and Health Informatics, 2012 IEEE-EMBS International Conference on Biomedical and Health Informatics (BHI), IEEE, pp. 687-690.
View/Download from: Publisher's site
View description>>
Neonatal Intensive Care Units maintain and support life during the critical period of premature development. This research presents the challenges, trends and opportunities for integrated real-time neonatal clinical decision support. We demonstrate this potential using an environment known as Artemis, a clinical decision support system. A review of the current devices in the intensive care unit and of neonatal practice shows the current environment and our perspective on the future of neonatal clinical decision support. The study demonstrates that Artemis will be able to incorporate new data streams from infusion pumps, EEG monitors and cerebral oxygenation monitors, innovating practice and improving clinical support. © 2012 IEEE.
Bródka, P, Skibicki, K, Kazienko, P & Musiał, K 2011, 'A degree centrality in multi-layered social network', Proceedings of the 2011 International Conference on Computational Aspects of Social Networks, CASoN'11, pp. 237-242.
View/Download from: Publisher's site
View description>>
Multi-layered social networks reflect complex relationships existing in modern interconnected IT systems. In such a network each pair of nodes may be linked by many edges that correspond to different communication or collaboration user activities. Multi-layered degree centrality for multi-layered social networks is presented in the paper. Experimental studies were carried out on data collected from a real Web 2.0 site. The multi-layered social network extracted from this data consists of ten distinct layers and the network analysis was performed for different degree centrality measures.
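To illustrate the kind of measure the abstract describes, here is a minimal sketch of one plausible cross-layer degree centrality (the data layout and normalisation are illustrative assumptions, not taken from the paper, which studies several neighbourhood-based definitions):

```python
def multilayer_degree(layers, node):
    """Cross-layer degree centrality: the node's degree summed over all
    layers, averaged by the number of layers. `layers` maps a layer name
    (e.g. an activity type) to an adjacency dict node -> set of neighbours."""
    return sum(len(adj.get(node, ())) for adj in layers.values()) / len(layers)


# Example: a node active on an "email" layer and a "comments" layer.
layers = {
    "email": {"a": {"b", "c"}, "b": {"a"}, "c": {"a"}},
    "comments": {"a": {"b"}, "b": {"a"}},
}
```

A node's score thus reflects how consistently connected it is across the distinct activity layers, rather than its degree in any single one.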
Budka, M, Musial, K & Juszczyszyn, K 2012, 'Predicting the Evolution of Social Networks: Optimal Time Window Size for Increased Accuracy', 2012 International Conference on Privacy, Security, Risk and Trust and 2012 International Conference on Social Computing, 2012 International Conference on Privacy, Security, Risk and Trust (PASSAT), IEEE, pp. 21-30.
View/Download from: Publisher's site
View description>>
This study investigates the data preparation process for predictive modelling of the evolution of complex networked systems, using an e-mail based social network as an example. In particular, we focus on the selection of optimal time window size for building a time series of network snapshots, which forms the input of chosen predictive models. We formulate this issue as a constrained multi-objective optimization problem, where the constraints are specific to a particular application and predictive algorithm used. The optimization process is guided by the proposed Windows Incoherence Measures, defined as averaged Jensen-Shannon divergences between distributions of a range of network characteristics for the individual time windows and the network covering the whole considered period of time. The experiments demonstrate that the informed choice of window size according to the proposed approach makes it possible to boost the prediction accuracy of all examined prediction algorithms, and can also be used for optimally defining the prediction problems if some flexibility in their definition is allowed. © 2012 IEEE.
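The core quantity in the abstract, an averaged Jensen-Shannon divergence between per-window distributions and the whole-period distribution, can be sketched as follows (function names and the list-of-lists representation are illustrative assumptions, not the paper's code):

```python
import math

def jensen_shannon(p, q):
    """Jensen-Shannon divergence (base 2) between two discrete
    distributions given as equal-length lists of probabilities."""
    def kl(a, b):
        # Kullback-Leibler divergence, skipping zero-probability terms.
        return sum(x * math.log2(x / y) for x, y in zip(a, b) if x > 0)
    m = [(x + y) / 2 for x, y in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def window_incoherence(window_dists, whole_dist):
    """Average JS divergence between each time window's distribution of a
    network characteristic and the distribution over the whole period."""
    return sum(jensen_shannon(w, whole_dist) for w in window_dists) / len(window_dists)
```

A low incoherence for a candidate window size suggests the snapshots are representative of the whole period, which is the intuition behind using it to guide window selection.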
Cai, H, Liu, B, Gui, L & Wu, M-Y 2012, 'Neighbor discovery algorithms in wireless networks using directional antennas', 2012 IEEE International Conference on Communications (ICC), ICC 2012 - 2012 IEEE International Conference on Communications, IEEE, pp. 767-772.
View/Download from: Publisher's site
View description>>
Directional antennas provide great performance improvement for wireless networks, such as increased network capacity and reduced energy consumption. Nonetheless, new media access and routing protocols are required to control the directional antenna system. One of the most important protocols is neighbor discovery, which aims at setting up links between nodes and their neighbors. In the past few years, a number of algorithms have been proposed for neighbor discovery with directional antennas. However, most of them cannot work efficiently when taking into account the collision case in which more than one node exists in one directional beam. For practical considerations, we propose a new neighbor discovery algorithm to overcome this shortcoming. Moreover, we present a novel and practical mathematical model to analyze the performance of neighbor discovery algorithms considering collision effects. Numerical results clearly show our new algorithm always requires less time to discover all neighbors than previous ones. To the best of our knowledge, it is the first complete, practical analytical model that incorporates directional neighbor discovery algorithms. © 2012 IEEE.
Cetindamar, D & Kozanoglu, H 2012, 'Competitiveness of Turkish hidden champions', 2012 Proceedings of Portland International Center for Management of Engineering and Technology: Technology Management for Emerging Technologies, PICMET'12, Conference of PICMET - Technology Management for Emerging Technologies (PICMET), IEEE, Vancouver, Canada, pp. 2072-2077.
View description>>
Understanding the competitive power of small and medium sized firms in emerging economies is a challenging task. This paper aims to analyze internationally successful small and medium sized firms that are so-called hidden champions of emerging economies, in the same way as they appear in advanced countries such as Germany and Austria. The analysis will shed some light on what makes these hidden champions so competitive in international markets. Knowing that developing country firms struggle to overcome the country-of-origin effects arising from consumer perceptions of the country's products/services, observing successful practices might help to understand their strategies for overcoming these effects. The assessment of company practices in terms of competition is carried out using a comprehensive model in which firm competitiveness is assessed through the outcome/performance of competition (i.e. output), assets/factors (i.e. input) and the processes that turn the assets/factors into actual performance. The paper conducts a case study concentrating on one emerging economy: Turkey. The in-depth analysis of 10 companies using the firm competitiveness assessment model helps to identify some innovative ways of overcoming the country-of-origin effects. The paper ends with some managerial and policy implications. © 2012 IEEE.
Chen, X, Li, L, Xiao, H, Xu, G, Yang, Z & Kitsuregawa, M 2012, 'Recommending related microblogs: A comparison between topic and WordNet based approaches', Proceedings of the National Conference on Artificial Intelligence, AAAI Conference on Artificial Intelligence, AAAI Press, Toronto, pp. 2417-2418.
View description>>
Computing similarity between short microblogs is an important step in microblog recommendation. In this paper, we investigate a topic based approach and a WordNet based approach to estimate similarity scores between microblogs and recommend top related ones to users. An empirical study is conducted to compare their recommendation effectiveness using two evaluation measures. The results show that the WordNet based approach has relatively higher precision than the topic based approach on a dataset of 548 tweets. In addition, the Kendall tau distance between the two lists recommended by the WordNet and topic approaches is calculated. Its average over all 548 list pairs tells us the two approaches disagree considerably in the ranking of related tweets. Copyright © 2012, Association for the Advancement of Artificial Intelligence. All rights reserved.
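The Kendall tau distance used above to compare the two recommendation lists can be sketched as follows (a minimal normalised version for two rankings over the same items; the function name is illustrative):

```python
from itertools import combinations

def kendall_tau_distance(rank_a, rank_b):
    """Fraction of item pairs ordered differently by the two rankings,
    in [0, 1]; 0 means identical orderings, 1 means fully reversed.
    Both rankings must contain exactly the same items."""
    pos_a = {item: i for i, item in enumerate(rank_a)}
    pos_b = {item: i for i, item in enumerate(rank_b)}
    pairs = list(combinations(rank_a, 2))
    discordant = sum(
        1 for x, y in pairs
        if (pos_a[x] - pos_a[y]) * (pos_b[x] - pos_b[y]) < 0
    )
    return discordant / len(pairs)
```

Averaging this distance over all tweet pairs, as the abstract describes, gives a single number summarising how much the two approaches' rankings disagree.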
Chuang, C-H, Huang, C-S, Lin, C-T, Ko, L-W, Chang, J-Y & Yang, J-M 2012, 'Mapping Information Flow of Independent Source to Predict Conscious Level: A Granger Causality Based Brain-Computer Interface', 2012 International Symposium on Computer, Consumer and Control, 2012 International Symposium on Computer, Consumer and Control (IS3C), IEEE, pp. 813-816.
View/Download from: Publisher's site
View description>>
Recent studies have shown that brain networks vary across different cognitive states. In contrast to measuring a physiological change over a single region, the information flow between brain regions described by effective connectivity provides an informative dynamic over the whole brain. In this study, we propose a source information flow network based on the combination of Granger causality and support vector regression to predict a driver's conscious level. This work provides the first application of using a brain network to develop a brain-computer interface and obtains sound performance. © 2012 IEEE.
Corney, M, Teague, D, Ahadi, A & Lister, R 2012, 'Some empirical results for neo-Piagetian reasoning in novice programmers and the relationship to code explanation questions', Conferences in Research and Practice in Information Technology Series, Australasian Computing Education Conference, Australian Computer Society Inc, Melbourne, Australia, pp. 77-86.
View description>>
Recent research on novice programmers has suggested that they pass through neo-Piagetian stages: sensorimotor, preoperational, and concrete operational stages, before eventually reaching programming competence at the formal operational stage. This paper presents empirical results in support of this neo-Piagetian perspective. The major novel contributions of this paper are empirical results for some exam questions aimed at testing novices for the concrete operational abilities to reason with quantities that are conserved, processes that are reversible, and properties that hold under transitive inference. While the questions we used had been proposed earlier by Lister, he did not present any data for how students performed on these questions. Our empirical results demonstrate that many students struggle to answer these problems, despite the apparent simplicity of these problems. We then compare student performance on these questions with their performance on six explain in plain English questions.
Dong, H, Hussain, FK & Chang, E 2012, 'Ontology-Learning-Based Focused Crawling for Online Service Advertising Information Discovery and Classification', Proceedings of the 10th International Conference on Service-Oriented Computing, 10th International Conference on Service-Oriented Computing, Springer Berlin Heidelberg, Shanghai, China, pp. 591-598.
View/Download from: Publisher's site
Dovey, KA 2012, 'Innovation Within Large Organizations: The Role of the Intrapreneur', XXIII ISPIM Conference: Action for Innovation: Innovating From Experience, XXIII ISPIM Conference: Action for Innovation: Innovating From Experience, International Society for Professional Innovation Management (ISPIM), Barcelona, Spain, pp. 49-49.
View description>>
This paper explores the role of the intrapreneur in successful innovation within large organizations. Offering a case of the creation of a successful new product within a large Australian organization, the paper explores the politics of championing creative ideas through to their realization in innovative new commercial offerings. Through the lens of one intrapreneur's experience, it highlights the practices required to innovate within environments rich in innovation rhetoric but governed by contradictory enterprise logic. Furthermore, it addresses the self-management practices required of those wanting 'to make a difference' in such organizations and warns of the fatal traps into which inexperienced intrapreneurs can fall.
ElSawah, S, Haase, D, Van Delden, H, Pierce, S, ElMahdi, A, Voinov, AA & Jakeman, AJ 2012, 'Using system dynamics for environmental modelling: Lessons learnt from six case studies', iEMSs 2012 - Managing Resources of a Limited Planet: Proceedings of the 6th Biennial Meeting of the International Environmental Modelling and Software Society, pp. 1367-1374.
View description>>
System dynamics modelling includes a set of conceptual and numerical methods that are used to understand the structure and behaviour of complex systems, such as socio-ecological systems. A system dynamics model represents the causal relationships, feedback loops, and delays that are thought to generate the system behaviour. System dynamics is widely used for developing environmental models and decision support systems. However, little attention has been given to reflecting on modelling exercises in terms of the utility of system dynamics, its strengths and limitations experienced during modelling, and implementation challenges. These practical lessons are useful for guiding modellers on deciding when and how to use system dynamics. The purpose of this paper is to shed some light on these issues drawing on experience from six case studies. The case studies demonstrate a wide range of applications (e.g. land use, groundwater management, urban water systems), tools, modelling approaches (e.g. coupled, integrated), and computational software.
Esfijani, A, Hussain, FK & Chang, E 2012, 'An Approach to University Social Responsibility Ontology Development Through Text Analyses', 2012 5th International Conference on Human System Interactions (HSI 2012), International Conference on Human System Interactions (HSI), IEEE, Perth, WA, pp. 1-7.
View/Download from: Publisher's site
View description>>
The main purpose of this paper is to propose a content analysis approach in order to develop an ontology of university social responsibility (USR). The proposed approach comprises four main phases in which two content analysis software tools have been utilized to extract the main USR components and to identify the domain of this concept. To achieve the goal, the existing body of knowledge of USR definitions and specifications - using a variety of terms - has been considered to identify the main notions of USR and their relationships. The developed ontology can be applied to define a formal, explicit description of the USR concept and to construct a more reliable basis for measurement purposes. © 2012 IEEE.
Fu, B, Wang, Z, Pan, R, Xu, G & Dolog, P 2012, 'Learning Tree Structure of Label Dependency for Multi-label Learning', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Pacific-Asia Conference on Knowledge Discovery and Data Mining, Springer Berlin Heidelberg, Kuala Lumpur, Malaysia, pp. 159-170.
View/Download from: Publisher's site
View description>>
There always exists some kind of label dependency in multi-label data. Learning and utilizing these dependencies can improve learning performance further. Therefore, an approach for multi-label learning is proposed in this paper, which first quantifies the dependencies of pairwise labels and then builds a tree structure of the labels to describe them. Thus the approach can find potential strong label dependencies and produce more generalized dependent relationships. The experimental results validate that, compared with other state-of-the-art algorithms, the method is not only a competitive alternative but also shows better performance, especially after ensemble learning. © 2012 Springer-Verlag.
Gao, L, Li, M, Zhu, T, Bonti, A, Zhou, W & Yu, S 2012, 'AMDD: Exploring Entropy Based Anonymous Multi-dimensional Data Detection for Network Optimization in Human Associated DTNs', 2012 IEEE 11th International Conference on Trust, Security and Privacy in Computing and Communications, 2012 IEEE 11th International Conference on Trust, Security and Privacy in Computing and Communications (TrustCom), IEEE, pp. 1245-1250.
View/Download from: Publisher's site
View description>>
Human associated delay-tolerant networks (HDTNs) are new networks where mobile devices are associated with humans and demonstrate social-related communication characteristics. Most recent works use real social trace files to analyse social characteristics; however, social-related data is sensitive and raises privacy concerns. In this paper, we propose an anonymization method that anonymizes the original data by coding to preserve individuals' privacy. Shannon entropy is applied to the anonymous data to retain rich, useful social characteristics for network optimization, e.g. routing optimization. We use the existing MIT Reality dataset and Infocom 06 dataset, which are human associated mobile network trace files, to simulate our method. The results of our simulations show that this method can anonymize data while achieving network optimization. © 2012 IEEE.
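The Shannon entropy the abstract applies to the anonymised trace data can be sketched as follows (the representation of a trace as a sequence of anonymised contact codes is an illustrative assumption, not the paper's format):

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Shannon entropy in bits of the empirical distribution of a
    sequence, e.g. anonymised contact codes from a DTN trace file."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

Because entropy depends only on the distribution of codes, not on the identities behind them, it survives the anonymisation step, which is why it can still drive network optimisation decisions.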
Garcia Marin, J 2012, 'Exergames for the elderly: towards an embedded Kinect-based clinical test of falls risk', Studies in Health Technology and Informatics, vol. 178, pp. 51-57.
View description>>
Falls are the leading cause of disability, injuries or even death among older adults. Exercise programmes that include a balance component reduce the risk of falling by 40%. However, such interventions are often perceived as boring and drop-out rates are high. The characteristics of videogames may overcome this weakness and increase exercise adherence. The use of modern input devices, such as the Microsoft Kinect, enables quantification of player performance in terms of motor function while engaging with games. This capability has just started to be explored. The work presented in this paper focuses on the development of a Kinect-based system to deliver step training while simultaneously measuring parameters of stepping performance that have been shown to predict falls in older people.
Ghous, H, Kennedy, PJ, Ho, N & Catchpoole, DR 2012, 'Functional visualisation of genes using singular value decomposition', Conferences in Research and Practice in Information Technology Series, Australian Data Mining Conference, Australian Computer Society, Sydney, Australia, pp. 53-59.
View description>>
Progress in understanding core pathways and processes of cancer requires thorough analysis of many coding regions of the genome. New insights are hampered due to the lack of tools to make sense of large lists of genes identified using high throughput technology. Data mining, particularly visualisation that finds relationships between genes and the Gene Ontology (GO), has the potential to assist in functional understanding. This paper addresses the question of how well GO annotations can help in functional understanding of genes. We augment genes with associated GO terms and visualise with Singular Value Decomposition (SVD). Meaning of derived components is further interpreted using correlations to GO terms. The results demonstrate that SVD visualisation of GO-augmented genes matches the biological understanding expected in the simulated data and presents understanding of childhood cancer genes that aligns with published results.
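The SVD projection underlying this kind of visualisation can be sketched as follows; the gene-by-GO-term matrix layout, column centring, and function name are illustrative assumptions rather than the paper's pipeline:

```python
import numpy as np

def svd_project(X, k=2):
    """Project the rows of a gene-by-GO-term annotation matrix onto the
    first k singular components for 2D visualisation. Columns are
    centred first so components capture variation, not the mean."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    # Row coordinates in component space: U scaled by singular values.
    return U[:, :k] * s[:k]
```

Plotting the two returned coordinates per gene gives the kind of scatter view in which functionally related genes, sharing GO annotations, tend to cluster together.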
Gill, AQ & Bunker, D 2012, 'Crowd Sourcing Challenges Assessment Index for Disaster Management', AMCIS, Americas Conference on Information Systems, Association for Information Systems, Seattle, USA, pp. 4428-4438.
View description>>
Emergency agencies (EA) rely on inter-agency approaches to information management during disasters. EA have shown significant interest in the use of cloud-based social media such as Twitter and Facebook for crowd-sourcing and distribution of disaster information. While the intentions are clear, the question of what its major challenges are is not. EA need to recognise the challenges in the use of social media under their local circumstances. This paper analyses the recent literature and the 2010 Haiti earthquake and 2010-11 Queensland flood cases, and develops a crowd sourcing challenges assessment index construct specific to EA areas of interest. We argue that this assessment index, as part of our larger conceptual framework of context aware cloud adaptation (CACA), can be useful for facilitating citizens, NGOs and government agencies in a strategy for the use of social media for crowd sourcing in preventing, preparing for, responding to and recovering from disasters. © (2012) by the AIS/ICIS Administrative Office. All rights reserved.
Gill, AQ, Bunker, D & Seltsikas, P 2012, 'Evaluating a communication technology assessment tool (CTAT): A case of a cloud based communication tool', Proceedings - Pacific Asia Conference on Information Systems, PACIS 2012, Pacific Asia Conference on Information Systems, AIS, Ho Chi Min, Vietnam, pp. 1-13.
View description>>
A primary concern of a distributed adaptive development environment (DADE) is that of human communication and knowledge sharing among geographically dispersed developers. Emerging cloud-based communication technologies claim to provide support for communication and knowledge sharing among developers in a DADE. However, the challenge is how to enable developers to self-assess and select appropriate cloud-based communication technologies for their DADE. Based on our recent empirical study, we have developed the construct of a practical communication technologies assessment tool (CTAT). We argue that the CTAT construct, as part of our larger conceptual framework of context aware cloud adaptation (CACA), can be useful to assist developers in the self-assessment of appropriate cloud-based communication technologies for their DADE. This paper presents the evaluation of the CTAT by using it for the assessment of the Force.com cloud-based Chatter communication tool. The main objective of this evaluation is to determine to what extent the CTAT construct is relevant, valuable and sufficient to achieve its purpose. The results of this evaluation indicate that the CTAT seems useful when performing vendor independent assessment of communication technologies in order to make an informed decision about the selection of a communication tool for the DADE.
Gluga, R, Kay, J, Lister, R & Teague, D 2012, 'On the reliability of classifying programming tasks using a neo-piagetian theory of cognitive development', Proceedings of the ninth annual international conference on International computing education research, ICER '12: International Computing Education Research Conference, ACM, Auckland, New Zealand, pp. 31-38.
View/Download from: Publisher's site
View description>>
Recent research has proposed Neo-Piagetian theory as a useful way of describing the cognitive development of novice programmers. Neo-Piagetian theory may also be a useful way to classify materials used in learning and assessment. If Neo-Piagetian coding of learning resources is to be useful then it is important that practitioners can learn it and apply it reliably. We describe the design of an interactive web-based tutorial for Neo-Piagetian categorization of assessment tasks. We also report an evaluation of the tutorial's effectiveness, in which twenty computer science educators participated. The average classification accuracy of the participants on each of the three Neo-Piagetian stages was 85%, 71% and 78% respectively. Participants also rated their agreement with the expert classifications, and indicated high agreement (91%, 83% and 91% across the three Neo-Piagetian stages). Self-rated confidence in applying Neo-Piagetian theory to classifying programming questions before and after the tutorial was 29% and 75% respectively. Our key contribution is the demonstration of the feasibility of the Neo-Piagetian approach to classifying assessment materials, by demonstrating that it is learnable and can be applied reliably by a group of educators. Our tutorial is freely available as a community resource.
Gluga, R, Kay, J, Lister, R, Kleitman, S & Lever, T 2012, 'Coming to terms with Bloom: An online tutorial for teachers of programming fundamentals', Conferences in Research and Practice in Information Technology Series, Australasian Computing Education Conference, Australian Computer Society Inc, Melbourne, Australia, pp. 147-156.
View description>>
This paper describes a web-based interactive tutorial that enables computer science tutors and lecturers to practice applying the Bloom Taxonomy in classifying programming exam questions. The structure, design and content of the tutorial are described in detail. The results of an evaluation with ten participants highlight important problem areas in the application of Bloom to programming assessments. The key contributions are the content and design of this tutorial and the insights derived from its evaluation. These are important results in continued work on methods of measuring learning progression in programming fundamentals.
Gluga, R, Kay, J, Lister, R, Kleitman, S & Lever, T 2012, 'Over-confidence and confusion in using bloom for programming fundamentals assessment', Proceedings of the 43rd ACM technical symposium on Computer Science Education, SIGCSE '12: The 43rd ACM Technical Symposium on Computer Science Education, ACM, Raleigh, North Carolina, USA, pp. 147-152.
View/Download from: Publisher's site
View description>>
This paper describes a web-based interactive tutorial that enables computer science tutors and lecturers to practice applying the Bloom Taxonomy in classifying programming exam questions. The structure, design and content of the tutorial are described in detail. The results of an evaluation with ten participants highlight important problem areas in the application of Bloom to programming assessments. The key contributions are the content and design of this tutorial and the insights derived from its evaluation. These are important results in continued work on methods of measuring learning progression in programming fundamentals.
Gluga, R, Kay, J, Lister, RF & Lever, T 2012, 'A unified model for embedding learning standards into university curricula for effective accreditation and quality assurance', Proceedings of the 23rd Annual Conference of the Australasian Association for Engineering Education, AAEE - Annual Conference of Australasian Association for Engineering Education, The Engineering & Science Education Research (ESER) group, Melbourne, Australia, pp. 1-9.
Golsteijn, C, van den Hoven, E, Frohlich, D & Sellen, A 2012, 'Towards a more cherishable digital object', Proceedings of the Designing Interactive Systems Conference, DIS '12: Designing Interactive Systems Conference 2012, ACM, Newcastle upon Tyne, UK, pp. 655-664.
View/Download from: Publisher's site
View description>>
As we go about our everyday routines we encounter and interact with numerous physical (e.g. furniture or clothes) and digital objects (e.g. photos or e-mails). Some of these objects may be particularly cherished, for example because of memories attached to them. As several studies into cherished objects have shown, we have more difficulty identifying cherished digital objects than physical ones. However, cherishing a small collection of digital objects can be beneficial; e.g. it can encourage active selection of digital objects to keep and discard. This paper presents a study that aimed to increase understanding of cherished physical and digital objects, and beyond that, of how we perceive physical and digital objects, and their advantages and disadvantages. We identified design opportunities for novel products and systems that support the creation of more cherishable digital objects by extrapolating the advantages of the physical to the digital, exploiting the reasons for cherishing digital objects, and aiming for meaningful integrations of physical and digital.
Hamade, RF, Ammouri, AH & Beydoun, G 2010, 'Nested Ripple Down Rules (NRDR) as a design assistant in mechanical dimensional tolerancing', Proceedings of the ASME International Mechanical Engineering Congress and Exposition 2010, Vol 3, Pts A and B, ASME International Mechanical Engineering Congress and Exposition (IMECE), American Society of Mechanical Engineers, Vancouver, Canada, pp. 491-496.
Hasan, MA, Xu, M, He, X & Chen, L 2012, 'Shot Classification Using Domain Specific Features for Movie Management', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), International Conference on DASFAA, Springer Berlin Heidelberg, Busan, South Korea, pp. 314-318.
View/Download from: Publisher's site
View description>>
Among many video types, movie content indexing and retrieval is a significantly challenging task because of the wide variety of shooting techniques and the broad range of genres. A movie consists of a series of video shots. Managing a movie at shot level provides a feasible way for movie understanding and summarization. Consequently, an effective shot classification is greatly desired for advanced movie management. In this demo, we explore novel domain specific features for effective shot classification. Experimental results show that the proposed method classifies movie shots from wide range of movie genres with improved accuracy compared to existing work. © 2012 Springer-Verlag.
Hu, L, Cao, J, Xu, G & Gu, Z 2012, 'Latent informative links detection', Frontiers in Artificial Intelligence and Applications, 16th International Conference on Knowledge-Based and Intelligent Information & Engineering Systems, IOS Press, San Sebastian, Spain, pp. 1233-1242.
View/Download from: Publisher's site
View description>>
Sometimes, explicit relationships between entities do not provide sufficient information or can be unavailable in the real world. Unseen latent relationships may be more informative than explicit relationships. Thus, we provide a method for constructing latent informative links between entities, using their common features, where entities are regarded as vertices on a graph. First, we employ a hierarchical nonparametric model to infer shared latent features for entities. Then, we define a filter function based on information theory to extract significant features and control the density of links. Finally, a couple of stochastic interaction processes are introduced to simulate dynamics on the networks so that link strength can be retrieved from statistics in a natural way. In experiments, we evaluate the usage of the filter function. The results of two examples based on mixture networks show how our method is capable of providing latent informative relationships in comparison to explicit relationships. © 2012 The authors and IOS Press. All rights reserved.
Zuo, H & Zhang, G-L 2012, 'An analysis of solutions for fuzzy multi-objective linear programming problem', 2012 International Conference on Machine Learning and Cybernetics, 2012 International Conference on Machine Learning and Cybernetics (ICMLC), IEEE.
View/Download from: Publisher's site
Zheng, J-R, Zhang, G-L & Zuo, H 2012, 'Hybrid linear and nonlinear weight Particle Swarm Optimization algorithm', 2012 International Conference on Machine Learning and Cybernetics, 2012 International Conference on Machine Learning and Cybernetics (ICMLC), IEEE.
View/Download from: Publisher's site
Juszczyszyn, K, Gonczarek, A, Tomczak, JM, Musial, K & Budka, M 2012, 'A Probabilistic Approach to Structural Change Prediction in Evolving Social Networks', 2012 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining, 2012 International Conference on Advances in Social Networks Analysis and Mining (ASONAM 2012), IEEE, Kadir Has Univ, Istanbul, Turkey, pp. 996-1001.
View/Download from: Publisher's site
Kamaleswaran, R & McGregor, C 2012, 'CBPsp: Complex Business Processes for Stream Processing', 2012 25th IEEE Canadian Conference on Electrical and Computer Engineering (CCECE), 2012 25th IEEE Canadian Conference on Electrical and Computer Engineering (CCECE), IEEE.
View/Download from: Publisher's site
View description>>
This paper presents an extension called the Complex Business Processes for Stream Processing (CBPsp) to the Solution Manager Service (SMS) framework to support the definition and enactment of complex business processes for event stream processing. The critical care of an infant involves multiple caregivers performing complex activities; thus, a system that is capable of presenting complex business processes to produce context-sensitive real-time support is required. The proposed system, CBPsp, supports the integration of heterogeneous sequential business processes and distinct data types to produce meaningful business-objective-driven outputs in real-time. Two research contributions are delivered. The first contribution is the real-time integration of synchronous and asynchronous streams in a loosely coupled model based on Service-Oriented Architecture principles. The second contribution is the definition and enactment of complex business processes along with their meaningful business objectives at the point of analysis within data stream management systems. © 2012 IEEE.
Kamaleswaran, R, McGregor, C & James, A 2012, 'A novel framework for event stream processing of clinical practice guidelines', Proceedings of 2012 IEEE-EMBS International Conference on Biomedical and Health Informatics, 2012 IEEE-EMBS International Conference on Biomedical and Health Informatics (BHI), IEEE, pp. 933-936.
View/Download from: Publisher's site
View description>>
Clinical Decision Support Systems (CDSSs) play important roles aiding in patient care; they provide accurate data analysis and timely evidence-informed recommendations. Although the availability of biomedical data continues to flourish, there have been limited translations of this type of data to information in real-time at the bedside. Existing systems have focused on providing either process-oriented or knowledge-modeled frameworks, often relying on retrospective data analysis. We have developed a framework capable of providing clinicians the ability to represent existing knowledge and processes in real-time. This framework presents a real-time environment for modeling clinical workflow processes abstracted from clinical guidelines, while applying existing knowledge to produce intelligent evidence-informed recommendations. In this paper we provide a framework to support the detection of neonatal hypoglycaemia using a design supporting the automated real-time, evidence-informed enactment of complex business processes existing in clinical practice guidelines. © 2012 IEEE.
Kamaruddin, LA, Shen, J & Beydoun, G 2012, 'Evaluating Usage of WSMO and OWL-S in Semantic Web Services.', APCCM, Conferences in Research and Practice in Information Technology (CRPIT), Australian Computer Society, Melbourne, Australia, pp. 53-58.
View description>>
Applying ontologies is the most promising approach to semantically enriching Web services. To facilitate this, two efforts have contributed the most to enabling the creation of ontologies: OWL-S from the US and WSMO from Europe. These two compete and promote their ontologies from the design perspective, reflecting their inventors' bias but not offering much help to the Web service developers using them. To bypass existing biases and enable evaluation of ontologies expressed in these two languages, this paper provides a study of the two important facilitators, OWL-S and WSMO, surveying their usage in several SWS projects and identifying their respective outstanding gaps. The paper then proposes a set of evaluation criteria for usage measurement of the two prominent SWS ontologies. © 2012, Australian Computer Society, Inc.
Kennard, R & Leaney, J 2012, 'An Introduction to Software Mining', NEW TRENDS IN SOFTWARE METHODOLOGIES, TOOLS AND TECHNIQUES, 11th International Conference on Intelligent Software Methodologies, Tools, and Techniques (SoMeT), IOS PRESS, Genoa, ITALY, pp. 312-323.
View/Download from: Publisher's site
Li, J & Tao, D 2012, 'Sampling Normal Distribution Restricted on Multiple Regions.', ICONIP (1), International Conference on Neural Information Processing, Springer, Doha, Qatar, pp. 492-500.
View/Download from: Publisher's site
View description>>
We develop an accept-reject sampler for probability densities that have a form similar to a normal density function but are supported on restricted regions. Compared to existing techniques, the proposed method deals with multiple disjoint regions, truncated on one or both sides. For the original problem of sampling from one region, the efficiency is enhanced as well. We verify the desirable attributes of the proposed algorithm by both theoretical analysis and simulation studies. © 2012 Springer-Verlag.
Li, Y, Liu, B, Rong, B, Wu, Y, Gagnon, G, Gui, L & Zhang, W 2012, 'On the performance of LDPC-RS product codes for mobile DTV', IEEE International Symposium on Broadband Multimedia Systems and Broadcasting, 2012 IEEE International Symposium on Broadband Multimedia Systems and Broadcasting (BMSB), IEEE, Yonsei Univ, Seoul, SOUTH KOREA.
View/Download from: Publisher's site
Liang, J, Hua, J, Huang, ML, Nguyen, QV & Simoff, S 2012, 'Rectangle orientation in area judgment task for treemap design', Proceedings of the 24th Australian Computer-Human Interaction Conference, OzCHI '12: The 24th Australian Computer-Human Interaction Conference, ACM, Melbourne, VIC, Australia, pp. 349-352.
View/Download from: Publisher's site
View description>>
Prior work on treemaps has mainly focused on developing new layouts. The existing treemaps generated from various algorithms require careful examination of design parameters. However, current research does not provide usability studies or treemap guidelines on the effectiveness of design parameters. Hence, selecting the most effective parameter for a certain type of task is based primarily on the intuition and preference of the visualization designer. For example, existing research offers insufficient guidance on orientation for treemap design, so the impact of orientation on performance in visual analysis tasks remains unclear. The contribution of this paper is to assess the effect of orientation in the visual data analysis process so that we can further investigate treemap design guidance. © 2012 ACM.
Liang, J, Huang, ML & Nguyen, QV 2012, 'Perceptual User Study for Combined Treemap', 2012 11th International Conference on Machine Learning and Applications, 2012 Eleventh International Conference on Machine Learning and Applications (ICMLA), IEEE, Boca Raton, FL, USA, pp. 300-305.
View/Download from: Publisher's site
View description>>
Space-filling visualization techniques have proved their capability in visualizing large hierarchical structured data. However, most existing techniques restrict their partitioning process to the vertical and horizontal directions only, which causes problems in identifying hierarchical structures. According to Gestalt research, limiting treemap visualisation to rectangles blocks the utilisation of human capability in object recognition, because all angles of the shapes in the tree visualisation have the same fixed size (90 degrees). However, this assertion was supported only by theory and not rooted in empirical perception data. We conducted a series of controlled experiments to investigate the effect of shape variation of data elements and containers in the visual data analysis process. We first studied how shape variation affects the user's perception in the visual data analysis process. We compared the combined treemap with traditional rectangular treemaps, slice & dice treemaps and squarified treemaps. Finally, we demonstrated the effect of the new approach, which combines rectangular and non-rectangular treemaps, and validated the method based on the empirical results. © 2012 IEEE.
Liang, J, Nguyen, QV, Simoff, S & Huang, ML 2012, 'Angular Treemaps - A New Technique for Visualizing and Emphasizing Hierarchical Structures', 2012 16th International Conference on Information Visualisation, 2012 16th International Conference on Information Visualisation (IV), IEEE, Montpellier, France, pp. 74-80.
View/Download from: Publisher's site
View description>>
Space-filling visualization techniques have proved their capability in visualizing large hierarchical structured data. However, most existing techniques restrict their partitioning process to the vertical and horizontal directions only, which causes problems in identifying hierarchical structures. This paper presents a new space-filling method named Angular Treemaps that relaxes the constraint of rectangular subdivision. Angular Treemaps use a divide-and-conquer paradigm to visualize and emphasize large hierarchical structures within a compact and limited display area with better interpretability. They generate various layouts to highlight hierarchical sub-structures based on the user's preferences or system recommendations, and offer the flexibility to be adopted in a wider range of applications with different enclosing shapes. Preliminary usability results suggest that users' performance in locating and identifying categorized analysis tasks is improved by using this technique. © 2012 IEEE.
Lin, C-T, Wang, Y-K & Chen, S-A 2012, 'A hierarchal classifier for identifying independent components', The 2012 International Joint Conference on Neural Networks (IJCNN), 2012 International Joint Conference on Neural Networks (IJCNN 2012 - Brisbane), IEEE.
View/Download from: Publisher's site
View description>>
Brain-computer interface (BCI) research has shown explosive growth for multiple applications in recent years. Removing artifacts and selecting useful brain sources are essential in BCI research. Independent Component Analysis (ICA) has been proven an effective technique to remove artifacts, and much brain-related research is based on ICA. However, the useful independent components with brain sources are usually selected manually according to the scalp plots. This is a great inconvenience and a barrier for real-time BCI applications of EEG. In this investigation, a two-layer automatic identification model is proposed to select useful brain sources. It is based on neural networks, including a support vector machine with radial basis function (SVMRBF) and a self-organizing map (SOM). In the first layer, the SVM effectively discriminates useful independent components from artifacts. In the second layer, these selected useful components are automatically classified to different spatial brain sources according to the SOM. This study suggests this model as a general approach for EEG study. It can reduce the effect of subjective judgment and improve the performance of EEG analysis. © 2012 IEEE.
Lin, H & Zhang, G 2012, 'Risk Prediction Framework and Model for Bank External Fund Attrition', 14th International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems (IPMU), International Conference on Information Processing and Management of Uncertainty, Springer Berlin Heidelberg, Catania, Italy, pp. 170-180.
View/Download from: Publisher's site
View description>>
Customer Attrition is a function of customer transaction and service related characteristics and also a combination of cancellation and switching to a competitor. This paper first presents a risk prediction framework for bank customer attrition. A risk prediction approach and a combined sporadic risk prediction model are then proposed to support decision making of financial managers. Real world experiments validate the proposed framework, approach and model and show the positive results for bank customer attrition prediction and marketing decision making.
Linares-Mustarós, S, Merigó, JM & Ferrer-Comalat, JC 2012, 'A Method for Uncertain Sales Forecast by Using Triangular Fuzzy Numbers', MODELING AND SIMULATION IN ENGINEERING, ECONOMICS, AND MANAGEMENT, MS 2012, International Conference of Modeling and Simulation in Engineering, Economics, and Management, Springer Berlin Heidelberg, New Rochelle, NY, pp. 98-113.
View/Download from: Publisher's site
Lister, R, Corney, M, Curran, J, D'Souza, D, Fidge, C, Gluga, R, Hamilton, M, Harland, J, Hogan, J, Kay, J, Murphy, T, Roggenkamp, M, Sheard, J, Simon & Teague, D 2012, 'Toward a shared understanding of competency in programming: An invitation to the BABELnot project', Conferences in Research and Practice in Information Technology Series, Australasian Computing Education Conference, Australian Computer Society Inc, Melbourne, Australia, pp. 53-60.
View description>>
The ICT degrees in most Australian universities have a sequence of up to three programming subjects, or units. BABELnot is an ALTC-funded project that will document the academic standards associated with those three subjects in the six participating universities and, if possible, at other universities. This will necessitate the development of a rich framework for describing the learning goals associated with programming. It will also be necessary to benchmark exam questions that are mapped onto this framework. As part of the project, workshops are planned for ACE 2012, ICER 2012 and ACE 2013, to elicit feedback from the broader Australasian computing education community, and to disseminate the project's findings. The purpose of this paper is to introduce the project to that broader Australasian computing education community and to invite their active participation.
Liu, L, Fan, D, Liu, M, Xu, G, Chen, S, Zhou, Y, Chen, X, Wang, Q & Wei, Y 2012, 'A MapReduce-Based Parallel Clustering Algorithm for Large Protein-Protein Interaction Networks', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), International Conference on Advanced Data Mining and Applications, Springer Berlin Heidelberg, Nanjing, China, pp. 138-148.
View/Download from: Publisher's site
View description>>
Clustering proteins or identifying functionally related proteins in Protein-Protein Interaction (PPI) networks is one of the most computation-intensive problems in the proteomic community. Most research has focused on improving the accuracy of the clustering algorithms. However, the high computation cost of these clustering algorithms, such as Girvan and Newman's clustering algorithm, has been an obstacle to their use on large-scale PPI networks. In this paper, we propose an algorithm, called Clustering-MR, to address the problem. Our solution can effectively parallelize Girvan and Newman's clustering algorithm based on edge-betweenness using MapReduce. We evaluated the performance of our Clustering-MR algorithm in a cloud environment with different sizes of testing datasets and different numbers of worker nodes. The experimental results show that our Clustering-MR algorithm can achieve high performance for large-scale PPI networks with more than 1000 proteins or 5000 interactions. © Springer-Verlag 2012.
Loke, L, Khut, GP & Kocaballi, AB 2012, 'Bodily experience and imagination', Proceedings of the Designing Interactive Systems Conference, DIS '12: Designing Interactive Systems Conference 2012, ACM, Newcastle, UK, pp. 779-788.
View/Download from: Publisher's site
View description>>
We are exploring new possibilities for bodily-focused aesthetic experiences within participatory live-art contexts. As artist-researchers, we are interested in how we can understand and shape bodily experience and imagination as primary components of an interactive aesthetic experience, sonically mediated by digital biofeedback technologies. Through the making of a participatory live-art installation, we illustrate how we used the Bodyweather performance methodology to inform the design of ritual interactions intended to reframe the audience experience of self, body and the world through imaginative processes of scaling and metaphor. We report on the insights into the varieties of audience experience gathered from audience testing of the prototype artwork, with a particular focus on the relationship between the embodied imagination and felt sensation; the influence of objects and costume; and the sonically mediated experience of physiological processes of breathing and heartbeat. We offer some reflections on the use of ritual and scripted interactions as a strategy for facilitating coherent forms of bodily experience.
Long, G, Chen, L, Zhu, X & Zhang, C 2012, 'TCSST', Proceedings of the 21st ACM international conference on Information and knowledge management, CIKM'12: 21st ACM International Conference on Information and Knowledge Management, ACM, Maui, Hawaii, USA, pp. 764-772.
View/Download from: Publisher's site
View description>>
Short & sparse text is becoming more prevalent on the web, such as search snippets, micro-blogs and product reviews. Accurately classifying short & sparse text has emerged as an important yet challenging task. Existing work has considered utilizing external data (e.g. Wikipedia) to alleviate data sparseness, by appending topics detected from external data as new features. However, training a classifier on features concatenated from different spaces is not easy considering the features have different physical meanings and different significance to the classification task. Moreover, it exacerbates the 'curse of dimensionality' problem. In this study, we propose a transfer classification method, TCSST, to exploit the external data to tackle the data sparsity issue. The transfer classifier will be learned in the original feature space. Considering that the labels of the external data may not be readily available or sufficient, TCSST further exploits the unlabeled external data to aid the transfer classification. We develop novel strategies to allow TCSST to iteratively select high quality unlabeled external data to help with the classification. We evaluate the performance of TCSST on both benchmark as well as real-world data sets. Our experimental results demonstrate that the proposed method is effective in classifying very short & sparse text, consistently outperforming existing and baseline methods. © 2012 ACM.
Lu, LF, Huang, ML, Chen, YW, Liang, J & Nguyen, QV 2012, 'Clutter Reduction in Multi-dimensional Visualization of Incomplete Data Using Sugiyama Algorithm', 2012 16th International Conference on Information Visualisation, 2012 16th International Conference on Information Visualisation (IV), IEEE, Montpellier, France, pp. 93-99.
View/Download from: Publisher's site
View description>>
Visualization of uncertainty in datasets is a new field of research, which aims to represent incomplete data for analysis in real scenarios. In many cases, datasets, especially multi-dimensional datasets, often contain either errors or uncertain values. To address this challenge, we may treat these uncertainties as scalar values like probability. For visual representation in parallel coordinates, we draw a small 'circle' to temporarily define a dummy vertex for an uncertain value of a data item, at the crossing point between polylines and the axis of certain dimension. Furthermore, these temporary positions of uncertainty could be permuted to achieve visual effectiveness. This feature provides a great opportunity by optimizing the order of uncertain values to tackle another important challenge in information visualization: clutter reduction. Visual clutter always obscures the visualizing structure even in small datasets. In this paper, we apply Sugiyama's layered directed graph drawing algorithm into parallel coordinates visualization to minimize the number of edge crossing among polylines, which has significantly improved the readability of visual structure. Experiments in case studies have shown the effectiveness of our new methods for clutter reduction in parallel coordinates visualization. These experiments also imply that besides visual clutter, the number of uncertain values and the type of multi-dimensional data are important attributes that affect visualization performance in this field. © 2012 IEEE.
Lu, P, Lu, J & Zhang, G 2012, 'Case-Base Maintenance for Concept Drift', WASET Issue 72: Proceedings of the International Conference on Information Systems, International Conference on Information Systems, WASET, Penang, Malaysia, pp. 333-340.
View description>>
Given the evolving nature of a real data stream, so-called concept drift, and its accumulating volume, any deployed Case-Based Reasoning (CBR) system will need to have procedures for Case-Base Maintenance (CBM). Traditional CBM methods for handling concept drift cannot distinguish well between noise and true concept drift, and some of them may even run the risk of losing case-base competence. Motivated by these two problems, we present a two-stage CBM approach. As Stage 1, we propose a Noise-Enhanced Fast Context Switch (NEFCS) algorithm, which targets noise removal under a dynamic environment. As Stage 2, we propose a Recursive Conservative Redundancy Removal (RCRR) algorithm, which removes redundant cases in a recursively uniform way while preserving case-base coverage. Experimental evaluations on real-world datasets show that our approach significantly improves performance compared with other CBM approaches.
Luo, Z, Hu, Z, Song, Y, Xu, Z, Liu, H, Jia, L & Lu, H 2012, 'Economic analyses of plug-in electric vehicle battery providing ancillary services', 2012 IEEE International Electric Vehicle Conference, 2012 IEEE International Electric Vehicle Conference (IEVC), IEEE, Greenville, SC, USA.
View/Download from: Publisher's site
View description>>
This paper explores the potential financial return for using plug-in electric vehicles (PEVs) as a grid resource. There are two methods for PEVs to provide ancillary services, called interruptible load and vehicle-to-grid (V2G). The contract market is introduced first; then a method to calculate the cost benefit of PEVs providing ancillary services is proposed. Additionally, the expected profits and the profits of providing the ancillary services when considering the uncertainties of driving behaviors are both calculated and compared. The calculation results indicate that profits from participating in frequency regulation are higher than those from reserve services. When the penalty is neglected or the penalty coefficient is low, the revenue of regulation-down services is relatively high. However, as the penalty factor increases, the profits decrease dramatically. When the penalty coefficient is sufficiently high, participating in regulation-up services in V2G mode is most profitable. © 2012 IEEE.
McGregor, C, Catley, C & James, A 2012, 'Variability analysis with analytics applied to physiological data streams from the neonatal intensive care unit', 2012 25th IEEE International Symposium on Computer-Based Medical Systems (CBMS), 2012 25th IEEE International Symposium on Computer-Based Medical Systems (CBMS), IEEE.
View/Download from: Publisher's site
View description>>
Late onset neonatal sepsis (LONS) is one clinical condition that shows promise for earlier onset detection through the analysis of physiological signals. However, current work on Heart Rate Variability (HRV) analysis does not discuss the impact of narcotics and other drugs on early identification of sepsis. We present results of a pilot retrospective data mining study of neonatal intensive care unit patients using a dataset of 30 second spot readings. We derive analytics by creating temporal abstractions of hourly summaries for HRV and respiratory rate variability (RRV). Using representative patient examples, we illustrate an analytics user interface design that shows 1) the potential in using our HRV analytics for early identification of LONS with 30 second spot readings; and 2) that based on initial pilot results, reporting analytics for HRV and RRV concurrently adds value to HRV analysis by distinguishing between patients with low HRV due to imminent sepsis and those patients with low HRV due to the presence of confounding factors such as surgery and narcotics. © 2012 IEEE.
McGregor, C, Steadman, A, Percival, J & James, A 2012, 'A Method for Modeling Health Informatics Capacity in Patient Journeys Supported by Interprofessional Teams', 2012 45th Hawaii International Conference on System Sciences, 2012 45th Hawaii International Conference on System Sciences (HICSS), IEEE, pp. 2790-2799.
View/Download from: Publisher's site
View description>>
Neonatal intensive care is one of the most complex areas of healthcare; as a result, information flow within this unit must be as efficient as possible. This paper presents initial research findings based on the use of the patient journey modeling technique known as PaJMa to audit the current state of health informatics within NICUs in Canada and internationally. In this paper, a case study from an Ontario NICU is utilized and their "Investigations" process is modeled using PaJMa. © 2012 IEEE.
Mearns, H, Leaney, J, Parakhine, A, Debenham, J & Verchere, D 2012, 'CARMA: Complete Autonomous Responsible Management Agents for Telecommunications and Inter-cloud Services', 2012 IEEE NETWORK OPERATIONS AND MANAGEMENT SYMPOSIUM (NOMS), IEEE Network Operations and Management Symposium, IEEE, Maui, USA, pp. 1089-1095.
View/Download from: Publisher's site
View description>>
The continuing rise in telecommunication and cloud services usage is matched by an increased complexity in maintaining adequate performance management. To combat this complexity, researchers and telecommunication companies are exploring a variety of management strategies to leverage their individual infrastructures to provide better performance and increased utilisation. We extend these strategies by addressing the complexities that arise through the interaction of multiple telecommunication and cloud providers when providing a modern complex service. Our overall aim is for the management to accept responsibility for the complex service in an open marketplace. Responsibility is, firstly, defined by aiming to cover the totality of modern complex services, managing both the connectivity and virtual infrastructure. Secondly, responsibility is defined as managing risk and resilience in the provisioning and operation of the complex service. With these aims, we are working towards a bundled service provider agent architecture, which can negotiate on the open service market. This approach also aims to optimise the utilisation of the providers' infrastructure while reducing the risk of failure to users through total service management. We present the specification, design and simulation of the Complete Autonomous Responsible Management Agents (CARMA) in a marketplace environment. © 2012 IEEE.
Memon, T, Lu, J & Hussain, FK 2012, 'Semantic De-biased Associations (SDA) Model to Improve Ill-Structured Decision Support', NEURAL INFORMATION PROCESSING, ICONIP 2012, PT II, International Conference on Neural Information Processing, Springer Verlag, Doha, Qatar, pp. 483-490.
View/Download from: Publisher's site
Meng, Q & Kennedy, PJ 2012, 'Determining the Number of Clusters in Co-authorship Networks Using Social Network Theory', 2012 Second International Conference on Cloud and Green Computing, 2012 International Conference on Cloud and Green Computing (CGC), IEEE, Xiangtan, Hunan, China, pp. 337-343.
View/Download from: Publisher's site
View description>>
Spectral clustering is a modern data clustering methodology with many notable advantages. However, this method has a weakness in that it requires researchers to specify a priori the number of clusters. In most cases, it is a challenge to know the number of clusters accurately. Here, we propose a novel way to solve this problem by involving the concept of group leaders and members from social network theory. From the perspective of social networks, groups are organized by leaders, and this can provide a hint for finding the number of clusters in social networks by identifying group leaders. However, because a group can have more than one leader, we also propose an algorithm to combine leaders from the same group. The number of leaders after the combination is expected to be the number of clusters in a network. We validate this proposed approach by using spectral clustering to cluster data comprising the co-authorship network from the University of Technology, Sydney (UTS). The experimental results show that our proposed method is effective in determining the number of clusters and can facilitate spectral clustering to achieve better clusters compared with other methods of calculating the number of clusters.
Meng, Q & Kennedy, PJ 2012, 'Using network evolution theory and singular value decomposition method to improve accuracy of link prediction in social networks', Conferences in Research and Practice in Information Technology Series, Australian Data Mining Conference, Australian Computer Society, Sydney, pp. 175-181.
View description>>
Link prediction in large networks, especially social networks, has received significant recent attention. Although there are many papers contributing methods for link prediction, the accuracy of most predictors is generally low as they treat all nodes equally. We propose an effective approach to identifying the level of activities of nodes in networks by observing their behaviour during network evolution. It is clear that nodes that have been active previously contribute more to the changes in a network than stable nodes, which have low activity. We apply truncated singular value decomposition (SVD) to exclude the interference of stable nodes by treating them as noise in our dataset. Finally, in order to test the effectiveness of our proposed method, we use co-authorship networks from an Australian university from between 2006 and 2011 as an experimental dataset. The results show that our proposed method achieves higher accuracy in link prediction than previous methods, especially in predicting new links.
Merigo, JM 2012, 'Decision making in complex environments with generalized aggregation operators', 2012 IEEE Conference on Computational Intelligence for Financial Engineering & Economics (CIFEr), 2012 IEEE Conference on Computational Intelligence for Financial Engineering & Economics (CIFEr), IEEE, New York, NY, pp. 113-119.
View/Download from: Publisher's site
MERIGÓ, JM 2012, 'DECISION MAKING IN COMPLEX ENVIRONMENTS WITH UNCERTAIN GENERALIZED UNIFIED AGGREGATION OPERATORS', UNCERTAINTY MODELING IN KNOWLEDGE ENGINEERING AND DECISION MAKING, 10th International Conference on Fuzzy Logic and Intelligent Technologies in Nuclear Science (FLINS), WORLD SCIENTIFIC, Istanbul, TURKEY, pp. 357-862.
View/Download from: Publisher's site
Merigó, JM 2012, 'Measuring Errors with the OWA Operator', MODELING AND SIMULATION IN ENGINEERING, ECONOMICS, AND MANAGEMENT, MS 2012, International Conference of Modeling and Simulation in Engineering, Economics, and Management, Springer Berlin Heidelberg, New Rochelle, NY, pp. 24-33.
View/Download from: Publisher's site
Merigo, JM & Casanovas, M 2012, 'Linguistic decision making with probabilistic information and induced aggregation operators', 2012 IEEE Conference on Computational Intelligence for Financial Engineering & Economics (CIFEr), 2012 IEEE Conference on Computational Intelligence for Financial Engineering & Economics (CIFEr), IEEE, New York, NY, pp. 25-31.
View/Download from: Publisher's site
Merigó, JM & Gil-Lafuente, AM 2012, 'Complex Group Decision Making under Risk and Uncertainty', Studies in Fuzziness and Soft Computing, Springer Berlin Heidelberg, pp. 81-93.
View/Download from: Publisher's site
View description>>
We develop a new method for group decision making under risk and uncertain environments. We introduce the uncertain induced generalized probabilistic ordered weighted averaging weighted average (UIGPOWAWA) operator. It is an aggregation operator that unifies the probabilistic aggregation, the weighted average and the ordered weighted average in the same formulation and considering the degree of importance that each concept has in the analysis. It also deals with uncertain environments where the information is imprecise and can be assessed with interval numbers. Moreover, it deals with complex attitudinal characters represented with order inducing variables and generalizes the aggregation with generalized means. We study some of its main properties and develop an application in a group decision making problem concerning the selection of the optimal strategies. © 2012 Springer-Verlag Berlin Heidelberg.
Merigó, JM & Xu, Y 2012, 'Induced and Heavy Aggregation Operators', Uncertainty Modeling in Knowledge Engineering and Decision Making, 10th International Conference on Fuzzy Logic and Intelligent Technologies in Nuclear Science (FLINS), World Scientific, Istanbul, Turkey, pp. 812-817.
View/Download from: Publisher's site
Mir, S, Hawryszkiewycz, I & Zowghi, D 2012, 'A method to explore wicked problems in complex environments: A research proposal', Innovation Vision 2020: Sustainable growth, Entrepreneurship, and Economic Development - Proceedings of the 19th International Business Information Management Association Conference, International Business Information Management, International Business Information Management Association (IBIMA), Barcelona, Spain, pp. 1958-1962.
View description>>
This research proposes a method to investigate wicked problems occurring in open, heterogeneous and evolving complex socio-technical ecosystems. Investigating the behaviour of these environments and predicting the consequences of proposed solutions remain challenging. The proposed method addresses the causes of complexity based on interdependencies among multiple types of elements in the system. The issues can be investigated through behavioural and functional change analysis of any individuals, activities, interactions and interventions involved in the system. The context of the problem space is described as a network of actors involved in the system. This research extends actors to roles, activities, artefacts, groups or organizations. The new aspect of this approach is its flexibility to investigate the multiplicity and diversity of perspectives in complex environments and its potential to involve human elements. This method has the ability to integrate qualitative social elements with quantitative technical issues in complex socio-technical environments. Agent-based simulation facilitates this approach.
Mir, S, Hawryszkiewycz, I & Zowghi, D 2012, 'Toward a methodology for managing complexity in information systems development projects', Innovation Vision 2020: Sustainable growth, Entrepreneurship, and Economic Development - Proceedings of the 19th International Business Information Management Association Conference, International Business Information Management, International Business Information Management Association (IBIMA), Barcelona, Spain, pp. 1945-1950.
View description>>
This research is motivated by the need to develop a systematic approach to effectively manage complexity in information systems development processes, which remains an open and critical issue as environments become increasingly complex. Managing complexity is critical because there is a need to continually develop new ideas in design changes. For this reason, a conceptual method with an emphasis on supporting the decision-making process is developed. This method uses ideas from systems thinking and management to study complexity issues in information systems development processes. To support decision makers, this research develops guidelines for dealing with systems complexity using living systems theory. Living systems theory is used as a diagnostic tool to find important elements that need more attention and to investigate the different kinds of uncertainty and conflict that exist during decision making.
Mir, S, Hawryszkiewycz, IT & Zowghi, D 2012, 'A Method to Explore Wicked Problems in Complex Environments: A Research Proposal', Innovation Vision 2020: Sustainable growth, Entrepreneurship, and Economic Development, International Business Information Management, International Business Information Management Association (IBIMA), Barcelona, Spain, pp. 1951-1955.
View description>>
This research proposes a method to investigate wicked problems occurring in open, heterogeneous and evolving complex socio-technical ecosystems. Investigating the behaviour of these environments and predicting the consequences of proposed solutions remain challenging. The proposed method addresses the causes of complexity based on interdependencies among multiple types of elements in the system. The issues can be investigated through behavioural and functional change analysis of any individuals, activities, interactions and interventions involved in the system. The context of the problem space is described as a network of actors involved in the system. This research extends actors to roles, activities, artefacts, groups or organizations. The new aspect of this approach is its flexibility to investigate the multiplicity and diversity of perspectives in complex environments and its potential to involve human elements. This method has the ability to integrate qualitative social elements with quantitative technical issues in complex socio-technical environments. Agent-based simulation facilitates this approach.
Mueller, F, Toprak, C, Graether, E, Walmink, W, Bongers, B & van den Hoven, E 2012, 'Hanging off a bar', CHI '12 Extended Abstracts on Human Factors in Computing Systems, CHI '12: CHI Conference on Human Factors in Computing Systems, ACM, Austin, TX, USA, pp. 1055-1058.
View/Download from: Publisher's site
View description>>
Exertion Games involve physical effort and as a result can facilitate physical health benefits. We present Hanging off a Bar, an action hero-inspired Exertion Game in which players hang off an exercise bar over a virtual river for as long as possible. Initial observations from three events with audiences ranging from the general public to expert game designers suggest that Hanging off a Bar can be engaging for players and facilitate intense exertion within seconds. Furthermore, we collected suggestions for what game elements players believe could entice them to increase their physical effort investment. These suggestions, combined with Hanging off a Bar as research vehicle due to the easy measurement of exertion through hanging time, enable future explorations into the relationship between digital game elements and physical exertion, guiding designers on how to support exertion in digital games.
Murphy, L, Fitzgerald, S, Lister, R & McCauley, R 2012, 'Ability to 'explain in plain English' linked to proficiency in computer-based programming', Proceedings of the ninth annual international conference on International computing education research, ICER '12: International Computing Education Research Conference, ACM, Auckland, New Zealand, pp. 111-118.
View/Download from: Publisher's site
View description>>
This study investigates the relationship between novice programmers' ability to explain code segments and their ability to write code. Results show a strong correlation between the ability to correctly answer 'explain in plain English' (EiPE) questions and the ability to write code, indicating that there are aspects of reasoning about code that are common to both writing code and explaining code. Student explanations were categorized using the Structure of the Observed Learning Outcome (SOLO) taxonomy. The better programmers were more likely to articulate relational aspects of the algorithms. While earlier work also found such a link, the code writing in those earlier studies was done on paper. This is the first such result where the writing component was done with 'hands on' a computer. Our results add further evidence for the existence of an aspect of reasoning about code that is common to both explaining code and writing code, which in turn suggests that a judicious mix of teaching both code-writing skills and code-explaining skills may lead to a more effective process by which novices learn to reason about code.
Musial, K & Sastry, N 2012, 'Social media', Proceedings of the Fourth Annual Workshop on Simplifying Complex Networks for Practitioners, SIMPLEX '12: Simplifying Complex Networks for Practitioners, ACM, pp. 1-6.
View/Download from: Publisher's site
View description>>
On many social media and user-generated content sites, users can not only upload content but also create links with other users to follow their activities. It is interesting to ask whether the resulting user-user Followers' Network is based more on social ties or on shared interests in similar content. This paper reports our preliminary progress in answering this question using around five years of data from the social video-sharing site vimeo. Many links in the Followers' Network are between users who do not have any videos in common, which would imply the network is not interest-based but rather has a social character. However, the Followers' Network also exhibits properties unlike other social networks: for instance, the clustering coefficient is low, links are frequently not reciprocated, and users form links across vast geographical distances. In addition, analysis of relationship strength, calculated as the number of commonly liked videos, shows that people who follow each other and share some "likes" have more video likes in common than the general population. We conclude by speculating on the reasons for these differences and proposing further work. © 2012 ACM.
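The low clustering coefficient and unreciprocated links noted above are simple to measure. A minimal editor's sketch, using an invented toy edge list rather than the vimeo data analysed in the paper:

```python
from itertools import combinations

# Hypothetical directed follower graph: (u, v) means u follows v.
edges = {("a", "b"), ("b", "a"), ("a", "c"), ("c", "d"), ("d", "a")}

def reciprocity(edges):
    """Fraction of directed edges whose reverse edge also exists."""
    return sum((v, u) in edges for u, v in edges) / len(edges)

def local_clustering(node, edges):
    """Clustering coefficient of `node` over its undirected neighbourhood."""
    nbrs = {v for u, v in edges if u == node} | {u for u, v in edges if v == node}
    if len(nbrs) < 2:
        return 0.0
    # Count neighbour pairs connected in either direction.
    links = sum(
        1
        for x, y in combinations(nbrs, 2)
        if (x, y) in edges or (y, x) in edges
    )
    return links / (len(nbrs) * (len(nbrs) - 1) / 2)
```

On this toy graph only the a-b link is reciprocated (reciprocity 0.4), illustrating the kind of asymmetry the paper reports for the Followers' Network.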
Oberst, S & Lai, JCS 2012, 'Analysis of disc brake squeal: Progress and challenges', 19th International Congress on Sound and Vibration 2012, ICSV 2012, pp. 2874-2881.
View description>>
Brake squeal noise has been the subject of intense research efforts owing to concerns of car manufacturers caused by complaints lodged and warranty claim related costs arising from dissatisfied customers. Brake squeal is known to be fugitive, and often not repeatable, even under apparently similar operating conditions. The production of brake squeal is dependent on a large number of interacting parameters, such as the mechanical properties of the brake lining materials, contact conditions, wear, operating pressure and temperature which contribute to its often observed nonrepeatability. In this paper, an overview of the state-of-the-art understanding of brake squeal mechanisms and numerical analysis methods (primarily based on finite element analysis) for the prediction of brake squeal propensity is presented. The question of nonlinearity of brake squeal is raised in terms of analysing the mechanisms and how present solution methods reflect this degree of nonlinearity. This is complemented by a description of current industrial practice in the treatment of brake squeal which is, generally, managed on a case-by-case, trial-and-error basis using expensive equipment and time-consuming noise dynamometer and/or on-vehicle tests. The gaps between theory and industrial practice and, hence, challenges for brake squeal research are identified. Recommendations for bridging these gaps and improving the usefulness of current numerical methods for practical industrial use are proposed.
Oberst, S & Lai, JCS 2012, 'The role of nonlinearity in disc brake squeal', Proceedings - European Conference on Noise Control, pp. 1334-1339.
View description>>
The prediction of disc brake squeal propensity remains difficult despite significant progress made in the last two decades towards understanding its nature. Most of the numerical analysis of brake squeal is based on linear methods that have found some success in guiding the development of brakes in industry. One popular approach is the complex eigenvalue analysis using finite element models to predict unstable vibration modes. However, the complex eigenvalue analysis may over-predict or under-predict the number of unstable vibration modes and not all predicted unstable vibration modes will result in squeal. Therefore, extensive brake testing in noise dynamometers is required in order to ensure that the noise performance of brakes is acceptable. Although the analysis of brake squeal propensity is primarily based on linear approaches, it has been recognised that the operation of a brake contains a number of nonlinearities such as the excitation through the friction contact between the disc and pad, material properties, and operating conditions. The purpose of this paper is to provide an overview on nonlinearity as one mechanism of the cause of brake squeal and to discuss how such knowledge could be used to develop alternative strategies in numerical prediction of brake squeal propensity. © European Acoustics Association.
O'Reilly, RD, Morrison, JP & McGregor, C 2012, 'A system for the transmission, processing and visualisation of EEG to support Irish Neonatal Intensive Care Units', 2012 25th IEEE Canadian Conference on Electrical and Computer Engineering (CCECE), 2012 25th IEEE Canadian Conference on Electrical and Computer Engineering (CCECE), IEEE.
View/Download from: Publisher's site
View description>>
A system, constructed as a "proof of concept", for providing an Irish tele-neurophysiology service is presented. It is based on a distributed architecture and capable of handling synchronous data streams from multiple Irish Neonatal Intensive Care Units. It provides Ireland with an infrastructure for overcoming factors affecting the diagnosis of neurological disorders in neonates. The system supports collaborative efforts by neurophysiologists, removes geographical constraints on expert knowledge and allows for the creation of national data stores while simultaneously supporting the trans-Atlantic processing of EEG. Technical obstacles affecting its successful implementation are outlined and solutions proposed. The implementation of such a system could significantly improve the quality of care provided to neonates. © 2012 IEEE.
Othman, SH & Beydoun, G 2012, 'Evaluating Disaster Management Knowledge Model by Using a Frequency-Based Selection Technique', PKAW, International workshop on Knowledge Management and Acquisition for Smart Systems and Services, Springer, Kuching, Malaysia, pp. 12-27.
View/Download from: Publisher's site
View description>>
© Springer-Verlag Berlin Heidelberg 2012. Disaster Management (DM) is a multidisciplinary endeavour and a very difficult knowledge domain to model. It is a diffuse area of knowledge that is continuously evolving and informally represented. A metamodel is the output artefact of metamodelling, a software engineering approach that makes statements about what can be expressed in the valid models of a knowledge domain. It is an appropriate high-level knowledge structure to facilitate communication among DM stakeholders. A Disaster Management Metamodel (DMM) is developed. To assess the expressiveness and correctness of the DMM, in this paper we present a metamodel evaluation technique using Frequency-based Selection. The objective of this technique is to evaluate the importance of the individual concepts used in the DMM so that the quality of the metamodel can be measured quantitatively.
Pan, R, Xu, G & Dolog, P 2012, 'Improving Recommendations in Tag-Based Systems with Spectral Clustering of Tag Neighbors', Lecture Notes in Electrical Engineering, International Symposium on Computer Science and Its Applications (CSA), Springer Netherlands, Jeju Island, Korea, pp. 355-364.
View/Download from: Publisher's site
View description>>
Tag as a useful metadata reflects the collaborative and conceptual features of documents in social collaborative annotation systems. In this paper, we propose a collaborative approach for expanding tag neighbors and investigate the spectral clustering algorithm to filter out noisy tag neighbors in order to get appropriate recommendation for users. The preliminary experiments have been conducted on MovieLens dataset to compare our proposed approach with the traditional collaborative filtering recommendation approach and naive tag neighbors expansion approach in terms of precision, and the result demonstrates that our approach could considerably improve the performance of recommendations. © 2012 Springer Science+Business Media B.V.
Pan, R, Xu, G, Dolog, P & Zong, Y 2012, 'Group Division for Recommendation in Tag-Based Systems', 2012 Second International Conference on Cloud and Green Computing, 2012 International Conference on Cloud and Green Computing (CGC), IEEE, Xiangtan, China, pp. 399-404.
View/Download from: Publisher's site
View description>>
The common usage of tags in these systems is to add the tagging attribute as an additional feature to re-model users or resources over the tag vector space and, in turn, to make tag-based or personalized recommendations. With the help of tagging data, user annotation preference and document topical tendency are substantially coded into the profiles of users or documents. However, obtaining the proper relationship among users, resources and tags is still a challenge in social annotation based recommendation research. In this paper, we utilize the relationships between tags and resources and between tags and users to extract group information. With the help of such relationships, we can obtain Topic-Groups based on the bipartite relationship between tags and resources, and Interest-Groups based on the bipartite relationship between tags and users. The preliminary experiments have been conducted on the MovieLens dataset to compare our proposed approach with the traditional collaborative filtering recommendation approach in terms of precision, and the results demonstrate that our approach can considerably improve the performance of recommendations. © 2012 IEEE.
Parvin, S & Hussain, FK 2012, 'Trust-based Security for Community-based Cognitive Radio Networks', 2012 IEEE 26TH INTERNATIONAL CONFERENCE ON ADVANCED INFORMATION NETWORKING AND APPLICATIONS (AINA), International Conference on Advanced Information Networking and Applications (was ICOIN), IEEE, Fukuoka, Japan, pp. 518-525.
View/Download from: Publisher's site
View description>>
Cognitive Radio (CR) is considered to be a necessary mechanism to detect whether a particular segment of the radio spectrum is currently in use, and to rapidly occupy the temporarily unused spectrum without interfering with the transmissions of other users. As Cognitive Radio has dynamic properties, a member of a Cognitive Radio Network may join or leave the network at any time. These properties mean that the issue of secure communication in CRNs becomes more critical than for other conventional wireless networks. This work thus proposes a trust-based security system for community-based CRNs. A CR node's trust value is analyzed according to its previous behavior in the network and, depending on this trust value, it is decided whether this member node can take part in the communication of CRNs. For security purposes, we have designed our model to ensure that the proposed approach is secure in different contexts.
Li, P, Guo, S, Yu, S & Vasilakos, AV 2012, 'CodePipe: An opportunistic feeding and routing protocol for reliable multicast with pipelined network coding', 2012 Proceedings IEEE INFOCOM, IEEE INFOCOM 2012 - IEEE Conference on Computer Communications, IEEE, Orlando, FL, pp. 100-108.
View/Download from: Publisher's site
Pileggi, SF, Ibanez, G, Fernandez-Llatas, C & Carlos Narajo-Martinez, J 2012, 'Enabling Semantic Resources in the Cloud', Proceedings of the 4th International Conference on Agents and Artificial Intelligence, Special Session on Semantic Interoperability, SciTePress - Science and Technology Publications, Portugal, pp. 541-546.
View/Download from: Publisher's site
Purba, JH, Lu, J & Zhang, G 2012, 'An Area Defuzzification Technique and Essential Fuzzy Rules for Defuzzifying Nuclear Event Failure Possibilities into Reliability Data', Uncertainty Modeling in Knowledge Engineering and Decision Making, The 10th International FLINS Conference, World Scientific Publishing Co. Pte. Ltd., Istanbul, Turkey, pp. 1208-1213.
Meng, Q & Kennedy, PJ 2012, 'Using Field of Research Codes to Discover Research Groups from Co-authorship Networks', 2012 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining, 2012 International Conference on Advances in Social Networks Analysis and Mining (ASONAM 2012), IEEE, Istanbul, Turkey, pp. 289-293.
View/Download from: Publisher's site
View description>>
Nowadays, academic collaboration has become more prevalent and crucial than ever before, and many studies of academic collaboration analysis are based on co-authorship networks. This paper aims to build a novel co-authorship network by importing field of research codes based on Newman's model, and then to analyze and extract research groups via spectral clustering. In order to demonstrate the effectiveness of this revised network, we take academic collaboration at the University of Technology, Sydney (UTS) as an example. The results of this study advance methods for selecting the most prolific research groups and individuals in research institutions, and provide scientific evidence for policymakers to manage laboratories and research groups more efficiently in the future.
Raduescu, C & Gill, AQ 2012, 'Handling the Complexity of ISD Projects with Agile Methods: A Conceptual Foundation', ISD, International Conference on Information Systems Development, Springer, Prato Centre, Italy, pp. 417-427.
View/Download from: Publisher's site
View description>>
Traditional approaches to software and information systems development (ISD) cannot meet the challenges presented by the complexity inherent in today's dynamic and changing environments. In this study we argue that ISD projects are socially complex endeavours and suggest that agile development methods display characteristics that make them appropriate for such project environments. We suggest that one theory that justifies the appropriateness of agile methods in such contexts is complex adaptive systems (CAS) theory. We first argue that ISD projects can be treated as CAS, and second, we assess the alignment between CAS characteristics and agile method principles. We therefore propose and discuss a preliminary conceptual foundation for handling the complexity of ISD projects with agile methods. Our future research seeks to investigate the applicability of specific agile methods and to develop a comprehensive framework offering a validated theoretical justification of better approaches to managing complex ISD projects in practice.
Ramezani, F & Lu, J 2012, 'A Cognitive Group Decision Support System for Projects Evaluation', Uncertainty Modeling in Knowledge Engineering and Decision Making, International Fuzzy Logic and Intelligent Technologies in Nuclear Science Conference, World Scientific, Istanbul, Turkey, pp. 231-236.
View description>>
In any organization there are some main goals and many projects for achieving these goals. For any organization, it is important to determine how much these projects contribute to achieving the main goals. This paper proposes a new fuzzy multiple attribute-based decision support system (DSS) for evaluating projects' contribution to these goals, as such a selection may involve both quantitative and qualitative assessment attributes. In addition, the proposed DSS has the ability to choose the most appropriate fuzzy ranking method for solving a given MADM problem. It also contains a sensitivity analysis system which provides the opportunity to analyze the impact of attributes' weights and projects' performance on achieving organizations' goals, and to assess the reliability of the decision-making process. The proposed DSS can be applied to solve any FMADM problem that requires ranking alternatives according to some attributes.
Ramezani, F & Lu, J 2012, 'A Fuzzy Group Decision Support System for Projects Evaluation', 14th International Conference on Information Processing and Management of Uncertainty in Knowledge-Based Systems, IPMU 2012, International Conference on Information Processing and Management of Uncertainty, Springer Berlin Heidelberg, Catania, Italy, pp. 160-169.
View/Download from: Publisher's site
View description>>
In any organization there are some main goals and many projects for achieving these goals. For any organization, it is important to determine how much these projects contribute to achieving the main goals. This paper proposes a new fuzzy multiple attribute-based decision support system (DSS) for evaluating projects' contribution to these goals, as such a selection may involve both quantitative and qualitative assessment attributes. There are many fuzzy ranking methods available to solve multi-attribute decision making (MADM) problems, some more suitable than others for particular decision problems. The proposed DSS has the ability to choose the most appropriate fuzzy ranking method for solving a given MADM problem. In addition, it contains a sensitivity analysis system which provides the opportunity to analyze the impact of attributes' weights and projects' performance on achieving organizations' goals. A DSS software prototype has been developed on the basis of the proposed DSS, which can be applied to solve any FMADM problem that requires ranking alternatives according to some attributes.
Ray, I, Yan, Z, Yu, S & Liu, L 2012, 'Message from TrustCom Workshop/Symposium Chairs', 2012 IEEE 11th International Conference on Trust, Security and Privacy in Computing and Communications, 2012 IEEE 11th International Conference on Trust, Security and Privacy in Computing and Communications (TrustCom), IEEE, p. 32.
View/Download from: Publisher's site
Rehman, ZU, Hussain, OK & Hussain, FK 2012, 'IaaS Cloud Selection using MCDM Methods', 2012 Ninth IEEE International Conference on e-Business Engineering (ICEBE), IEEE International Conference on e-Business Engineering, IEEE, Hangzhou, China, pp. 246-251.
View/Download from: Publisher's site
View description>>
The popularity of cloud computing and IaaS has spawned numerous cloud service providers which offer various cloud services, including IaaS, to cloud users. These services vary considerably in terms of their performance and cost, and the selection of a suitable cloud service becomes a complex decision-making issue for a cloud service user. Furthermore, cloud services have several attributes, all of which are criteria that have to be taken into account when making a service selection decision. In the presence of these multiple criteria, a compromise has to be made because in most real-world situations no single service exceeds all other services in all criteria; one service may be better in terms of some of the criteria while other services may outperform it if judged on the basis of the remaining criteria. Multi-criteria decision-making is a sub-field of operations research that deals with the techniques to solve such multi-criteria problems. There are several methods of multi-criteria decision-making. In this paper, we use key multi-criteria decision-making methods for IaaS cloud service selection in a case study which contains five basic performance measurements of thirteen cloud services by a third-party monitoring service. We demonstrate the use of these multi-criteria methods for cloud service selection and compare the results obtained by using each method to find out how the choice of a particular MCDM method affects the outcome of the decision-making process for IaaS cloud service selection.
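One of the simpler MCDM methods of the kind this paper compares is simple additive weighting (SAW). A minimal editor's sketch with hypothetical services, criteria and weights (the paper's own case study uses third-party measurements of thirteen real IaaS services):

```python
def saw_rank(scores, weights, benefit):
    """Rank alternatives by weighted sum of min-max normalised criteria.

    scores:  {alternative: [criterion values]}
    weights: criterion weights summing to 1
    benefit: True for larger-is-better criteria, False for cost criteria
    """
    n = len(weights)
    cols = [[s[i] for s in scores.values()] for i in range(n)]
    lo, hi = [min(c) for c in cols], [max(c) for c in cols]

    def norm(v, i):
        if hi[i] == lo[i]:
            return 1.0
        x = (v - lo[i]) / (hi[i] - lo[i])
        return x if benefit[i] else 1.0 - x  # invert cost criteria

    totals = {
        a: sum(w * norm(v, i) for i, (w, v) in enumerate(zip(weights, vals)))
        for a, vals in scores.items()
    }
    return sorted(totals, key=totals.get, reverse=True)

# Hypothetical: two benefit criteria (uptime %, throughput) and one cost
# criterion (price per hour).
services = {"S1": [99.9, 120, 0.10], "S2": [99.5, 150, 0.08], "S3": [99.0, 90, 0.12]}
ranking = saw_rank(services, [0.5, 0.3, 0.2], [True, True, False])
```

Other MCDM methods (e.g. TOPSIS, AHP) normalise and aggregate differently, which is exactly why the paper compares how the method choice changes the final ranking.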
Saberi, M, Azadeh, A, Saberi, Z & Pazhoheshfar, P 2012, 'A knowledge management system based on artificial intelligence (AI) methods: A flexible fuzzy regression-analysis of variance algorithm for natural gas consumption estimation', 2012 International Conference on Information Retrieval & Knowledge Management, 2012 International Conference on Information Retrieval & Knowledge Management (CAMP), IEEE, pp. 143-147.
View/Download from: Publisher's site
View description>>
Knowledge management (KM) systems follow different schemas, one of which is studied in the present work: KM based on artificial intelligence (AI) methods, investigated here in the domain of natural gas consumption estimation. Developing an accurate and flexible model for natural gas consumption estimation is a strategic step in the policy and decision-making process in the energy sector. This paper provides a staged algorithm that obtains the optimal fuzzy regression model for studying natural gas consumption in sixteen countries using data from 1989 to 2007. Countries were selected from Africa, America, Asia, Europe and the Middle East based on high, middle and low GDP, and for each of them nine fuzzy regression models were executed and the results and error of each model calculated. The initial data were preprocessed using the min-max method to obtain better results. Two criteria were used to determine the suitable and appropriate fuzzy regression model for each country: fuzzy regression models with a MAPE value below 10 are first deleted from the assessment, and the remaining fuzzy regression models are then compared with ANOVA. According to the logic of the given algorithm, for some countries none of the models used is suitable, while for other countries the optimal model is obtained in the first or second filter of the algorithm. To show the applicability and superiority of the proposed flexible fuzzy regression model, data for oil consumption in Japan, Thailand and Bangladesh from Asia; Norway, Italy and Bulgaria from Europe; Qatar, Iran and Iraq from the Middle East; the United States, Mexico and Bolivia from North America; and Libya, Tunisia and Nigeria from Africa during 1989 to 2007 are used. © 2012 IEEE.
Shangguan, Q, Hu, L, Cao, J & Xu, G 2012, 'Book Recommendation Based on Joint Multi-relational Model', 2012 Second International Conference on Cloud and Green Computing, 2012 International Conference on Cloud and Green Computing (CGC), IEEE, Xiangtan, China, pp. 523-530.
View/Download from: Publisher's site
Sharma, N, Pal, U & Blumenstein, M 2012, 'Recent Advances in Video Based Document Processing: A Review', 2012 10th IAPR International Workshop on Document Analysis Systems, 2012 10th IAPR International Workshop on Document Analysis Systems (DAS), IEEE, Gold Coast, Australia, pp. 63-68.
View/Download from: Publisher's site
View description>>
Extraction and recognition of text present in video has become a very popular research area in the last decade. Generally, text present in video frames is of different size, orientation, style, etc. with complex backgrounds, noise, low resolution and contrast. These factors make the automatic text extraction and recognition in video frames a challenging task. A large number of techniques have been proposed by various researchers in the recent past to address the problem. This paper presents a review of various state-of-the-art techniques proposed towards different stages (e.g. detection, localization, extraction, etc.) of text information processing in video frames. Looking at the growing popularity and the recent developments in the processing of text in video frames, this review imparts details of current trends and potential directions for further research activities to assist researchers. © 2012 IEEE.
Sharma, N, Shivakumara, P, Pal, U, Blumenstein, M & Tan, CL 2012, 'A new method for arbitrarily-oriented text detection in video', Proceedings - 10th IAPR International Workshop on Document Analysis Systems, DAS 2012, International Workshop on Document Analysis Systems, IEEE, Institute of Electrical and Electronics Engineers, Gold Coast, Australia, pp. 74-78.
View/Download from: Publisher's site
View description>>
Text detection in video frames plays a vital role in enhancing the performance of information extraction systems because the text in video frames helps in indexing and retrieving video efficiently and accurately. This paper presents a new method for arbitrarily-oriented text detection in video, based on dominant text pixel selection, text representatives and region growing. The method uses gradient pixel direction and magnitude corresponding to Sobel edge pixels of the input frame to obtain dominant text pixels. Edge components in the Sobel edge map corresponding to dominant text pixels are then extracted and we call them text representatives. We eliminate broken segments of each text representative to get candidate text representatives. Then the perimeter of each candidate text representative grows along the text direction in the Sobel edge map to group the neighboring text components, which we call word patches. The word patches are used for finding the direction of text lines and then the word patches are expanded in the same direction in the Sobel edge map to group the neighboring word patches and to restore missing text information. This results in extraction of arbitrarily-oriented text from the video frame. To evaluate the method, we considered arbitrarily-oriented data, non-horizontal data, horizontal data, Hua's data and ICDAR-2003 competition data (camera images). The experimental results show that the proposed method outperforms the existing method in terms of recall and f-measure. © 2012 IEEE.
Liu, S, Liu, B, Ma, X, Rong, B & Gui, L 2012, 'Low-complexity PAPR reduction algorithm in OFDM systems by designing data subcarriers', 2012 IEEE Global Communications Conference (GLOBECOM), GLOBECOM 2012 - 2012 IEEE Global Communications Conference, IEEE, Anaheim, CA, pp. 4747-4751.
View/Download from: Publisher's site
Simon, Sheard, J, Carbone, A, Chinn, D, Laakso, MJ, Clear, T, de Raadt, M, D'Souza, D, Lister, R, Philpott, A, Skene, J & Warburton, G 2012, 'Introductory programming: Examining the exams', Conferences in Research and Practice in Information Technology Series, Australasian Computing Education Conference, Australian Computer Society Inc, Melbourne, Australia, pp. 61-70.
View description>>
This paper describes a classification scheme that can be used to investigate the characteristics of introductory programming examinations. The scheme itself is described and its categories explained. We describe in detail the process of determining the level of agreement among classifiers, that is, the inter-rater reliability of the scheme, and we report the results of applying the classification scheme to 20 introductory programming examinations. We find that introductory programming examinations vary greatly in the coverage of topics, question styles, skill required to answer questions and the level of difficulty of questions. This study is part of a project that aims to investigate the nature and composition of formal examination instruments used in the summative assessment of introductory programming students, and the pedagogical intentions of the educators who construct these instruments.
Sojda, RS, Chen, SH, El Sawah, S, Guillaume, JHA, Jakeman, AJ, Lautenbach, S, McIntosh, BS, Rizzoli, AE, Seppelt, R, Struss, P, Voinov, AA & Volk, M 2012, 'Identifying the decision to be supported: A review of papers from environmental modelling and software', iEMSs 2012 - Managing Resources of a Limited Planet: Proceedings of the 6th Biennial Meeting of the International Environmental Modelling and Software Society, pp. 73-80.
View description>>
Two of the basic tenets of decision support system efforts are to help identify and structure the decisions to be supported, and then to provide analysis of how those decisions might best be made. One example from wetland management is that wildlife biologists must decide when to draw down water levels to optimise aquatic invertebrates as food for breeding ducks. Once such a decision is identified, a system or tool could be developed to help them make that decision in the face of current and projected climate conditions. We examined a random sample of 100 papers published from 2001-2011 in Environmental Modelling and Software that used the phrase 'decision support system' or 'decision support tool', and which are characteristic of different sectors. In our review, 41% of the systems and tools related to the water resources sector, 34% to agriculture, and 22% to the conservation of fish and wildlife and protected area management. Only 60% of the papers were deemed to be reporting on DSS; the remainder had not directly identified a specific decision to be supported. We also report on the techniques used to identify the decisions, such as formal survey, focus group, expert opinion, or the sole judgment of the author(s). The primary underlying modelling system, e.g. expert system, agent-based model, Bayesian belief network, geographical information system (GIS), and the like, was categorised next. Finally, since decision support typically should target some aspect of unstructured decisions, we subjectively determined to what degree this was the case. In only 23% of the papers reviewed did the system appear to tackle unstructured decisions. This knowledge should be useful in helping workers in the field develop more effective systems and tools, especially by exposing them to the approaches of different, but related, disciplines. We propose that a standard blueprint for reporting on DSS be developed for consi...
Tafavogh, S, Kennedy, PJ & Catchpoole, DR 2012, 'Determining Cellularity Status of Tumors based on Histopathology using Hybrid Image Segmentation', 2012 International Joint Conference on Neural Networks (IJCNN), IEEE International Joint Conference on Neural Networks, IEEE, Brisbane, Australia, pp. 1-8.
View/Download from: Publisher's site
View description>>
A Computer Aided Diagnosis (CAD) system is developed to determine the cellularity status of a tumor. The system helps pathologists to distinguish a tumor exhibiting cell proliferation from normal tumors. The developed CAD system implements a hybrid segmentation method to identify and extract the morphological features used by pathologists to determine the cellularity status of a tumor. Adaptive Mean Shift (AMS) clustering, a non-parametric technique, is integrated with Color Template Matching (CTM) to construct the segmentation approach. We used Expectation Maximization (EM) clustering, a parametric technique, for comparison with our proposed approach. The outputs of our proposed system and of EM are validated against the judgments of two pathologists, which serve as ground truth. The result of our developed system is quite close to the decisions of the pathologists, and it significantly outperforms EM in terms of accuracy. © 2012 IEEE.
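The mean-shift idea underlying the AMS step can be illustrated with a minimal, fixed-bandwidth mean shift on one-dimensional data. The paper's AMS operates adaptively on colour image features, so this sketch only conveys the mode-seeking behaviour; the bandwidth, data and function names below are illustrative assumptions.

```python
def mean_shift_1d(points, bandwidth, iters=50):
    """Shift each point to the mean of its neighbours until convergence."""
    modes = list(points)
    for _ in range(iters):
        moved = []
        for m in modes:
            # Flat kernel: every point within the bandwidth counts equally.
            neigh = [p for p in points if abs(p - m) <= bandwidth]
            moved.append(sum(neigh) / len(neigh))
        if all(abs(a - b) < 1e-6 for a, b in zip(modes, moved)):
            break
        modes = moved
    # Merge modes that converged to (almost) the same location.
    clusters = []
    for m in modes:
        if not any(abs(c - m) <= bandwidth / 2 for c in clusters):
            clusters.append(m)
    return clusters
```

Two well-separated groups of values collapse to two modes, which is the property the segmentation step exploits to group pixels of similar colour without fixing the number of clusters in advance.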
Tang, F, You, I, Yu, S, Li, H & Wang, CL 2012, 'Grid transaction management and an efficient development kit', Computer Systems Science and Engineering, C R L PUBLISHING LTD, pp. 345-353.
View description>>
Grid transaction management aims to guarantee system consistency in the face of various failures in Grid environments. In this paper, we propose a Grid transaction service (GridTS) and design coordination mechanisms for atomic, long-lived and real-time Grid transactions respectively, based on the features of Grid environments. GridTS has three advantages. Firstly, it separates the transaction management unit from the transaction coordination algorithms, so that it can coordinate the above three categories of transactions in a uniform way. Secondly, GridTS can dynamically generate compensating transactions during long-lived transaction processing. Finally, it provides programming interfaces similar to those of traditional distributed transactions. Moreover, we implement a Grid transaction development kit (GridTDK) for application programmers based on our GridTS. We evaluate the feasibility and effectiveness of GridTS by developing an application system using our GridTDK. © 2012 CRL Publishing Ltd.
Tang, F, You, I, Yu, S, Wang, C-L, Guo, M & Liu, W 2012, 'An efficient deadlock prevention approach for service oriented transaction processing', Computers & Mathematics with Applications, Elsevier BV, pp. 458-468.
View/Download from: Publisher's site
Teague, D, Corney, M, Ahadi, A & Lister, R 2012, 'Swapping as the 'Hello World' of relational reasoning: Replications, reflections and extensions', Conferences in Research and Practice in Information Technology Series, Australasian Computing Education Conference, Australian Computer Society Inc, Melbourne, Australia, pp. 87-94.
View description>>
At the previous conference in this series, Corney, Lister and Teague presented research results showing relationships between code writing, code tracing and code explaining, from as early as week 3 of semester. We concluded that the problems some students face in learning to program start very early in the semester. In this paper we report on our replication of that experiment at two institutions, one of which is the original institution. In some cases, we did not find the same relationship between explaining code and writing code, but we believe this was because our teachers discussed the code in lectures between the two tests. Apart from that exception, our replication results at both institutions are consistent with our original study.
Teague, D, Corney, M, Fidge, CF, Roggenkamp, M, Ahadi, A & Lister, RF 2012, 'Using Neo-Piagetian Theory, Formative In-Class Tests and Think Alouds to Better Understand Student Thinking: A Preliminary Report on Computer Programming', Proceedings of the 23rd Annual Conference for the Australasian Association for Engineering Education - The Profession of Engineering Education: Advancing Teaching, Research and Careers, AAEE - Annual Conference of Australasian Association for Engineering Education, Swinburne University of Technology, Melbourne, Australia, pp. 1-9.
View description>>
BACKGROUND Around the world, and for many years, students have struggled to learn to program computers. The reasons for this are poorly understood by their lecturers. PURPOSE When the intuitions of many skilled lecturers have failed to solve a pedagogical problem, then a systematic research programme is needed. We have implemented a research programme based on three elements: (1) a theory that provides an organising conceptual framework, (2) representative data on how the class performs on formative assessment tasks, and (3) microgenetic data from one-on-one think aloud sessions, to establish why students struggle with some of the formative tasks. DESIGN / METHOD We have adopted neo-Piagetian theory as our organising framework. We collect data by two methods. The first method is a series of small tests that we have students complete during lectures, at roughly two week intervals. These tests did not count toward the students' final grade, which affords us the opportunity to ask unusual questions that probe at the boundaries of student understanding. Think aloud sessions are the second data collection method, in which a small number of selected, volunteer students attempt problems similar to the problems in the in-class tests. RESULTS The results in this paper serve to illustrate our research programme rather than answer a single, tight research question. These illustrative results focus upon one very simple type of programming question that was put to students, very early in their first programming subject. That simple question required students to write code to swap the values in two variables (e.g., temp = a; a = b; b = temp). The common intuition among programming lecturers is that students should be able to easily solve such a problem by, say, week 4 of semester. On the contrary, we found that 40% of students in a class at one of the participating institutions answered this question incorrectly in week 4 of semester. CONCLUSIONS What is emerging fro...
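The three-assignment swap pattern the abstract quotes can be written out as a complete example; the function wrapper is only for illustration, since the in-class question asks for the three statements themselves.

```python
def swap(a, b):
    """The classic three-assignment swap the in-class test asks for."""
    temp = a   # hold the first value
    a = b      # overwrite the first variable with the second
    b = temp   # restore the held value into the second variable
    return a, b
    # The common incorrect answer omits the temporary:
    #   a = b; b = a   -- after which both variables hold the same value.
```

Calling `swap(1, 2)` returns `(2, 1)`, whereas the erroneous two-assignment version would yield `(2, 2)`, which is exactly the relational-reasoning failure the study probes.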
Thapngam, T, Yu, S & Zhou, W 2012, 'DDoS discrimination by Linear Discriminant Analysis (LDA)', 2012 International Conference on Computing, Networking and Communications (ICNC), 2012 International Conference on Computing, Networking and Communications (ICNC), IEEE, pp. 532-536.
View/Download from: Publisher's site
View description>>
In this paper, we propose an effective approach based on a supervised learning system using Linear Discriminant Analysis (LDA) to discriminate legitimate traffic from DDoS attack traffic. There is currently a wide outbreak of DDoS attacks that remain a risk for the entire Internet, with different attack methods and strategies challenging defence systems. Among the behaviours of attack sources, repeatable and predictable features distinguish them from sources of legitimate traffic. In addition, DDoS defence systems lack the learning ability to fine-tune their accuracy. This paper analyses real trace traffic from publicly available datasets. Pearson's correlation coefficient and Shannon's entropy are deployed to extract the dependency and predictability of traffic data respectively. Then, LDA is used to train and classify legitimate and attack traffic flows. From the results of our experiment, we confirm that the proposed discrimination system can differentiate DDoS attacks from legitimate traffic with a high rate of accuracy. © 2012 IEEE.
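The two feature extractors named in the abstract, Shannon's entropy for predictability and Pearson's correlation coefficient for dependency, are standard formulas and can be sketched in pure Python. The function names and inputs are illustrative; in the paper these features feed an LDA classifier, which is not reproduced here.

```python
import math
from collections import Counter

def shannon_entropy(seq):
    """Entropy (bits) of a symbol sequence; low for predictable attack flows."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def pearson(xs, ys):
    """Correlation between two packet-rate series; repeated attack patterns
    from a botnet tend to correlate highly across sources."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A flow that repeats one request type has near-zero entropy, while diverse legitimate traffic scores higher; perfectly proportional rate series have a correlation of 1.0.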
Ubaid, A, Rehman, U & Abidi, MA 2012, 'Adaptive modular recovery block', 2012 IEEE Asia-Pacific Conference on Applied Electromagnetics (APACE), 2012 IEEE Asia-Pacific Conference on Applied Electromagnetics (APACE), IEEE.
View/Download from: Publisher's site
ur Rehman, Z, Hussain, OK, Parvin, S & Hussain, FK 2012, 'A Framework for User Feedback Based Cloud Service Monitoring', 2012 Sixth International Conference on Complex, Intelligent, and Software Intensive Systems, 2012 Sixth International Conference on Complex, Intelligent, and Software Intensive Systems (CISIS), IEEE, Palermo, Italy, pp. 257-262.
View/Download from: Publisher's site
View description>>
The increasing popularity of the cloud computing paradigm and the emerging concept of federated cloud computing have motivated research into intelligent cloud service selection: techniques that enable cloud users to gain maximum benefit from cloud computing by selecting services that provide optimal performance at the lowest possible cost. Given the intricate and heterogeneous nature of current clouds, the cloud service selection process is, in effect, a multi-criteria optimization or decision-making problem. The possible criteria for this process relate to both functional and non-functional attributes of cloud services. In this context, the two major issues are: (1) the choice of a criteria set and (2) mechanisms for assessing cloud services against each criterion through thorough, continuous cloud service monitoring. In this paper, we focus on cloud service monitoring, where the existing monitoring and assessment mechanisms depend entirely on benchmark tests that are unable to accurately determine or reliably predict the performance of actual cloud applications under a real workload. We discuss the recent research aimed at achieving this objective and propose a novel user-feedback-based approach that can monitor cloud performance more reliably and accurately than the existing mechanisms.
Vizuete Luciano, E, Merigó, JM, Gil-Lafuente, AM & Boria Reverté, S 2012, 'OWA Operators in the Assignment Process: The Case of the Hungarian Algorithm', Modeling and Simulation in Engineering, Economics, and Management, MS 2012, International Conference of Modeling and Simulation in Engineering, Economics, and Management, Springer Berlin Heidelberg, New Rochelle, NY, pp. 166-177.
View/Download from: Publisher's site
Wang, X, Wang, Z & Xu, X 2012, 'Analytic Profit Optimization of Service-Based Systems', 2012 IEEE 19th International Conference on Web Services, 2012 IEEE 19th International Conference on Web Services (ICWS), IEEE, pp. 359-367.
View/Download from: Publisher's site
View description>>
Service computing has become a dominant paradigm for building complex service-oriented systems with the aim of adding business value. Because these systems are inevitably based on uncontrollable services on the unpredictable Internet, it is important to find effective ways of maximizing the profit of service-oriented systems in such an unreliable environment. In this paper, we propose an analytic approach that employs a build-time analysis of the runtime dynamics of service execution to maximize the net profit from delivering composite services under a full probabilistic treatment of uncertainty. We also present methods for improving the optimization efficiency, including reusing intermediate computation results and adopting specialized profit optimization algorithms. The superiority of the proposed approach is both theoretically proved and empirically demonstrated through experiments. © 2012 IEEE.
Wei, B, Jin, Z, Zowghi, D & Yin, B 2012, 'Automated Reasoning with Goal Tree Models for Software Quality Requirements', 2012 IEEE 36th Annual Computer Software and Applications Conference Workshops, 2012 IEEE 36th IEEE Annual Computer Software and Applications Conference Workshops (COMPSACW), IEEE, Izmir, Turkey, pp. 373-378.
View/Download from: Publisher's site
View description>>
Implementation of software quality requirements is critical for producing high-quality software. High-level quality requirements are usually refined stepwise into lower-level quality requirements, until some potential functional design alternatives are identified. An important question is how design alternatives can be effectively selected to satisfice the quality requirements. This paper focuses on the satisficing statuses of nodes in quality-requirements goal tree models, and presents an automated reasoning technique to select design alternatives. The final satisficing status of the quality requirements can be obtained once the satisficing statuses of the design alternatives are assigned. Existing reasoning approaches do not support efficient identification when many design alternatives and candidate solutions exist. Our work provides an alternative approach to identify acceptable design decisions in a timely manner. A case study is also presented to illustrate the proposed automated reasoning approach.
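The bottom-up propagation of satisficing statuses over a goal tree can be sketched as a generic AND/OR evaluation: leaves are design alternatives with assigned statuses, and interior goals combine their children. This is a minimal illustration of goal-tree label propagation, not the paper's reasoning technique; the tree encoding and all names are assumptions.

```python
def satisficed(node, labels):
    """Propagate satisficing statuses bottom-up through a goal tree.

    A leaf (design alternative) is a string looked up in `labels`;
    an interior node is ('AND'|'OR', [subtrees]). AND goals need every
    child satisficed, OR goals need at least one.
    """
    if isinstance(node, str):            # a design alternative (leaf)
        return labels[node]
    op, children = node
    results = [satisficed(c, labels) for c in children]
    return all(results) if op == 'AND' else any(results)
```

For example, a quality goal requiring security AND (a cache OR an index) is satisficed whenever security holds and at least one of the two performance alternatives is chosen.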
Wu, Y, Zowghi, D, Peng, X & Zhao, W 2012, 'Towards understanding requirement evolution in a software product line: an industrial case study', 2012 First IEEE International Workshop on the Twin Peaks of Requirements and Architecture (TwinPeaks), 2012 IEEE First International Workshop on the Twin Peaks of Requirements and Architecture (Twin Peaks), IEEE, Chicago, Illinois, USA, pp. 7-14.
View/Download from: Publisher's site
View description>>
In most software development practices, software requirements and architecture are addressed simultaneously. The architecture may grow from a core specification of requirements, and the requirements may also be elicited incrementally as the architecture becomes more and more concrete. In this paper, we present a case study on the development history of the Wingsoft Examination System Product Line (WES-PL), an active, industrial software product line with a history of more than eight years. We focus on 10 member products and 51 major versions delivered to customers or archived in the repository between December 2003 and May 2012, tracing both requirement and architectural changes. We identify a requirement change classification from the viewpoint of architectural impact. We argue that software requirements are negotiated and may be guided by the existing software architecture design, especially in software product line development. Product strategy requirements play an important role in marketing requirement negotiation. We also find clear evidence that a product leader or architect has to make difficult decisions to keep a balance between customer-side marketing requirements and his own architectural design. © 2012 IEEE.
Xiang, Y, Natgunanathan, I, Peng, D, Zhou, W & Yu, S 2012, 'A Dual-Channel Time-Spread Echo Method for Audio Watermarking', IEEE Transactions on Information Forensics and Security, Institute of Electrical and Electronics Engineers (IEEE), pp. 383-392.
View/Download from: Publisher's site
Xu, G & Wu, Z 2012, 'On Smart and Accurate Contextual Advertising', Lecture Notes in Computer Science, Database Systems for Advanced Applications, Springer Berlin Heidelberg, Busan, South Korea, pp. 104-104.
View/Download from: Publisher's site
View description>>
Web advertising, which uses the World Wide Web to attract customers, has become one of the most important marketing channels. As one prevalent type of Web advertising, contextual advertising refers to the placement of the most relevant commercial ads within the content of a Web page, so as to increase the number of ad clicks. However, problems such as homonymy and polysemy, low intersection of keywords, and context mismatch can lead to the selection of irrelevant ads for a generic page, so traditional keyword-matching techniques generally deliver poor accuracy. Furthermore, existing contextual advertising techniques only consider how to select ads as relevant to a generic page as possible, without considering the positional effect of the ad placement within the page. In this paper, we propose a new contextual advertising framework to tackle these problems, which (1) uses Wikipedia concept and category information to enrich the semantic representation of a page (or a textual ad) and (2) takes the placement position of an embedded advertisement into account. To accomplish these steps, we first map each page (or ad) into three feature vectors: a keyword vector, a concept vector and a category vector. Second, we determine the relevant ads for a given page based on a similarity measure that combines the above three feature vectors. In dealing with position-wise contextual advertising, the relevant ads are selected based on not only global context relevance but also local context relevance, so that the embedded ads are contextually relevant both to the targeted page as a whole and to the insertion positions where the ads are placed. We experimentally validate our approach using a real ad set, a real page set, and a set of more than 260,000 concepts and 12,000 categories from Wikipedia. The experimental results show that our approach performs better than simple keyword matching and can improve the precision of ad selection effectively.
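A similarity measure that combines keyword, concept and category vectors is commonly realised as a weighted sum of per-vector cosine similarities. The sketch below works under that assumption; the weights, field names and vectors are illustrative, not the paper's tuned values.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def ad_relevance(page, ad, weights=(0.5, 0.3, 0.2)):
    """Weighted combination of keyword, concept and category similarities.

    `page` and `ad` are dicts holding one vector per representation;
    the weights sum to 1 so a perfect match scores 1.0.
    """
    return sum(w * cosine(page[k], ad[k])
               for w, k in zip(weights, ('keyword', 'concept', 'category')))
```

Enriching pages and ads with concept and category vectors lets two texts with disjoint keywords still score a nonzero relevance when their Wikipedia-derived concepts overlap, which is the point of the semantic enrichment step.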
Xu, Y, Luo, T, Xu, G & Pan, R 2012, 'A Topic-Oriented Syntactic Component Extraction Model for Social Media', Lecture Notes in Electrical Engineering, Human Centric Technology and Service in Smart Space, Springer Netherlands, Gwangju, Korea, pp. 221-229.
View/Download from: Publisher's site
View description>>
Topic-oriented understanding extracts information from various language instances that reflects the characteristics or trends of topic-related semantic information via statistical analysis. Syntax analysis and modeling are the basis of such work. Traditional syntactic formalization approaches widely used in natural language understanding cannot simply be applied to text modeling in the context of topic-oriented understanding. In this paper, we review the information extraction mode and summarize its inherent relationship with the "Subject-Predicate" syntactic structure of Aryan languages. We propose a syntactic element extraction model based on the "topic-description" structure, which contains six kinds of core elements and satisfies the requirements of topic-oriented understanding. This paper also describes the model composition, the theoretical framework of the understanding process, the extraction method for syntactic components, and a prototype system for generating syntax diagrams. The proposed model is evaluated on the Reuters 21578 and SocialCom2009 data sets, and the results show that the recall and precision of syntactic component extraction reach up to 93.9% and 88% respectively, which further justifies the feasibility of generating syntactic components from word dependencies. © 2012 Springer Science+Business Media.
Zong, Y, Xu, G, Jin, P, Yi, X, Chen, E & Wu, Z 2012, 'A projective clustering algorithm based on significant local dense areas', The 2012 International Joint Conference on Neural Networks (IJCNN), 2012 International Joint Conference on Neural Networks (IJCNN 2012 - Brisbane), IEEE, Brisbane, Australia, pp. 1-8.
View/Download from: Publisher's site
Yu, S, Guo, S & Stojmenovic, I 2012, 'Can we beat legitimate cyber behavior mimicking attacks from botnets?', 2012 Proceedings IEEE INFOCOM, IEEE INFOCOM 2012 - IEEE Conference on Computer Communications, IEEE, Orlando, FL, pp. 2851-2855.
View/Download from: Publisher's site
Yu, S, Zhou, W, Dou, W & Makki, SK 2012, 'Why it is Hard to Fight against Cyber Criminals?', 2012 32nd International Conference on Distributed Computing Systems Workshops, 2012 32nd International Conference on Distributed Computing Systems Workshops (ICDCS Workshops), IEEE, pp. 537-541.
View/Download from: Publisher's site
View description>>
We are witnessing numerous cyber attacks every day; however, few cyber criminals are brought to justice. One reason is that it is technically hard to identify and trace cyber criminals. Another reason for this passive situation is our limited, or even inappropriate, understanding of the cyber space. In this paper, we survey the challenges and opportunities in this research field for interested readers. We also list promising tools and directions based on our understanding. © 2012 IEEE.
Zawawi, RA, Akpolat, H & Bagia, R 2012, 'Managing Knowledge in Aircraft Engineering - An Operations-Based Approach', The 3rd International Conference on Industrial Engineering and Operations Management, International Conference on Industrial Engineering and Operations Management, Industrial Engineering and Operations Management Society, Istanbul, Turkey, pp. 1196-1205.
Zhao, Y, Li, J, Christen, P & Kennedy, PJ 2012, 'Preface', Conferences in Research and Practice in Information Technology Series, p. vii.
Zhou, A, Xu, G, Agarwal, N, King, I, Nejdl, W & Wang, F 2012, 'Message from the SCA2012 Chairs', 2012 Second International Conference on Cloud and Green Computing, 2012 International Conference on Cloud and Green Computing (CGC), IEEE.
View/Download from: Publisher's site
View description>>
The 2nd International Conference on Social Computing and Its Applications (SCA2012) was held in Xiangtan, China, November 1-3, 2012. SCA (Social Computing and its Applications) was created to provide a prime international forum for researchers, industry practitioners and environment experts to exchange the latest fundamental advances in the state of the art and practice of social computing and broadly related areas. SCA2012 consisted of the main conference and three workshops: the 2012 International Workshop on Social Network Analysis and Information Diffusion Modelling (SNAIDM2012), the 2012 International Workshop on Web Wisdom (WW2012), and the 2012 International Workshop on Social Network Service on Databases (SNSDB2012). We greatly thank the Workshop Chairs for their valuable time and effort in organizing the workshops. SCA2012 was held jointly with the 2nd International Conference on Cloud and Green Computing (CGC2012). SCA2012 received 98 submissions from Germany, Canada, Japan, Australia, Sweden, South Korea, Portugal, Denmark, Poland and Mainland China. Each paper was peer reviewed by at least three program committee members, and final decisions were made after a high-quality review process. In total, 45 papers were accepted; the regular-paper acceptance rate is about 32%. © 2012 IEEE.
Zhou, J, Luo, T & Xu, G 2012, 'Academic Recommendation on Graph with Dynamic Transfer Chain', 2012 Second International Conference on Cloud and Green Computing, 2012 International Conference on Cloud and Green Computing (CGC), IEEE, Xiangtan, China, pp. 331-336.
View/Download from: Publisher's site
View description>>
Academic content and learners' capabilities change over time, but current academic recommendation systems do not take time factors into account. There are two challenges in capturing a learner's preferences and learning context accurately and dynamically: first, modeling the academic trend and the user's cognitive level as they shift over time is a hard problem; second, designing a dynamic algorithm that improves recommendation accuracy from implicit behavior data is difficult. In this paper, we propose the Dynamic Transfer Chain (DTC) to model user preferences and academic context over time on transaction data. Based on the DTC model, we present a novel algorithm, Dynamic Academic Recommendation on Graph (DARG). We evaluate the effectiveness of our method using an open dataset named CiteULike, including 9170 users, 11343 papers and 194596 user-paper pairs. The evaluation metric used is Hit Ratio. The results show that our proposed approach gives a 12.873% to 33.852% improvement over previous counterparts, including User-KNN, Item-KNN, TUser-KNN and TItem-KNN. © 2012 IEEE.
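The Hit Ratio metric mentioned in the abstract can be sketched directly: for each user with a held-out paper, check whether that paper appears in the user's top-N recommendation list. The data layout below is an assumption for illustration, not the paper's evaluation harness.

```python
def hit_ratio(recommendations, held_out):
    """Fraction of users whose held-out item appears in their top-N list.

    `recommendations` maps user -> ranked list of recommended paper ids;
    `held_out` maps user -> the single paper id withheld for evaluation.
    """
    hits = sum(1 for user, target in held_out.items()
               if target in recommendations.get(user, []))
    return hits / len(held_out)
```

A higher Hit Ratio means the recommender places the withheld paper inside the top-N list for more users, which is how the reported 12.873% to 33.852% improvements over the KNN baselines would be measured.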