-, JL, -, DP, -, HH, -, LG & -, XZ 2010, 'Analyzing Triggers in XML Data Integration Systems', International Journal of Digital Content Technology and its Applications, vol. 4, no. 5, pp. 38-45.
View/Download from: Publisher's site
Anvari, M, Saidi Mehrabad, M, Azadeh, A & Saberi, M 2010, 'Performance assessment of decision-making units using an adaptive neural network algorithm: one period case', The International Journal of Advanced Manufacturing Technology, vol. 46, no. 9-12, pp. 1059-1069.
View/Download from: Publisher's site
Azadeh, A, Javanmardi, L & Saberi, M 2010, 'The impact of decision-making units features on efficiency by integration of data envelopment analysis, artificial neural network, fuzzy C-means and analysis of variance', International Journal of Operational Research, vol. 7, no. 3, p. 387.
View/Download from: Publisher's site
View description>>
In today's working environment, there is a great desire to identify the critical attributes for sensitivity analysis of inefficient decision-making units (DMUs) regarding personnel attributes. An integrated algorithm, which uses data envelopment analysis (DEA) and data mining tools including fuzzy C-means (FCM), rough set theory (RST), artificial neural network (ANN), cross validation test technique (CVTT) and analysis of variance (ANOVA), is proposed to assess the impact of personnel attributes on efficiency. DEA is used for DMUs' efficiency evaluation. ANN is employed with regard to its ability to model linear and non-linear systems. As numerous inputs are not useful for ANN modelling, RST and ANN are combined to resolve this issue. RST is used to decrease the time of decision-making. FCM is used for data clustering and finally ANOVA is utilised to identify attribute importance. The proposed algorithm is applied to an actual banking system. Copyright © 2010 Inderscience Enterprises Ltd.
Azadeh, A, Saberi, M & Anvari, M 2010, 'An integrated artificial neural network algorithm for performance assessment and optimization of decision making units', Expert Systems with Applications, vol. 37, no. 8, pp. 5688-5697.
View/Download from: Publisher's site
Azadeh, A, Saberi, M & Ghorbani, S 2010, 'An ANFIS algorithm for improved forecasting of oil consumption: A case study of USA, Russia, India and Brazil', Journal of Scientific and Industrial Research, vol. 69, no. 3, pp. 194-203.
View description>>
This paper proposes an adaptive network-based fuzzy inference system (ANFIS) algorithm for oil consumption forecasting based on monthly oil consumption (January 2001 - September 2006) in the USA, Russia, India and Brazil. Using the mean absolute percentage error (MAPE), the efficiency of different ANFIS models was examined. The proposed algorithm uses the autocorrelation function (ACF) to define input variables instead of the trial and error method (TEM). The algorithm for calculating ANFIS performance is based on its closed and open simulation abilities.
Azadeh, A, Saberi, M & Gitiforouz, A 2010, 'An integrated fuzzy regression algorithm for improved electricity consumption estimation', International Journal of Operational Research, vol. 9, no. 1, p. 1.
View/Download from: Publisher's site
View description>>
This study presents an integrated fuzzy regression and time-series technique to estimate and predict electricity demand. It is difficult to model the uncertain behaviour of energy consumption with conventional time series alone, and fuzzy regression could be an ideal substitute in such cases. After reviewing various fuzzy regression models and studying their advantages and shortcomings, the best model is selected. The impact of data preprocessing and post-processing on fuzzy regression performance is also studied, showing that these steps do not contribute to the efficiency of the model. In addition, another unique feature of this study is the utilisation of the autocorrelation function to define input variables instead of the trial and error method. Finally, the fuzzy regression and ARIMA models are compared against actual data using the Granger-Newbold test. Monthly electricity consumption of Iran from 1995 to 2005 is considered as the case of this study. Copyright © 2010 Inderscience Enterprises Ltd.
Azadeh, A, Saberi, M & Seraj, O 2010, 'An integrated fuzzy regression algorithm for energy consumption estimation with non-stationary data: A case study of Iran', Energy, vol. 35, no. 6, pp. 2351-2366.
View/Download from: Publisher's site
Azadeh, A, Saberi, M, Anvari, M & Moghaddam, M 2010, 'An integrated ANN-K-Means algorithm for improved performance assessment of electricity distribution units', Journal of Scientific and Industrial Research, vol. 69, no. 9, pp. 672-679.
Azadeh, A, Saberi, M, Ghaderi, SF & Gitiforouz, A 2010, 'Estimating and improving electricity demand function in residential sector with imprecise data by fuzzy regression', International Journal of Mathematics in Operational Research, vol. 2, no. 4, p. 405.
View/Download from: Publisher's site
View description>>
This paper presents a fuzzy regression approach for estimation of electricity demand in the residential sector with imprecise data. Electricity consumption in the residential sector plays an important role in the economic decision-making process. This is also highlighted by the fact that the residential sector has the largest share of consumption among all the other sectors, including industrial, business, and so on. The importance of fuzzy regression becomes evident when facing imprecise quantities and an insufficient amount of data for estimation of energy consumption in the residential sector. Fuzzy regression is applied to the Iranian residential sector. A review of fuzzy linear regression is presented in which the centre regression line has the best ability to interpret training data. The interpretation ability of the regression line can be measured by the proposed index of confidence. Finally, an estimation of the electricity demand function in the residential sector for three different values of h is presented. Copyright © 2010 Inderscience Enterprises Ltd.
Beydoun, G, Hoffmann, A & Hamade, RF 2010, 'Automating dimensional tolerancing using Ripple down Rules (RDR)', Expert Systems with Applications, vol. 37, no. 7, pp. 5101-5109.
View/Download from: Publisher's site
Blount, M, Ebling, MR, Eklund, JM, James, AG, McGregor, C, Percival, N, Smith, K & Sow, D 2010, 'Real-Time Analysis for Intensive Care: Development and Deployment of the Artemis Analytic System', IEEE Engineering in Medicine and Biology Magazine, vol. 29, no. 2, pp. 110-118.
View/Download from: Publisher's site
Bousquet, F & Voinov, A 2010, 'Preface to this thematic issue', Environmental Modelling & Software, vol. 25, no. 11, p. 1267.
View/Download from: Publisher's site
Catchpoole, DR, Kennedy, P, Skillicorn, DB & Simoff, S 2010, 'The Curse of Dimensionality: A Blessing to Personalized Medicine', Journal of Clinical Oncology, vol. 28, no. 34, pp. e723-e724.
View/Download from: Publisher's site
Chang, C-W, Ko, L-W, Lin, F-C, Su, T-P, Jung, T-P, Lin, C-T & Chiou, J-C 2010, 'Drowsiness Monitoring with EEG-Based MEMS Biosensing Technologies', GeroPsych, vol. 23, no. 2, pp. 107-113.
View/Download from: Publisher's site
View description>>
Electroencephalography (EEG) has been widely adopted to monitor changes in cognitive states, particularly stages of sleep, as EEG recordings contain a wealth of information reflecting changes in alertness and sleepiness. In this study, silicon dry electrodes based on Micro-Electro-Mechanical Systems (MEMS) were developed to bring high-quality EEG acquisition to operational workplaces. They have superior conductivity performance, large signal intensity, and are smaller in size than conventional (wet) electrodes. An EEG-based drowsiness estimation system consisting of a dry-electrode array, power spectrum estimation, principal component analysis (PCA)-based EEG signal analysis, and multivariate linear regression was developed to estimate drivers’ drowsiness levels in a virtual-reality-based dynamic driving simulator. The proposed system can help elders who are often affected by periods of tiredness and fatigue.
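The estimation chain this abstract describes (spectral features, PCA, then multivariate linear regression onto a drowsiness level) can be sketched on synthetic data. This is a minimal illustration of the pipeline's shape only: the feature matrix, dimensions, and noise model below are invented stand-ins, not the paper's EEG recordings or parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for spectral features from an 8-channel dry-electrode
# array over 200 epochs: a latent drowsiness level drives every channel,
# plus measurement noise (all shapes and scales here are hypothetical).
drowsiness = rng.uniform(0.0, 1.0, size=200)
loadings = rng.normal(size=8)
features = np.outer(drowsiness, loadings) + 0.1 * rng.normal(size=(200, 8))

# PCA via SVD of the mean-centred feature matrix; keep the top 2 components.
centred = features - features.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
scores = centred @ vt[:2].T

# Multivariate linear regression from component scores to drowsiness level.
X = np.column_stack([np.ones(len(scores)), scores])
coef, *_ = np.linalg.lstsq(X, drowsiness, rcond=None)
predicted = X @ coef

# Correlation between predicted and actual level on the training data.
r = float(np.corrcoef(predicted, drowsiness)[0, 1])
```

In a real system the features would be power-spectrum estimates computed from the EEG signal rather than this synthetic matrix, but the PCA-then-regression structure is the same.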
Chen, Y-C, Duann, J-R, Chuang, S-W, Lin, C-L, Ko, L-W, Jung, T-P & Lin, C-T 2010, 'Spatial and temporal EEG dynamics of motion sickness', NeuroImage, vol. 49, no. 3, pp. 2862-2870.
View/Download from: Publisher's site
Lin, C-T, Chang, K-C, Lin, C-L, Chiang, C-C, Lu, S-W, Chang, S-S, Lin, B-S, Liang, H-Y, Chen, R-J, Lee, Y-T & Ko, L-W 2010, 'An Intelligent Telecardiology System Using a Wearable and Wireless ECG to Detect Atrial Fibrillation', IEEE Transactions on Information Technology in Biomedicine, vol. 14, no. 3, pp. 726-733.
View/Download from: Publisher's site
Devitt, S 2010, 'Scalable quantum information processing and the optical topological quantum computer', Optics and Spectroscopy, vol. 108, no. 2, pp. 267-281.
View/Download from: Publisher's site
Erfani, S & Akhgar, B 2010, 'A novel knowledge management implementation model for mobile telecommunication industry', World Appl Sci J, vol. 11, no. 1, pp. 29-37.
Gaddis, EJB & Voinov, A 2010, 'Spatially Explicit Modeling of Land Use Specific Phosphorus Transport Pathways to Improve TMDL Load Estimates and Implementation Planning', Water Resources Management, vol. 24, no. 8, pp. 1621-1644.
View/Download from: Publisher's site
Gaddis, EJB, Falk, HH, Ginger, C & Voinov, A 2010, 'Effectiveness of a participatory modeling effort to identify and advance community water resource goals in St. Albans, Vermont', Environmental Modelling & Software, vol. 25, no. 11, pp. 1428-1438.
View/Download from: Publisher's site
Gallo, DA, Foster, KT, Wong, JT & Bennett, DA 2010, 'False recollection of emotional pictures in Alzheimer's disease', Neuropsychologia, vol. 48, no. 12, pp. 3614-3618.
View/Download from: Publisher's site
View description>>
Alzheimer's Disease (AD) can reduce the effects of emotional content on memory for studied pictures, but less is known about false memory. In healthy adults, emotionally arousing pictures can be more susceptible to false memory effects than neutral pictures, potentially because emotional pictures share conceptual similarities that cause memory confusions. We investigated these effects in AD patients and healthy controls. Participants studied pictures and their verbal labels, and then picture recollection was tested using verbal labels as retrieval cues. Some of the test labels had been associated with a picture at study, whereas others had not. On this picture recollection test, we found that both AD patients and controls incorrectly endorsed some of the test labels that had not been studied with pictures. These errors were associated with medium to high levels of confidence, indicating some degree of false recollection. Critically, these false recollection judgments were greater for emotional compared to neutral items, especially for positively valenced items, in both AD patients and controls. Dysfunction of the amygdala and hippocampus in early AD may impair recollection, but AD did not disrupt the effect of emotion on false recollection judgments.
Gao, Y, Zhang, G, Ma, J & Lu, J 2010, 'A lambda-Cut and Goal-Programming-Based Algorithm for Fuzzy-Linear Multiple-Objective Bilevel Optimization', IEEE Transactions on Fuzzy Systems, vol. 18, no. 1, pp. 1-13.
View/Download from: Publisher's site
View description>>
Bilevel-programming techniques are developed to handle decentralized problems with two-level decision makers, which are leaders and followers, who may have more than one objective to achieve. This paper proposes a λ-cut and goal-programming-based algorithm to solve fuzzy-linear multiple-objective bilevel (FLMOB) decision problems. First, based on the definition of a distance measure between two fuzzy vectors using λ-cut, a fuzzy-linear bilevel goal (FLBG) model is formulated, and related theorems are proved. Then, using a λ-cut for fuzzy coefficients and a goal-programming strategy for multiple objectives, a λ-cut and goal-programming-based algorithm to solve FLMOB decision problems is presented. A case study for a newsboy problem is adopted to illustrate the application and executing procedure of this algorithm. Finally, experiments are carried out to discuss and analyze the performance of this algorithm. © 2006 IEEE.
Gil-Lafuente, AM & Merigó, JM 2010, 'Decision Making Techniques in Political Management', Studies in Fuzziness and Soft Computing, vol. 254, pp. 389-405.
View/Download from: Publisher's site
View description>>
In this paper, we develop a new decision making model and apply it in political management. We use a framework based on the use of ideals in the decision process and several similarity measures such as the Hamming distance, the adequacy coefficient and the index of maximum and minimum level. For each similarity measure, we use different types of aggregation operators such as the simple average, the weighted average, the ordered weighted averaging (OWA) operator and the generalized OWA (GOWA) operator. This new approach considers several attributes and different scenarios that may occur in the uncertain environment. We see that depending on the particular type of aggregation operator used the results may lead to different decisions. © 2010 Springer-Verlag Berlin Heidelberg.
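The OWA operator used above weights arguments by their rank rather than their source, so it generalizes the maximum, minimum, and simple average. A minimal sketch (the scores and weights are illustrative, not taken from the chapter):

```python
def owa(values, weights):
    """Ordered weighted averaging: weight arguments by descending rank."""
    assert abs(sum(weights) - 1.0) < 1e-9, "OWA weights must sum to 1"
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

# Illustrative attribute scores for one political alternative.
scores = [0.6, 0.9, 0.3, 0.7]

maximum = owa(scores, [1.0, 0.0, 0.0, 0.0])      # all weight on the largest
average = owa(scores, [0.25, 0.25, 0.25, 0.25])  # uniform weights
optimistic = owa(scores, [0.4, 0.3, 0.2, 0.1])   # favours high scores
```

Moving weight toward the first positions makes the aggregation more optimistic; the paper's point that different operators can lead to different decisions follows directly from this rank-dependent weighting.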
Gui, L, Xu, Y, Liu, B, Gong, L & Li, Y 2010, 'An iterative decoding technique and architecture for RS concatenated TCM coding systems', IEEE Transactions on Consumer Electronics, vol. 56, no. 3, pp. 1288-1296.
View/Download from: Publisher's site
Guo, Z, Dong, Y, Wang, J & Lu, H 2010, 'The Forecasting Procedure for Long-Term Wind Speed in the Zhangye Area', Mathematical Problems in Engineering, vol. 2010, pp. 1-17.
View/Download from: Publisher's site
View description>>
The energy crisis has made it urgent to find alternative energy sources for sustainable energy supply; wind energy is one of the attractive alternatives. Within a wind energy system, wind speed is one key parameter; accurate forecasting of wind speed can minimize scheduling errors and in turn increase the reliability of the electric power grid and reduce power market ancillary service costs. This paper proposes a new hybrid model for long-term wind speed forecasting based on the first definite season index method and the Autoregressive Moving Average (ARMA) or Generalized Autoregressive Conditional Heteroskedasticity (GARCH) forecasting models. The forecasting errors are analyzed and compared with the ones obtained from the ARMA, GARCH, and Support Vector Machine (SVM) models; the simulation process and results show that the developed method is simple and quite efficient for daily average wind speed forecasting of the Hexi Corridor in China.
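The two-stage scheme the abstract outlines (remove a seasonal pattern with a season index, then fit a time-series model to the deseasonalized series) can be sketched as follows. This is a simplified stand-in, assuming a weekly cycle and substituting a least-squares AR(1) for the full ARMA/GARCH stage; the synthetic series and all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic daily wind-speed series with a weekly multiplicative pattern.
period = 7
pattern = np.array([1.3, 1.1, 0.9, 0.8, 0.9, 1.0, 1.0])
n = period * 52
series = np.tile(pattern, 52) * (6.0 + 0.5 * rng.normal(size=n))

# Season index: mean level at each position in the cycle, normalised so
# the indices average to 1.
by_pos = series.reshape(-1, period).mean(axis=0)
season_index = by_pos / by_pos.mean()

# Deseasonalise, then fit an AR(1) to the residual series by least squares
# (a minimal stand-in for the ARMA/GARCH stage of the hybrid model).
deseason = series / np.tile(season_index, 52)
x, y = deseason[:-1], deseason[1:]
phi = np.polyfit(x, y, 1)  # [slope, intercept] of y regressed on x

# One-step-ahead forecast, re-seasonalised with the next position's index.
next_pos = n % period
forecast = (phi[0] * deseason[-1] + phi[1]) * season_index[next_pos]
```

The season index captures the deterministic cycle, so the time-series model only has to explain the remaining stochastic variation, which is the motivation for the hybrid design.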
Hamade, RF, Moulianitis, VC, D'Addonna, D & Beydoun, G 2010, 'A dimensional tolerancing knowledge management system using Nested Ripple Down Rules (NRDR)', Engineering Applications of Artificial Intelligence, vol. 23, no. 7, pp. 1140-1148.
View/Download from: Publisher's site
Huo, H, Chen, Q-K, Wang, G-R, Peng, D-L, Hao, J-T & Gao, L-P 2010, 'The Adaptive Fragmentation for XML Stream Dissemination', Chinese Journal of Computers, vol. 33, no. 10, pp. 1953-1962.
View/Download from: Publisher's site
Huo, H, Wang, G, Chen, Q & Peng, D 2010, 'SLCA algorithm for XML streams based on hole-filler model', Jisuanji Yanjiu yu Fazhan/Computer Research and Development, vol. 47, no. 5, pp. 886-892.
View description>>
Unlike in traditional databases, queries on XML streams are bounded not only by memory but also by real-time processing. A novel technique for keyword search over streamed XML fragments is presented, which adopts the broadcast model and the hole-filler model for XML fragment dissemination, addressing the problem of disordered fragment transmission and considering the quality of search results affected by either keyword mismatch or data absence. Two efficient indexes for candidate elements are developed to further improve performance: a hierarchical hash table and an LCA table. The former indexes structural keywords, which form the structure of the result, while the latter indexes condition keywords, which refine the keyword search condition. The SLCA computing algorithm, which is triggered by condition keywords, only computes the candidate fragments that involve keywords, thus avoiding redundant operations that will not contribute to the final result. The algorithm produces part of the matched answers continuously without having to wait for the end of the stream. The experiments evaluate the performance of the SLCA algorithm with different types of keywords, different document fragmentations and different keyword frequencies, and compare the SLCA algorithm with other XML keyword matching algorithms. The experimental study shows that the SLCA algorithm performs well in saving processing power and memory space.
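The SLCA (smallest lowest common ancestor) semantics underlying the algorithm can be stated compactly with Dewey labels, where a node's label is the path of child indices from the root and an LCA is a longest common prefix. The brute-force sketch below is just the definition applied to a toy tree (the node labels and keyword occurrences are invented); the streamed, fragment-at-a-time algorithm in the paper avoids this exhaustive enumeration.

```python
from itertools import product

def lca(a, b):
    # Lowest common ancestor of two Dewey labels = longest common prefix.
    i = 0
    while i < min(len(a), len(b)) and a[i] == b[i]:
        i += 1
    return a[:i]

def slca(keyword_lists):
    # SLCA by definition: take the LCA of every combination of one match
    # per keyword, then keep only candidates with no candidate descendant.
    candidates = set()
    for combo in product(*keyword_lists):
        node = combo[0]
        for other in combo[1:]:
            node = lca(node, other)
        candidates.add(node)
    return {c for c in candidates
            if not any(d != c and d[:len(c)] == c for d in candidates)}

# Toy tree: () is the root, (0,) its first child, (0, 0) that child's
# first child, and so on. "xml" matches nodes (0, 0) and (1, 0);
# "stream" matches (0, 1). Node (0,) is the deepest node covering both.
xml_nodes = [(0, 0), (1, 0)]
stream_nodes = [(0, 1)]
result = slca([xml_nodes, stream_nodes])
```

The filtering step is what makes the result "smallest": the root () is a common ancestor too, but it is discarded because (0,) is a deeper candidate beneath it.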
Juszczyszyn, K, Kazienko, P & Musiał, K 2010, 'Personalized Ontology-Based Recommender Systems for Multimedia Objects', Studies in Computational Intelligence, vol. 289, pp. 275-292.
View/Download from: Publisher's site
View description>>
A framework for recommendation of multimedia objects based on processing of individual ontologies is proposed in the chapter. The recommendation process takes into account similarities calculated both between objects' and users' ontologies, which reflect the social and semantic features existing in the system. The ontologies which are close to the current context provide a list of suggestions presented to the user. Each user in the system possesses its own Personal Agent that performs all necessary online tasks. Personal Agents co-operate with each other and enrich lists of possible recommendations. The system was developed for use in the Flickr multimedia sharing system. © 2010 Springer-Verlag Berlin Heidelberg.
Lin, K-L, Lin, C-T & Pal, NR 2010, 'Incremental Mountain Clustering Method to Find Building Blocks for Constructing Structures of Proteins', IEEE Transactions on NanoBioscience, vol. 9, no. 4, pp. 278-288.
View/Download from: Publisher's site
Kennard, R & Leaney, J 2010, 'Towards a general purpose architecture for UI generation', Journal of Systems and Software, vol. 83, no. 10, pp. 1896-1906.
View/Download from: Publisher's site
View description>>
Many software projects spend a significant proportion of their time developing the User Interface (UI), therefore any degree of automation in this area has clear benefits. Such automation is difficult due principally to the diversity of architectures, platforms and development environments. Attempts to automate UI generation to date have contained restrictions which did not accommodate this diversity, leading to a lack of wide industry adoption or standardisation. The authors set out to understand and address these restrictions. We studied the issues of UI generation (especially duplication) in practice, using action research cycles guided by interviews, adoption studies and close collaboration with industry practitioners. In addressing the issues raised in our investigation, we identified five key characteristics any UI generation technique would need before it should expect wide adoption or standardisation. These can be summarised as: inspecting existing, heterogeneous back-end architectures; appreciating different practices in applying inspection results; recognising multiple, and mixtures of, UI widget libraries; supporting multiple, and mixtures of, UI adornments; applying multiple, and mixtures of, UI layouts. Many of these characteristics seem ignored by current approaches. In addition, we discovered an emergent feature of these characteristics that opens the possibility of a previously unattempted goal, namely, retrofitting UI generation to an existing application.
Kuo, F-C, Zhou, ZQ, Ma, J & Zhang, G 2010, 'Metamorphic testing of decision support systems: a case study', IET Software, vol. 4, no. 4, p. 294.
View/Download from: Publisher's site
View description>>
Decision support systems provide critical support to decision makers. These systems are increasingly complex and, as a result, they are very difficult to test because of the lack of an ideal test oracle. Lack of testing may result in poor software quality.
Liao, L-D, Li, M-L, Lai, H-Y, Shih, Y-YI, Lo, Y-C, Tsang, S, Chao, PC-P, Lin, C-T, Jaw, F-S & Chen, Y-Y 2010, 'Imaging brain hemodynamic changes during rat forepaw electrical stimulation using functional photoacoustic microscopy', NeuroImage, vol. 52, no. 2, pp. 562-570.
View/Download from: Publisher's site
Gui, L, Ma, W, Liu, B, Lu, J & Shen, P 2010, 'Single Frequency Network System Coverage and Trial Testing of High Speed Railway Television System', IEEE Transactions on Broadcasting, vol. 56, no. 2, pp. 160-170.
View/Download from: Publisher's site
Lin, C-T, Chang, C-J, Lin, B-S, Hung, S-H, Chao, C-F & Wang, I-J 2010, 'A Real-Time Wireless Brain–Computer Interface System for Drowsiness Detection', IEEE Transactions on Biomedical Circuits and Systems, vol. 4, no. 4, pp. 214-222.
View/Download from: Publisher's site
Lin, C-T, Huang, K-C, Chao, C-F, Chen, J-A, Chiu, T-W, Ko, L-W & Jung, T-P 2010, 'Tonic and phasic EEG and behavioral changes induced by arousing feedback', NeuroImage, vol. 52, no. 2, pp. 633-642.
View/Download from: Publisher's site
Lin, C-T, Ko, L-W, Chang, M-H, Duann, J-R, Chen, J-Y, Su, T-P & Jung, T-P 2010, 'Review of Wireless and Wearable Electroencephalogram Systems and Brain-Computer Interfaces – A Mini-Review', Gerontology, vol. 56, no. 1, pp. 112-119.
View/Download from: Publisher's site
View description>>
Biomedical signal monitoring systems have rapidly advanced in recent years, propelled by significant advances in electronic and information technologies. Brain-computer interface (BCI) is one of the important research branches and has become a hot topic in the study of neural engineering, rehabilitation, and brain science. Traditionally, most BCI systems use bulky, wired laboratory-oriented sensing equipment to measure brain activity under well-controlled conditions within a confined space. Using bulky sensing equipment is not only uncomfortable and inconvenient for users, but also impedes their ability to perform routine tasks in daily operational environments. Furthermore, owing to large data volumes, signal processing of BCI systems is often performed off-line using high-end personal computers, hindering the applications of BCI in real-world environments. To be practical for routine use by unconstrained, freely-moving users, BCI systems must be noninvasive, nonintrusive, lightweight and capable of online signal processing. This work reviews recent online BCI systems, focusing especially on wearable, wireless and real-time systems.
Lin, C-T, Shen, T-K & Shou, Y-W 2010, 'Construction of Fisheye Lens Inverse Perspective Mapping Model and Its Applications of Obstacle Detection', EURASIP Journal on Advances in Signal Processing, vol. 2010, no. 1.
View/Download from: Publisher's site
Lin, C-T, Yang, C-T, Shou, Y-W & Shen, T-K 2010, 'An Efficient and Robust Moving Shadow Removal Algorithm and Its Applications in ITS', EURASIP Journal on Advances in Signal Processing, vol. 2010, no. 1.
View/Download from: Publisher's site
Lister, R 2010, 'Computing Education Research: Geek genes and bimodal grades', ACM Inroads, vol. 1, no. 3, pp. 16-17.
View/Download from: Publisher's site
View description>>
This is a regular invited column I write for this journal.
Lister, R 2010, 'Computing Education Research: Teaching the super profs to fish', ACM Inroads, vol. 1, no. 2, pp. 16-17.
View/Download from: Publisher's site
View description>>
This is a regular invited column I write for this journal.
Lister, R 2010, 'CS Education Research: The naughties in CSEd research', ACM Inroads, vol. 1, no. 1, pp. 22-24.
View/Download from: Publisher's site
View description>>
This is a regular invited column I write for this journal.
Lister, R 2010, 'CS Research', ACM SIGCSE Bulletin, vol. 41, no. 4, pp. 13-14.
View/Download from: Publisher's site
View description>>
This is a regular invited column I write for this journal.
Lister, R 2010, 'The closing of the CSEd mind', ACM Inroads, vol. 1, no. 4, pp. 17-18.
View/Download from: Publisher's site
View description>>
This is a regular invited column I write for this journal.
Lister, R, Clear, T, Simon, Bouvier, DJ, Carter, P, Eckerdal, A, Jacková, J, Lopez, M, McCartney, R, Robbins, P, Seppälä, O & Thompson, E 2010, 'Naturally occurring data as research instrument', ACM SIGCSE Bulletin, vol. 41, no. 4, pp. 156-173.
View/Download from: Publisher's site
View description>>
In New Zealand and Australia, the BRACElet project has been investigating students' acquisition of programming skills in introductory programming courses. The project has explored students' skills in basic syntax, tracing code, understanding code, and writing code, seeking to establish the relationships between these skills. This ITiCSE working group report presents the most recent step in the BRACElet project, which includes replication of earlier analysis using a far broader pool of naturally occurring data, refinement of the SOLO taxonomy in code-explaining questions, extension of the taxonomy to code-writing questions, extension of some earlier studies on students' 'doodling' while answering exam questions, and exploration of a further theoretical basis for work that until now has been primarily empirical.
Lu, H, Sriyanyong, P, Song, YH & Dillon, T 2010, 'Experimental study of a new hybrid PSO with mutation for economic dispatch with non-smooth cost function', International Journal of Electrical Power & Energy Systems, vol. 32, no. 9, pp. 921-935.
View/Download from: Publisher's site
View description>>
Particle swarm optimization (PSO) is a population-based evolutionary technique. Advancements in the PSO development over the last decade have made it one of the most promising optimization algorithms for a wide range of complex engineering optimization problems which traditional derivative-based optimization techniques cannot handle. The most attractive features of PSO are its algorithmic simplicity and fast convergence. However, PSO tends to suffer from premature convergence when applied to strongly multi-modal optimization problems. This paper proposes a method of incorporating a real-valued mutation (RVM) operator into the PSO algorithms, aimed at enhancing global search capability. Three variants of PSO algorithms are considered. The resultant hybrid PSO-RVM algorithms are experimentally investigated along with the PSO variants and an existing PSO with Gaussian mutation using six typical benchmark functions. It is interesting to see that the effectiveness of RVM varies for different PSO variants as well as different kinds of functions. It has been found that one of the hybrid algorithms, CBPSO-RVM, which is an integration of the PSO with the constriction factor and inertia weight (CBPSO) and the RVM operator, exhibits significantly better performance in most of the test cases compared to the other algorithms under consideration. Furthermore, this algorithm is superior to most of the existing algorithms used in this study when applied to two practical ED problems with non-smooth cost functions considering multiple fuel types and/or valve-point loading effects.
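The hybrid scheme described (a standard PSO velocity/position update followed by a probabilistic real-valued mutation of particle positions) can be sketched on a multi-modal benchmark. This is a generic PSO-with-Gaussian-mutation sketch, not the paper's tuned CBPSO-RVM: the inertia and acceleration coefficients, mutation rate, and mutation scale below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)

def rastrigin(x):
    # Strongly multi-modal benchmark on which plain PSO tends to converge
    # prematurely; the global minimum is 0 at the origin.
    return 10 * x.shape[-1] + np.sum(x**2 - 10 * np.cos(2 * np.pi * x), axis=-1)

dim, swarm, iters = 5, 30, 200
w, c1, c2 = 0.72, 1.49, 1.49      # illustrative inertia/acceleration values
p_mut, mut_scale = 0.1, 0.5       # illustrative mutation rate and scale

pos = rng.uniform(-5.12, 5.12, size=(swarm, dim))
vel = np.zeros((swarm, dim))
pbest, pbest_val = pos.copy(), rastrigin(pos)
init_best = float(pbest_val.min())
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(iters):
    # Standard PSO velocity/position update toward personal and global bests.
    r1, r2 = rng.random((swarm, dim)), rng.random((swarm, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, -5.12, 5.12)

    # Real-valued mutation: element-wise Gaussian perturbation applied with
    # probability p_mut, helping particles escape local optima.
    mask = rng.random((swarm, dim)) < p_mut
    pos = np.clip(pos + mask * rng.normal(scale=mut_scale, size=(swarm, dim)),
                  -5.12, 5.12)

    val = rastrigin(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

best = float(pbest_val.min())
```

The mutation step is the only addition over plain PSO; in the economic dispatch setting, `rastrigin` would be replaced by the non-smooth fuel-cost function and `pos` clipped to the generator limits.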
Lu, J & Zhang, G 2010, 'A special issue on decision intelligence with soft computing', Soft Computing, vol. 14, no. 12, pp. 1253-1254.
View/Download from: Publisher's site
Lu, J, Ruan, D & Zhang, G 2010, 'A special issue on Intelligent Decision Support and Warning Systems', Knowledge-Based Systems, vol. 23, no. 1, pp. 1-2.
View/Download from: Publisher's site
Lu, J, Shambour, Q, Xu, Y, Lin, Q & Zhang, G 2010, 'BizSeeker: A hybrid semantic recommendation system for personalized government-to-business e-services', Internet Research, vol. 20, no. 3, pp. 342-365.
View/Download from: Publisher's site
View description>>
Purpose - The purpose of this paper is to develop a hybrid semantic recommendation system to provide personalized government to business (G2B) e-services, in particular, business partner recommendation e-services for Australian small to medium enterprises (SMEs). Design/methodology/approach - The study first proposes a product semantic relevance model. It then develops a hybrid semantic recommendation approach which combines item-based collaborative filtering (CF) similarity and item-based semantic similarity techniques. This hybrid approach is implemented into an intelligent business-partner-locator recommendation-system prototype called BizSeeker. Findings - The hybrid semantic recommendation approach can help overcome the limitations of existing recommendation techniques. The recommendation system prototype, BizSeeker, can recommend relevant business partners to individual business users (e.g. exporters), which therefore will reduce the time, cost and risk of businesses involved in entering local and international markets. Practical implications - The study would be of great value in e-government personalization research. It would facilitate the transformation of the current G2B e-services into a new stage wherein the e-government agencies offer personalized e-services to business users. The study would help government policy decision-makers to increase the adoption of e-government services. Originality/value - Providing personalized e-services by e-government can be seen as an evolution of the intentions-based approach and will be one of the next directions of government e-services. This paper develops a new recommender approach and systems to improve personalization of government e-services.
Lu, J, Wang, C, Zhang, G & Ma, J 2010, 'Collaborative management of web ontology data with flexible access control', Expert Systems with Applications, vol. 37, no. 5, pp. 3737-3746.
View/Download from: Publisher's site
View description>>
The creation and management of ontology data on web sites (e.g. instance data that is used to annotate web pages) is important technical support for the growth of the semantic web. This study identifies some key issues for web ontology data management and describes an ontology data management system, called robinet, to perform the management. This paper presents the structure of the system and introduces a Web ontology data management model that enables a flexible access control mechanism. This model adds rules into the robinet system to utilize the semantics of ontology for controlling the access to ontology data. The implementation of the rule-based access control mechanism and related testing are also discussed.
Ma, J, Lu, J & Zhang, G 2010, 'Decider: A fuzzy multi-criteria group decision support system', Knowledge-Based Systems, vol. 23, no. 1, pp. 23-31.
View/Download from: Publisher's site
View description>>
Multi-criteria group decision making (MCGDM) aims to support preference-based decision over the available alternatives that are characterized by multiple criteria in a group. To increase the level of overall satisfaction for the final decision across the group and deal with uncertainty in decision process, a fuzzy MCGDM process (FMP) model is established in this study. This FMP model can also aggregate both subjective and objective information under multi-level hierarchies of criteria and evaluators. Based on the FMP model, a fuzzy MCGDM decision support system (called Decider) is developed, which can handle information expressed in linguistic terms, boolean values, as well as numeric values to assess and rank a set of alternatives within a group of decision makers. Real applications indicate that the presented FMP model and the Decider software are able to effectively handle fuzziness in both subjective and objective information and support group decision-making under multi-level criteria with a higher level of satisfaction by decision makers. © 2009 Elsevier B.V. All rights reserved.
Ma, J, Lu, J & Zhang, G 2010, 'Team situation awareness measure using semantic utility functions for supporting dynamic decision-making', Soft Computing, vol. 14, no. 12, pp. 1305-1316.
View/Download from: Publisher's site
View description>>
Team decision-making is a remarkable feature in a complex dynamic decision environment, which can be supported by team situation awareness. In this paper, a team situation awareness measure (TSAM) method using a semantic utility function is proposed. The semantic utility function is used to clarify the semantics of qualitative information expressed in linguistic terms. The individual and team situation awareness are treated as linguistic possibility distributions on the potential decisions in a dynamic decision environment. In the TSAM method, team situation awareness is generated through reasoning and aggregating individual situation awareness based on a multi-level hierarchy mental model of the team. Individual and team mental models are composed of key drivers and significant variables. An illustrative example in telecoms customer churn prediction is given to explain the effectiveness and the main steps of the TSAM method. © 2009 Springer-Verlag.
Ma, J, Zhang, G & Lu, J 2010, 'A state-based knowledge representation approach for information logical inconsistency detection in warning systems', Knowledge-Based Systems, vol. 23, no. 2, pp. 125-131.
View/Download from: Publisher's site
View description>>
Detecting logical inconsistency in collected information is a vital function when deploying a knowledge-based warning system to monitor a specific application domain, because logical inconsistency is often hidden within seemingly consistent information and may lead to unexpected results. Existing logical inconsistency detection methods usually focus on information stored in a knowledge base using a well-defined general-purpose knowledge representation approach, and therefore cannot fulfill the demands of a domain-specific situation. This paper first proposes a state-based knowledge representation approach, in which domain-specific knowledge is expressed by combinations of the relevant objects' states. Based on this approach, a method for information logical inconsistency detection (ILID) is developed which can flexibly handle the demands of various domain-specific situations by relaxing some of the restrictions imposed by existing methods. Finally, two real-case based examples are presented to illustrate the ILID method and its advantages. © 2009 Elsevier B.V. All rights reserved.
MacDougall, C, McGregor, C & Percival, J 2010, 'The fusion of clinical guidelines with technology: Trends & challenges', Electronic Journal of Health Informatics, vol. 5, no. 2.
View description>>
The use of Health Information Technology (HIT) within the healthcare setting can be a great resource contributing to improved patient care. Clinical guidelines are developed to aid the decision-making process of healthcare professionals and contain the leading edge of best patient practice. There is an abundance of evidence presenting the benefits of HIT; however, its use is rarely incorporated in today's clinical guidelines. Although research suggests that the benefits of HIT are sufficient to justify integrating its use in clinical guidelines, a number of challenges interfere with its implementation, such as cultural diversity, the interdisciplinary nature of healthcare, workers' lack of HIT knowledge, the evolution of technology, heavy clinical workloads and developers of HIT lacking a medical background. The purpose of this research project is to present a literature review to further understand the trends and challenges of implementing HIT use within clinical guidelines. A modelling system, PaJMa, is also introduced to visually depict a patient's journey and the methods of documentation. PaJMa can aid in discovering gaps in healthcare documentation and closing those gaps through HIT use within clinical guidelines. Further research revealed that there are models and approaches supporting the process and creation of clinical guidelines, but none of these enable the inclusion of what technology will be used to support the implementation of these procedures. The research project concludes with ideas for future research in the area of clinical guideline development and HIT implementation. © of articles is retained by authors.
Mathieson, L 2010, 'The parameterized complexity of editing graphs for bounded degeneracy', Theoretical Computer Science, vol. 411, no. 34-36, pp. 3181-3187.
View/Download from: Publisher's site
McGregor, C & Eklund, JM 2010, 'Next generation remote critical care through service-oriented architectures: challenges and opportunities', Service Oriented Computing and Applications, vol. 4, no. 1, pp. 33-43.
View/Download from: Publisher's site
View description>>
Health care providers and governments are under pressure to maintain and improve the quality of care for an increasing volume of critical care patients at either end of the life cycle, namely premature and ill term babies together with the elderly. The provision of a critical care service utilizing real-time service-oriented architectures has the potential to support clinicians in the care of a greater number of patients who are, perhaps more importantly, located remotely from their intensive care units. This paper presents a review of recent research in the application of computing and IT to support the service of critical care, and determines the trends and challenges for the application of real-time service-oriented architectures within the domain. It then presents some case study-based research on the design of a service-oriented architecture-based approach to support two aspects of critical care, namely elderly care and neonatal intensive care, to provide further context to trends and opportunities. © 2010 Springer-Verlag London Limited.
Mellor, D, Prieto, E, Mathieson, L & Moscato, P 2010, 'A Kernelisation Approach for Multiple d-Hitting Set and Its Application in Optimal Multi-Drug Therapeutic Combinations', PLoS ONE, vol. 5, no. 10, pp. e13055-e13055.
View/Download from: Publisher's site
View description>>
Therapies consisting of a combination of agents are an attractive proposition, especially in the context of diseases such as cancer, which can manifest with a variety of tumor types in a single case. However, uncovering usable drug combinations is expensive both financially and temporally. By employing computational methods to identify candidate combinations with a greater likelihood of success we can avoid these problems, even when the amount of data is prohibitively large. HITTING SET is a combinatorial problem that has useful applications across many fields; however, as it is NP-complete it is traditionally considered hard to solve exactly. We introduce a more general version of the problem, (α,β,d)-HITTING SET, which allows more precise control over how and what the hitting set targets. Employing the framework of Parameterized Complexity we show that despite being NP-complete, the (α,β,d)-HITTING SET problem is fixed-parameter tractable with a kernel of size O(αdk^d) when we parameterize by the size k of the hitting set and the maximum α of the minimum number of hits, taking the maximum degree d of the target sets as a constant. We demonstrate the application of this problem to multiple drug selection for cancer therapy, showing the flexibility of the problem in tailoring such drug sets. The fixed-parameter tractability result indicates that for low values of the parameters the problem can be solved quickly using exact methods. We also demonstrate that the problem is indeed practical, with computation times on the order of 5 seconds, as compared to previous Hitting Set applications using the same dataset which exhibited times on the order of 1 day, even with relatively relaxed notions for what constitutes a low value for the parameters. Furthermore, the existence of a kernelization for (α,β,d)-HITTING SET indicates that the problem is readily scalable to large datasets. © 2010 Mellor et al.
Merigó Lindahl, JM & Casanovas Ramón, M 2010, 'The generalized hybrid averaging operator and its application in decision making', Revista de Metodos Cuantitativos para la Economia y la Empresa, vol. 9, no. 1, pp. 69-84.
View description>>
We present the generalized hybrid averaging (GHA) operator. It is a new aggregation operator that generalizes the hybrid averaging (HA) operator by using the generalized mean. Thus, we are able to generalize a wide range of mean operators such as the HA, the hybrid geometric averaging (HGA), the hybrid quadratic averaging (HQA), the generalized ordered weighted averaging (GOWA) operator and the weighted generalized mean (WGM). A key feature in this aggregation operator is that it is able to deal with the weighted average and the ordered weighted averaging (OWA) operator in the same formulation. We further generalize the GHA by using quasi-arithmetic means obtaining the quasi-arithmetic hybrid averaging (Quasi-HA) operator. We conclude the paper with an example of the new approach in a financial decision making problem.
Merigó, JM 2010, 'Fuzzy decision making with immediate probabilities', Computers & Industrial Engineering, vol. 58, no. 4, pp. 651-657.
View/Download from: Publisher's site
Merigo, JM & Casanovas, M 2010, 'Induced and heavy aggregation operators with distance measures', Journal of Systems Engineering and Electronics, vol. 21, no. 3, pp. 431-439.
View/Download from: Publisher's site
Merigó, JM & Casanovas, M 2010, 'Decision making with distance measures and linguistic aggregation operators', International Journal of Fuzzy Systems, vol. 12, no. 3, pp. 190-198.
View description>>
We present a new decision making model with distance measures by using linguistic aggregation operators. We introduce a new aggregation operator called the linguistic ordered weighted averaging distance (LOWAD) operator. This aggregation operator provides a parameterized family of linguistic aggregation operators that includes the maximum distance, the minimum distance, the linguistic normalized Hamming distance and the linguistic weighted Hamming distance, among others. We study some of its main properties and different families of LOWAD operators such as the median-LOWAD, the Olympic-LOWAD, the S-LOWAD and the centered-LOWAD. We also develop an application of the new approach in a decision making problem concerning human resource management. © 2010 TFSA.
Merigó, JM & Casanovas, M 2010, 'Fuzzy generalized hybrid aggregation operators and its application in fuzzy decision making', International Journal of Fuzzy Systems, vol. 12, no. 1, pp. 15-24.
View description>>
The hybrid averaging (HA) is an aggregation operator that uses the weighted average (WA) and the ordered weighted averaging (OWA) operator in the same formulation. In this paper, we introduce several generalizations of the HA operator by using generalized and quasi-arithmetic means, fuzzy numbers and order inducing variables in the reordering step of the aggregation process. We present the fuzzy generalized hybrid averaging (FGHA) operator, the fuzzy induced generalized hybrid averaging (FIGHA) operator, the Quasi-FHA operator and the Quasi-FIHA operator. The main advantage of these operators is that they generalize a wide range of fuzzy aggregation operators that can be used in a wide range of applications such as decision making problems. For example, we could mention the fuzzy induced hybrid averaging (FIHA), the fuzzy weighted generalized mean (FWGM) and the fuzzy induced generalized OWA (FIGOWA). We end the paper with an application of the new approach in a decision making problem. © 2010 TFSA.
Merigó, JM & Casanovas, M 2010, 'The fuzzy generalized OWA operator and its application in strategic decision making', Cybernetics and Systems, vol. 41, no. 5, pp. 359-370.
View/Download from: Publisher's site
Merigó, JM & Gil-Lafuente, AM 2010, 'New decision-making techniques and their application in the selection of financial products', Information Sciences, vol. 180, no. 11, pp. 2085-2094.
View/Download from: Publisher's site
Merigó, JM, Casanovas, M & Martínez, L 2010, 'Linguistic aggregation operators for linguistic decision making based on the Dempster-Shafer theory of evidence', International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, vol. 18, no. 03, pp. 287-304.
View/Download from: Publisher's site
View description>>
In this paper, we develop a new approach for decision making with the Dempster-Shafer theory of evidence by using linguistic information. We suggest the use of different types of linguistic aggregation operators in the model. As a result, we obtain the belief structure-linguistic ordered weighted averaging (BS-LOWA), the BS-linguistic hybrid averaging (BS-LHA) and a wide range of particular cases. Some of their main properties are studied. Finally, we provide an illustrative example that shows the different results obtained by using different types of linguistic aggregation operators in the new approach.
Merigó, JM, Gil Lafuente, AM & Barcellos, L 2010, 'Uncertain induced generalized aggregation operators and its application in the theory of expertons', Fuzzy Economic Review, vol. 15, no. 02, pp. 25-42.
View/Download from: Publisher's site
View description>>
We present a new approach that unifies the induced generalized ordered weighted averaging (IGOWA) operator with the weighted average (WA) when the available information is uncertain and can be assessed with interval numbers. We call it the uncertain induced generalized ordered weighted averaging - weighted averaging (UIGOWAWA) operator. The main advantage of this approach is that it unifies the IGOWA and the WA, taking into account the degree of importance of each case in the formulation and considering that the information is given with interval numbers. We also study different properties of the UIGOWAWA operator and different particular cases. We then analyze the applicability of the new approach and see that a wide range of applications can be developed, because all the previous studies that use the WA can be revised and extended with this new approach. We focus on an application in decision making with the theory of expertons. Thus, we are able to assess group decision making problems in a more complete way.
Miliszewska, I & Sztendur, E 2010, 'Interest in ICT Studies and Careers: Perspectives of Secondary School Female Students from Low Socioeconomic Backgrounds', Interdisciplinary Journal of Information, Knowledge, and Management, vol. 5, pp. 237-260.
View/Download from: Publisher's site
View description>>
The under-representation of females in information and communication technology (ICT) fields of study and careers continues to attract considerable attention. This article discusses findings of a research study that investigated interest in ICT studies and careers among female secondary school students. The investigation focused on girls from schools in the Western suburbs of Melbourne, Australia, home to families with low socio-economic status and students exposed to "educational disadvantage." The article outlines the demographic background of the participating girls, their experiences with ICT, self-efficacy of ICT skills, and their preferences for future studies and careers in ICT. It also analyzes factors that might have influenced the girls' attitudes towards ICT studies and careers, including their ethnic background, exposure to ICT at school and home, and perceptions of ICT. The article concludes with suggestions for future research.
Milton, J & Kennedy, PJ 2010, 'Static and Dynamic Selection Thresholds Governing the Accumulation of Information in Genetic Algorithms Using Ranked Populations', Evolutionary Computation, vol. 18, no. 2, pp. 229-254.
View/Download from: Publisher's site
View description>>
Mutation applied indiscriminately across a population has, on average, a detrimental effect on the accumulation of solution alleles within the population and is usually beneficial only when targeted at individuals with few solution alleles. Many common selection techniques can delete individuals with more solution alleles than are easily recovered by mutation. The paper identifies static and dynamic selection thresholds governing accumulation of information in a genetic algorithm (GA). When individuals are ranked by fitness, there exists a dynamic threshold defined by the solution density of surviving individuals and a lower static threshold defined by the solution density of the information source used for mutation. Replacing individuals ranked below the static threshold with randomly generated individuals avoids the need for mutation while maintaining diversity in the population with a consequent improvement in population fitness. By replacing individuals ranked between the thresholds with randomly selected individuals from above the dynamic threshold, population fitness improves dramatically. We model the dynamic behavior of GAs using these thresholds and demonstrate their effectiveness by simulation and benchmark problems.
Munro, WJ, Harrison, KA, Stephens, AM, Devitt, SJ & Nemoto, K 2010, 'From quantum multiplexing to high-performance quantum networking', Nature Photonics, vol. 4, no. 11, pp. 792-796.
View/Download from: Publisher's site
Myles, A, Pietroni, N, Kovacs, D & Zorin, D 2010, 'Feature-aligned T-meshes', ACM Transactions on Graphics, vol. 29, no. 4, pp. 1-11.
View/Download from: Publisher's site
View description>>
High-order and regularly sampled surface representations are more efficient and compact than general meshes and considerably simplify many geometric modeling and processing algorithms. A number of recent algorithms for conversion of arbitrary meshes to regularly sampled form (typically quadrangulation) aim to align the resulting mesh with feature lines of the geometry. While resulting in a substantial improvement in mesh quality, feature alignment makes it difficult to obtain coarse regular patch partitions of the mesh. In this paper, we propose an approach to constructing patch layouts consisting of small numbers of quadrilateral patches while maintaining good feature alignment. To achieve this, we use quadrilateral T-meshes, for which the intersection of two faces may not be the whole edge or vertex, but a part of an edge. T-meshes offer more flexibility for reduction of the number of patches and vertices in a base domain while maintaining alignment with geometric features. At the same time, T-meshes retain many desirable features of quadrangulations, allowing construction of high-order representations, easy packing of regularly sampled geometric data into textures, as well as supporting different types of discretizations for physical simulation.
Pattinson, HM & Sood, SC 2010, 'Marketers expressing the future: Scenario planning for marketing action', Futures, vol. 42, no. 4, pp. 417-426.
View/Download from: Publisher's site
View description>>
Thomas Friedman exhorts us to imagine the future [1] - we urge marketers to invent the future, to learn the future faster, and to deliver the future earlier. Marketers are asked to develop scenarios about emerging technologies such as broadband wireless
Pietroni, N, Cignoni, P, Otaduy, MA & Scopigno, R 2010, 'Solid-Texture Synthesis: A Survey.', IEEE Computer Graphics and Applications, vol. 30, no. 4, pp. 74-89.
View/Download from: Publisher's site
Pietroni, N, Tarini, M & Cignoni, P 2010, 'Almost Isometric Mesh Parameterization through Abstract Domains.', IEEE Trans. Vis. Comput. Graph., vol. 16, pp. 621-635.
View/Download from: Publisher's site
Rizzi, R, Mahata, P, Mathieson, L & Moscato, P 2010, 'Hierarchical Clustering Using the Arithmetic-Harmonic Cut: Complexity and Experiments', PLoS ONE, vol. 5, no. 12, pp. e14067-e14067.
View/Download from: Publisher's site
View description>>
Clustering, particularly hierarchical clustering, is an important method for understanding and analysing data across a wide variety of knowledge domains, with notable utility in systems where the data can be classified in an evolutionary context. This paper introduces a new hierarchical clustering problem defined by a novel objective function we call the arithmetic-harmonic cut. We show that the problem of finding such a cut is NP-hard and APX-hard but is fixed-parameter tractable, which indicates that although the problem is unlikely to have a polynomial time algorithm (even for approximation), exact parameterized and local search based techniques may produce workable algorithms. To this end, we implement a memetic algorithm for the problem and demonstrate the effectiveness of the arithmetic-harmonic cut on a number of datasets including a cancer type dataset and a corona virus dataset. We show favorable performance compared to currently used hierarchical clustering techniques such as k-MEANS, Graclus and NORMALIZED-CUT. The arithmetic-harmonic cut metric overcomes difficulties other hierarchical methods have in representing both intercluster differences and intracluster similarities. © 2010 Rizzi et al.
Ruan, D, Lu, J, Laes, E, Zhang, G, Ma, J & Meskens, G 2010, 'Multi-criteria Group Decision Support with Linguistic Variables in Long-term Scenarios for Belgian Energy Policy', Journal of Universal Computer Science, vol. 16, no. 1, pp. 103-120.
View description>>
Real-world decisions are often made in the presence of multiple, conflicting, and incommensurate criteria. Decision making requires the perspectives of different individuals, as more decisions are now made in groups than ever before. This is particularly true when the decision environment becomes more complex, such as in sustainability policy studies in the environmental and energy sectors. Group decision making produces judgments or solutions for decision problems based on the input and feedback of multiple individuals. Multi-criteria decision and evaluation problems at tactical and strategic levels in practice involve fuzziness in terms of linguistic variables vis-à-vis criteria, weights, and decision maker judgments. Relevant alternatives or scenarios are evaluated according to a number of desired criteria. In this paper, a fuzzy multi-criteria group decision software tool is developed to analyze long-term scenarios for Belgian energy policy. © J.UCS.
Seely, AJE, Macklem, PT, Suki, B, Goldberger, A, Godin, P, Batchinsky, AI, Longtin, A, Jones, G, Seiver, A, McGregor, C, Norris, P, Maksym, G, Lake, D, Costa, MD, Marshall, JC, Morris, JA, Moorman, JR, Arnold, RC, Perez-Velazquez, JL & Nenadovic, V 2010, 'The Wakefield roundtable discussion on complexity and variability at the bedside', Journal of Critical Care, vol. 25, no. 3, pp. 536-537.
View/Download from: Publisher's site
Sheridan, L, Le, TP & Scarani, V 2010, 'Finite-key security against coherent attacks in quantum key distribution', New J. Phys., vol. 12, p. 123019.
View description>>
The work by Christandl, König and Renner [Phys. Rev. Lett. 102, 020504 (2009)] provides in particular the possibility of studying unconditional security in the finite-key regime for all discrete-variable protocols. We spell out this bound from their general formalism. Then we apply it to the study of a recently proposed protocol [Laing et al., Phys. Rev. A 82, 012304 (2010)]. This protocol is meaningful when the alignment of Alice's and Bob's reference frames is not monitored and may vary with time. In this scenario, the notion of asymptotic key rate has hardly any operational meaning, because if one waits too long, the average correlations are smeared out and no security can be inferred. Therefore, finite-key analysis is necessary to find the maximal achievable secret key rate and the corresponding optimal number of signals.
Tarini, M, Pietroni, N, Cignoni, P, Panozzo, D & Puppo, E 2010, 'Practical quad mesh simplification.', Comput. Graph. Forum, vol. 29, pp. 407-418.
View/Download from: Publisher's site
Tsai, C-H, Liao, L-D, Luo, Y-S, Chao, PC-P, Chen, E-C, Meng, H-F, Chen, W-D, Lin, S-K & Lin, C-T 2010, 'Optimal design and fabrication of ITO/organic photonic crystals in polymer light-emitting diodes using a focused ion beam', Microelectronic Engineering, vol. 87, no. 5-8, pp. 1331-1335.
View/Download from: Publisher's site
Voinov, A & Bousquet, F 2010, 'Modelling with stakeholders', Environmental Modelling & Software, vol. 25, no. 11, pp. 1268-1281.
View/Download from: Publisher's site
Voinov, A & Cerco, C 2010, 'Model integration and the role of data', Environmental Modelling & Software, vol. 25, no. 8, pp. 965-969.
View/Download from: Publisher's site
Voinov, AA, DeLuca, C, Hood, RR, Peckham, S, Sherwood, CR & Syvitski, JPM 2010, 'A Community Approach to Earth Systems Modeling', Eos, Transactions American Geophysical Union, vol. 91, no. 13, pp. 117-118.
View/Download from: Publisher's site
View description>>
Earth science often deals with complex systems spanning multiple disciplines. These systems are best described by integrated models built with contributions from specialists of many backgrounds. But building integrated models can be difficult; modular and hierarchical approaches help to manage the increasing complexity of these modeling systems, but there is a need for framework and integration methods and standards to support modularity. Complex models require many data and generate lots of output, so software and standards are required for data handling, model output, data distribution services, and user interfaces. Complex modeling systems must be efficient to be useful, so they require contributions by software engineers to ensure efficient architectures, accurate numerics, and implementation on fast computers. Further, integrated model systems can be difficult to learn and use unless adequate documentation, training, and support are provided.
Wang, J, Zhu, S, Zhang, W & Lu, H 2010, 'Combined modeling for electric load forecasting with adaptive particle swarm optimization', Energy, vol. 35, no. 4, pp. 1671-1678.
View/Download from: Publisher's site
View description>>
Electric load forecasting is crucial for managing electric power systems economically and safely. This paper presents a new combined model for electric load forecasting based on the seasonal ARIMA forecasting model, the seasonal exponential smoothing mod
Yang, X, Lu, J & Zhang, G 2010, 'Adaptive pruning algorithm for least squares support vector machine classifier', Soft Computing, vol. 14, no. 7, pp. 667-680.
View/Download from: Publisher's site
View description>>
As a new version of support vector machine (SVM), least squares SVM (LS-SVM) involves equality instead of inequality constraints and works with a least squares cost function. A well-known drawback in LS-SVM applications is that the sparseness is lost. In this paper, we develop an adaptive pruning algorithm based on a bottom-to-top strategy, which can deal with this drawback. In the proposed algorithm, the incremental and decremental learning procedures are used alternately, and a small support vector set, which can cover most of the information in the training set, is formed adaptively. Using this set, one can construct the final classifier. In general, the number of elements in the support vector set is much smaller than that in the training set and a sparse solution is obtained. In order to test the efficiency of the proposed algorithm, we apply it to eight UCI datasets and one benchmarking dataset. The experimental results show that the presented algorithm can adaptively obtain sparse solutions while losing only a little generalization performance for classification problems with or without noise, and its training speed is much faster than the sequential minimal optimization (SMO) algorithm for large-scale classification problems without noise. © Springer-Verlag 2009.
Zhang, G & Lu, J 2010, 'Fuzzy bilevel programming with multiple objectives and cooperative multiple followers', Journal of Global Optimization, vol. 47, no. 3, pp. 403-419.
View/Download from: Publisher's site
View description>>
Classic bilevel programming deals with two-level hierarchical optimization problems in which the leader attempts to optimize his/her objective, subject to a set of constraints and his/her follower's solution. In modelling a real-world bilevel decision problem, some uncertain coefficients often appear in the objective functions and/or constraints of the leader and/or the follower. Also, the leader and the follower may have multiple conflicting objectives that should be optimized simultaneously. Furthermore, multiple followers may be involved in a decision problem and work cooperatively according to each of the possible decisions made by the leader, but with different objectives and/or constraints. Following our previous work, this study proposes a set of models to describe such fuzzy multi-objective, multi-follower (cooperative) bilevel programming problems. We then develop an approximation Kth-best algorithm to solve the problems.
Zhang, T, Zhang, G, Ma, J & Lu, J 2010, 'Power Distribution System Planning Evaluation by a Fuzzy Multi-Criteria Group Decision Support System', International Journal of Computational Intelligence Systems, vol. 3, no. 4, pp. 474-485.
View description>>
The evaluation of solutions is an important phase in power distribution system planning (PDSP) which allows issues such as quality of supply, cost, social service and environmental implications to be considered and usually involves the judgments of a group of experts. The planning problem is thus suitable for the multi-criteria group decision-making (MCGDM) method. The evaluation process and evaluation criteria often involve uncertainties incorporated in quantitative analysis with crisp values and qualitative judgments with linguistic terms; therefore, fuzzy sets techniques are applied in this study. This paper proposes a fuzzy multi-criteria group decision-making (FMCGDM) method for PDSP evaluation and applies a fuzzy multi-criteria group decision support system (FMCGDSS) to support the evaluation task. We introduce a PDSP evaluation model, which has evaluation criteria within three levels, based on the characteristics of a power distribution system. A case-based example is performed on a test distribution network and demonstrates how all the problems in a PDSP evaluation are addressed using FMCGDSS. The results are acceptable to expert evaluators.
Zong, Y, Li, M-C, Xu, G-D & Zhang, Y-C 2010, 'High Dimensional Clustering Algorithm Based on Local Significant Units', Journal of Electronics & Information Technology, vol. 32, no. 11, pp. 2707-2712.
View/Download from: Publisher's site
View description>>
High dimensional clustering algorithms based on equal or random width density grids cannot guarantee high quality clustering results in complicated data sets. In this paper, a High dimensional Clustering algorithm based on Local Significant Unit (HC_LSU) is proposed to deal with this problem, based on kernel estimation and spatial statistical theory. Firstly, a structure, namely the Local Significant Unit (LSU), is introduced by local kernel density estimation and a spatial statistical test; secondly, a greedy algorithm named Greedy Algorithm for LSU (GA_LSU) is proposed to quickly find the local significant units in the data set; and finally, the single-linkage algorithm is run on the local significant units with the same attribute subset to generate the clustering results. Experimental results on 4 synthetic and 6 real-world data sets showed that the proposed high-dimensional clustering algorithm, HC_LSU, could effectively find high quality clustering results in highly complicated data sets.
Zong, Y, Xu, G, Zhang, Y, Jiang, H & Li, M 2010, 'A robust iterative refinement clustering algorithm with smoothing search space', Knowledge-Based Systems, vol. 23, no. 5, pp. 389-396.
View/Download from: Publisher's site
View description>>
Iterative refinement clustering algorithms are widely used in the data mining area, but they are sensitive to initialization. In past decades, many modified initialization methods have been proposed to reduce the influence of the initialization sensitivity problem. The essence of iterative refinement clustering algorithms is local search. The large number of local minimum points embedded in the search space makes the local search problem hard and sensitive to initialization. The fewer the local minimum points, the more robust a local search algorithm is to its initialization. In this paper, we propose a Top-Down Clustering algorithm with Smoothing Search Space (TDCS3) to reduce the influence of initialization. The main steps of TDCS3 are to: (1) dynamically reconstruct a series of smoothed search spaces into a hierarchical structure by 'filling' the local minimum points; (2) at the top level of the hierarchical structure, run an existing iterative refinement clustering algorithm with random initialization to generate the clustering result; (3) from the second level to the bottom level of the hierarchical structure, run the same clustering algorithm with the initialization derived from the previous clustering result. Experimental results on 3 synthetic and 10 real-world data sets have shown that TDCS3 has significant effects on finding better, more robust clustering results and reducing the impact of initialization.
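The initialization sensitivity that motivates TDCS3 is easy to reproduce with a standard iterative refinement algorithm such as k-means. The sketch below (a generic scikit-learn illustration, not the authors' code; all parameter values are assumptions) runs k-means from many random starting points and compares the local minima reached:

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Toy data with 5 well-separated clusters
X, _ = make_blobs(n_samples=500, centers=5, cluster_std=0.5, random_state=0)

# Run plain k-means (an iterative refinement algorithm) once per random
# initialization and record the final objective value (inertia) of each run.
inertias = [
    KMeans(n_clusters=5, init="random", n_init=1, random_state=seed).fit(X).inertia_
    for seed in range(20)
]

# The spread between the best and worst final objective shows how strongly
# the result of local search depends on the starting point.
print(min(inertias), max(inertias))
```

Smoothing-based approaches like TDCS3 aim to shrink that spread by reducing the number of local minima the search can fall into.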
Al-Hassan, MW, Lu, H & Lu, J 1970, 'A FRAMEWORK FOR DELIVERING PERSONALIZED E-GOVERNMENT TOURISM SERVICES', Proceedings of the 6th International Conference on Web Information Systems and Technology, 6th International Conference on Web Information Systems and Technologies, SciTePress - Science and Technology Publications, Valencia, Spain, pp. 263-270.
View/Download from: Publisher's site
View description>>
E-government (e-Gov) has become one of the most important parts of government strategies. Significant efforts have been devoted to e-Gov tourism services in many countries because tourism is one of the major profitable industries. However, current e-Gov tourism services are limited to simple online presentation of tourism information. Intelligent e-Gov tourism services, such as personalized e-Gov (Pe-Gov) tourism services, are highly desirable for helping users decide 'where to go, and what to do/see' amongst a massive number of destinations and an enormous range of attractions and activities. This paper proposes a framework for Pe-Gov tourism services using recommender system techniques and semantic ontology. This framework has the potential to enable tourism information seekers to locate the most interesting destinations with the most suitable activities with the least search effort. Its workflow and some outstanding features are depicted with an example.
Alqahtani, A, Lu, H & Lu, J 1970, 'Towards Semantic-Aware and Ontology-Based e-Government Service Integration - An Applicative Case Study of Saudi Arabia's King Abdullah Scholarship Program', ADVANCES IN INTELLIGENT DECISION TECHNOLOGIES, The Second KES International Symposium IDT, Springer-Verlag, Baltimore, USA, pp. 403-411.
View/Download from: Publisher's site
View description>>
By enabling access to services across different government agencies through one portal, service integration improves the quality of e-government services and plays a key role in e-government development. This paper proposes a conceptual framework of ontology-based e-government service integration, using Saudi Arabia's King Abdullah Scholarship Program (SAKASP) as a case study. SAKASP is a multi-domain program in which students must collect information from various Ministries to complete applications and the administering authority must verify the information supplied by the Ministries. The current implementation of SAKASP is clumsy because it is a mixture of online submission and manual collection and verification of information; its time-consuming and tedious procedures are inconvenient for the applicants and inefficient for the administrators. The proposed framework provides an integrated service by employing semantic web service (SWS) and ontology, improving the current implementation of SAKASP by automatically collecting and processing the related information for a given application. The article includes a typical scenario that demonstrates the workflow of the framework. This framework is applicable to other multi-domain e-government services.
Anaissi, A, Kennedy, PJ & Goyal, M 1970, 'A framework for high dimensional data reduction in the microarray domain', 2010 IEEE Fifth International Conference on Bio-Inspired Computing: Theories and Applications (BIC-TA), 2010 IEEE Fifth International Conference on Bio-Inspired Computing: Theories and Applications (BIC-TA), IEEE, Changsha, China, pp. 903-907.
View/Download from: Publisher's site
View description>>
Microarray analysis and visualization is very helpful for biologists and clinicians to understand gene expression in cells and to facilitate diagnosis and treatment of patients. However, a typical microarray dataset has thousands of features and a very small number of observations. This very high dimensional data carries a massive amount of information, which often contains noise, non-useful information and only a small number of features relevant to the disease or genotype. This paper proposes a framework for very high dimensional data reduction based on three technologies: feature selection, linear dimensionality reduction and non-linear dimensionality reduction. In this paper, feature selection based on mutual information is proposed for filtering features and selecting the most relevant features with minimum redundancy. A kernel linear dimensionality reduction method is also used to extract the latent variables from a high dimensional data set. In addition, a non-linear dimensionality reduction method based on local linear embedding is used to reduce the dimension and visualize the data. Experimental results are presented to show the outputs of each step and the efficiency of this framework.
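The three-stage pipeline this abstract describes (mutual-information feature selection, kernel-based reduction, then locally linear embedding for visualization) can be sketched with scikit-learn; the toy data and all parameter values below are illustrative assumptions, not the paper's settings:

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.decomposition import KernelPCA
from sklearn.manifold import LocallyLinearEmbedding

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))   # 60 samples, 500 features (toy stand-in for microarray data)
y = rng.integers(0, 2, size=60)  # binary phenotype labels

# Stage 1: keep the features most informative about the label (mutual information)
X_sel = SelectKBest(mutual_info_classif, k=50).fit_transform(X, y)

# Stage 2: kernel-based reduction of the selected features to latent variables
X_kpca = KernelPCA(n_components=10, kernel="rbf").fit_transform(X_sel)

# Stage 3: non-linear embedding down to 2-D for visualization
X_2d = LocallyLinearEmbedding(n_components=2, n_neighbors=8).fit_transform(X_kpca)
print(X_2d.shape)
```

A real microarray study would replace the random arrays with expression data and tune `k`, the kernel, and the neighbourhood size by cross-validation.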
Azadeh, A, Neshat, N & Saberi, M 1970, 'An intelligent approach for improved predictive control of spray drying process', 2010 IEEE 14th International Conference on Intelligent Engineering Systems, 2010 IEEE 14th International Conference on Intelligent Engineering Systems, IEEE, pp. 127-136.
View/Download from: Publisher's site
View description>>
A flexible meta-modelling approach is presented for predictive control of a drying process using an Adaptive Neuro-Fuzzy Inference System (ANFIS), Artificial Neural Network (ANN) and Partial Least Squares (PLS) analysis. In the proposed approach, PLS analysis is used to pre-process actual data and to provide the necessary background to apply the ANN and ANFIS approaches. A substantial section of this study is devoted to modelling aimed at predicting the granule particle size, executed with ANFIS and ANN. ANNs hold the promise of producing non-linear models, working under noisy conditions and tolerating the loss of neurons or connections. The ANFIS approach combines the advantages of fuzzy systems and artificial neural networks in its architecture and is capable of dealing with both limitations and complexity in the data set. The efficiencies of the ANFIS and ANN approaches in prediction are compared and the superior approach is selected. Finally, by deploying the preferred approach, several scenarios are presented to estimate the predictive control of spray drying as an accurate, fast-running and inexpensive tool. This is the first study that presents a flexible intelligent approach for predictive control of a drying process by ANN, ANFIS and PLS. The approach of this study may be easily applied to other drying processes. © 2010 IEEE.
Babar, A, Zowghi, D & Chew, E 1970, 'Using goals to model strategy map for business IT alignment', CEUR Workshop Proceedings, International Workshop on Business/IT Alignment and Interoperability, CEUR-WS.org/WorldPress, Hammamet, Tunisia, pp. 16-30.
View description>>
Strategy Map (SM) is one of the most widely used methods to create a business-aligned IT strategy map, providing valuable insights to business executives. However, the strategy map method is not easy to use and lends itself to various interpretations, because the linkages between the strategic objectives in the four strategy map perspectives are not explicit, which makes the SM ambiguous. Goal modelling approaches from Requirements Engineering (RE) have proven rigorous in the elicitation and representation of information system requirements. To make the causal relationships of SM linkages explicit and meaningful, this research proposes the use of the goal modelling approach i*.
Bakker, S, van den Berg, R, Pijnappel, S & van den Hoven, E 1970, 'Sounds Like Home: Sonification and Physical Interaction in the Periphery and Center of the Attention', Proceedings of ISon 2010 - Interactive Sonification Workshop: Human Interaction with Auditory Displays, ISon 2010 3rd International Interactive Sonification Workshop, KTH School of Computer Science and Communication (CSC), Stockholm, Sweden, pp. 55-58.
View description>>
Our auditory perception skills enable us to selectively place one auditory channel in the center of our attention and simultaneously monitor others in the periphery of our attention. In this paper, we present and discuss two design cases that explore the design of physical interactive systems that leverage this perception skill to unobtrusively communicate relevant information. Sounds are mechanically generated by these systems, which strengthens the coupling between signification and physical interface. Both resulting designs are aimed to be used in a home environment.
Bakker, S, van den Hoven, E & Eggen, B 1970, 'Exploring Interactive Systems Using Peripheral Sounds', HAPTIC AND AUDIO INTERACTION DESIGN, The 5th International Workshop on Haptic and Audio Interaction Design, Springer-Verlag, Copenhagen, Denmark, pp. 55-64.
View description>>
Our everyday interaction in and with the physical world has facilitated the development of auditory perception skills that enable us to selectively place one auditory channel in the center of our attention and simultaneously monitor others in the periphery. We search for ways to leverage these auditory perception skills in interactive systems. In this paper, we present three working demonstrators that use sound to subtly convey information to users in an open office. To qualitatively evaluate these demonstrators, each of them has been implemented in an office for three weeks. We have seen that over such a period of time, sounds can start shifting from the center to the periphery of the attention. Furthermore, we found several issues to be addressed when designing such systems, which can inform future work in this area.
Behbood, V, Lu, J & Zhang, G 1970, 'Adaptive inference-based learning and rule generation algorithms in Fuzzy Neural Network for failure prediction', 2010 IEEE International Conference on Intelligent Systems and Knowledge Engineering, 2010 IEEE International Conference on Intelligent Systems and Knowledge Engineering (ISKE), IEEE, China, pp. 33-38.
View/Download from: Publisher's site
View description>>
Failure prediction is highly desirable for decision makers and regulators in the finance industry. This study develops a new Failure Prediction (FP) approach which effectively integrates a fuzzy logic-based adaptive inference system with the learning ability of a neural network to generate knowledge in the form of a fuzzy rule base. This FP approach uses a preprocessing phase to deal with the imbalanced data-sets problem and develops a new Fuzzy Neural Network (FNN) including an adaptive inference system in the learning algorithm, along with its network structure and rule generation algorithm, as a means to reduce prediction error in the FP approach.
Behbood, V, Lu, J & Zhang, G 1970, 'Intelligent financial warning support system', International Conference on Applied Statistics and Financial Mathematics, International Conference on Applied Statistics and Financial Mathematics, IOS Press, Hong Kong.
Behrouznia, A, Saberi, M, Azadeh, A, Asadzadeh, SM & Pazhoheshfar, P 1970, 'An adaptive network based fuzzy inference system-fuzzy data envelopment analysis for gas consumption forecasting and analysis: The case of South America', 2010 International Conference on Intelligent and Advanced Systems, 2010 International Conference on Intelligent and Advanced Systems (ICIAS 2010), IEEE.
View/Download from: Publisher's site
View description>>
This paper presents an adaptive-network-based fuzzy inference system (ANFIS)-fuzzy data envelopment analysis (FDEA) approach for long-term natural gas (NG) consumption forecasting and analysis. Six models are proposed to forecast annual NG demand. 104 ANFIS models have been constructed and tested in order to find the best ANFIS for NG consumption. Two parameters have been considered in the construction and examination of plausible ANFIS models. Six different membership functions and several linguistic variables are considered in building the ANFIS. The proposed models consist of two input variables, namely Gross Domestic Product (GDP) and Population. All trained ANFIS models are then compared with respect to mean absolute percentage error (MAPE). To obtain the best performance of the intelligence-based approaches, data are pre-processed (scaled) and the outputs are post-processed (returned to their original scale). FDEA is used to examine the behavior of gas consumption. To show the applicability and superiority of the ANFIS-FDEA approach, actual gas consumption in six South American countries from 1980 to 2007 is considered. Gas consumption is then forecasted up to 2015. The ANFIS-FDEA approach is capable of dealing with both complexity and uncertainty, and offers several other unique features discussed in this paper.
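The candidate models in this abstract are ranked by mean absolute percentage error (MAPE). As a reference for that comparison metric (a generic implementation, not the authors' code), MAPE can be computed as:

```python
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    # Average of per-point relative errors, scaled to percent
    return float(np.mean(np.abs((actual - forecast) / actual)) * 100)

# Relative errors of 10%, 5% and 0% average to 5%
print(round(mape([100, 200, 400], [110, 190, 400]), 6))  # ≈ 5.0
```

Note that MAPE is undefined when any actual value is zero, which is why demand series such as annual gas consumption suit it well.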
Berglund, A & Lister, R 1970, 'Introductory Programming and the Didactic Triangle', Conferences in Research and Practice in Information Technology Series, Australasian Computing Education Conference, Australian Computer Society, Brisbane, Australia, pp. 35-44.
View description>>
In this paper, we use Kansanen's didactic triangle to structure and analyse research on the teaching and learning of programming. Students, teachers and course content are the three entities that form the corners of the didactic triangle. The edges of the triangle represent the relationships between these three entities. We argue that many computing educators and computing education researchers operate from within narrow views of the didactic triangle. For example, computing educators often teach programming based on how they relate to the computer, and not how the students relate to the computer. We conclude that, while research that focuses on the corners of the didactic triangle is sometimes appropriate, there needs to be more research that focuses on the edges of the triangle, and more research that studies the entire didactic triangle. © 2010, Australian Computer Society, Inc.
Bian, W, Li, J & Tao, D 1970, 'Feature Extraction for fMRI-Based Human Brain Activity Recognition.', MLMI, International Workshop on Machine Learning in Medical Imaging, Springer, Beijing, China, pp. 148-156.
View/Download from: Publisher's site
View description>>
Mitchell et al. [9] demonstrated that support vector machines (SVM) are effective for classifying the cognitive state of a human subject based on fMRI images observed over a single time interval. However, the direct use of classifiers on active voxels veils
Blount, M, McGregor, C, James, A, Sow, D, Kamaleswaran, R, Tuuha, S, Percival, J & Percival, N 1970, 'On the integration of an artifact system and a real-time healthcare analytics system', Proceedings of the 1st ACM International Health Informatics Symposium, IHI '10: ACM International Health Informatics Symposium, ACM, pp. 647-655.
View/Download from: Publisher's site
View description>>
As a result of advances in software technology, particularly stream computing, it is now possible to implement scalable systems capable of real-time analysis of multiple physiological data streams of multiple patients. There is a growing body of evidence showing that early onset indicators of some medical conditions can be observed as subtle changes in the physiological data streams of affected patients. These real-time healthcare analytics systems can detect the early onset indicators and thus may result in earlier detection of the medical condition, which may lead to earlier intervention and improved patient outcomes. Blood draws and nasal suctioning can cause changes in the values of some physiological data stream elements. Such events, sometimes referred to as physiological stream artifacts, can cause the real-time analytics systems to generate false alarms, since the systems assume each data element is indicative of the patient's underlying physiological condition. In order to minimize the generation of false alarms, artifact events must be captured and integrated in real time with the analytics result. We present the summary of an artifact study in a tertiary neonatal intensive care unit within a children's hospital where a real-time analytics system is being piloted as part of a clinical research study. We utilize the information gathered relating to the nature of these events and propose a framework to integrate the artifact events with the analytic results in real time. © 2010 ACM.
Bogg, P, Low, G, Henderson-Sellers, B & Beydoun, G 1970, 'Work Product-driven Software Development Methodology Improvement.', ICSOFT (2), International Conference on Software and Data Technologies, SciTePress, Athens, Greece, pp. 5-13.
View description>>
A work product is a tangible artifact used during a software development project; for example, a requirements specification or class model diagram. Towards a general approach for evaluating and potentially improving the quality of methodologies, this paper proposes utilizing a work product-based approach to method construction, known as the 'work product pool' approach to situational method engineering, to accomplish this quality improvement. Starting from the final software application and identifying work product pre-requisites by working backwards through the methodology process, work product inter-dependencies are revealed. Using method fragments from a specific methodology (here, MOBMAS), we use this backward chaining approach to effectively recreate that methodology. Evaluation of the artificially recreated methodology allows the identification of missing and/or extraneous method elements and where process steps could be improved.
Boustedt, J, McCartney, R, Tenenberg, J, Gehringer, EF, Lister, R & Musicant, D 1970, 'It seemed like a good idea at the time', Proceedings of the 41st ACM technical symposium on Computer science education, SIGCSE10: The 41st ACM Technical Symposium on Computer Science Education, ACM, pp. 558-559.
View/Download from: Publisher's site
View description>>
We often learn of successful pedagogical experiments, but we seldom hear of the ones that failed. For this special session we solicited submissions from the SIGCSE membership, selected the best from among these, and will have presentations at the session by the selected authors. Our contributions describe pedagogical approaches that seemed to be good ideas but turned out as failures. Contributors will describe their pedagogical experiment, the rationale for the experiment, evidence of failure, and lessons learned.
Bozzo, A, Panozzo, D, Puppo, E, Pietroni, N & Rocca, L 1970, 'Adaptive Quad Mesh Simplification.', Eurographics Italian Chapter Conference, Eurographics, pp. 95-102.
Bremner, MJ 1970, 'Classical simulation of commuting quantum computations implies collapse of the polynomial hierarchy', 10th Asian Conference on Quantum Information Science, Tokyo, Japan.
View description>>
The 10th Asian Conference on Quantum Information Science (AQIS'10) is a meeting focused on quantum information science and technology. Since this is a new interdisciplinary field, its broad scope includes advances in various fields such as quantum physics, computer science, mathematics and information technologies. This event is the memorable tenth conference which builds upon a successful series of EQIS'01-05 and AQIS'06-09 conferences. It will comprise tutorials and presentations of invited speakers, selected papers and posters. Areas of coverage include but are not limited to: Quantum computation, algorithms and complexity Quantum information theory Quantum error-correction and fault-tolerance, thresholds Quantum cryptography Quantum communications experiments and theory Quantum optics, NMR and solid-state technologies Quantum processors and computers design Quantum programming languages and semantics AQIS'10 will be held at the University of Tokyo, Tokyo, Japan.
Bródka, P, Musial, K & Kazienko, P 1970, 'A Method for Group Extraction in Complex Social Networks', Communications in Computer and Information Science, Springer Berlin Heidelberg, pp. 238-247.
View/Download from: Publisher's site
View description>>
The extraction of social groups from the social networks existing among employees in a company, its customers or users of various computer systems has become one of the research areas of growing importance. Once we have discovered the groups, we can utilise them in different kinds of recommender systems or in the analysis of the team structure and communication within a given population. The shortcomings of the existing methods for community discovery and the lack of their applicability in multi-layered social networks were the inspiration to create a new group extraction method for complex multi-layered social networks. The main idea behind this new concept is to utilise a modified version of a measure the authors call the multi-layered clustering coefficient. © 2010 Springer-Verlag.
Catley, C, Smith, K, McGregor, C, James, A & Eklund, JM 1970, 'A framework to model and translate clinical rules to support complex real-time analysis of physiological and clinical data', Proceedings of the 1st ACM International Health Informatics Symposium, IHI '10: ACM International Health Informatics Symposium, ACM, pp. 307-315.
View/Download from: Publisher's site
View description>>
We present a framework to model and translate clinical rules to support complex real-time analysis of both synchronous physiological data and asynchronous clinical data. The framework is demonstrated through a case study in a neonatal intensive care context showing how a clinical rule for detecting an apnoeic event is modeled across multiple physiological data streams in the Artemis environment, which employs IBM's InfoSphere Streams middleware to support real-time stream processing. Initial clinical hypotheses for apnoea detection are modeled using UML activity diagrams which are subsequently translated into Streams' SPADE code to be deployed in Artemis to deliver real-time decision support. Our aim is to provide a Clinical Decision Support System capable of identifying and detecting patterns in physiological data streams indicative of the onset of clinically significant conditions that may adversely affect health outcomes. Benefits associated with our approach include: 1) reduced time and effort on the clinician's part to assess health data from multiple sources; 2) the ability to allow clinicians to control the rules-engine of Artemis to enhance clinical care within their unique environments; 3) the ability to apply clinical alerts to both synchronous and asynchronous data; and 4) the ability to continuously process data in real-time. © 2010 ACM.
Chuang, C-H, Lai, P-C, Ko, L-W, Kuo, B-C & Lin, C-T 1970, 'Driver's cognitive state classification toward brain computer interface via using a generalized and supervised technology', The 2010 International Joint Conference on Neural Networks (IJCNN), 2010 International Joint Conference on Neural Networks (IJCNN), IEEE, Barcelona, SPAIN.
View/Download from: Publisher's site
Cleland-Huang, J & Zowghi, D 1970, 'Message from the Chairs', 2010 18th IEEE International Requirements Engineering Conference, 2010 IEEE 18th International Conference on Requirements Engineering (RE), IEEE.
View/Download from: Publisher's site
Dovey, K & Mooney, G 1970, 'The Social Dynamics of Generating and Leveraging Intellectual Capital for Innovation', PROCEEDINGS OF THE 2ND EUROPEAN CONFERENCE ON INTELLECTUAL CAPITAL, 2nd European Conference on Intellectual Capital, ACAD CONFERENCES LTD, Lisbon, PORTUGAL, pp. 225-231.
View description>>
This paper explores the factors influencing an enterprise's ability to generate and deploy intellectual capital in support of its strategic intent to innovate. Drawing on two research projects, we focus upon the leadership practices that enable an enterprise to innovatively leverage the intellectual capital that is potentially available to it. One project, using a phenomenological methodology, explores, at a high level, the social dynamics within twenty-five medium-sized enterprises in Sydney, Australia, noted for their innovative capabilities. The other project explores in finer detail, through an action research methodology, the transformation of stakeholder relationships within another medium-sized Sydney enterprise that has become highly innovative over the past five years. Our findings show that the most important forms of intangible capital for innovation are relationship-based and are leveraged through stakeholder collaboration.
Erfani, SZ, Mojtahedi, SMH & Mousavi, SM 1970, 'Evaluating and implementing of Knowledge Management in the Mobile Telephone Switching Office', 2010 IEEE International Conference on Management of Innovation & Technology, 2010 IEEE International Conference on Management of Innovation & Technology, IEEE, pp. 904-909.
View/Download from: Publisher's site
View description>>
This research seeks to explore the current Knowledge Management cycle practice in the Mobile Telephone Switching Office, including knowledge creation, knowledge organization, knowledge sharing and knowledge leverage. The Mobile Telephone Switching Office is located in the Sistan and Baluchistan province in the southeast of Iran. We have presented the proposed model using the knowledge management concept and taking advantage of the EFQM excellence model and the Deming cycle, with the purpose of continuous improvement. This paper suggests that proper KM implementation is a managerial approach that can turn an organization into an agile one. In order to verify and validate the performed research, the planned model was implemented in the Mobile Telephone Switching Office; positive and acceptable results were obtained and an increase in the organization's total factor productivity was achieved, which was appreciated by the organization. © 2010 IEEE.
Gaddis, E, Adams, C & Voinov, A 1970, 'Effective engagement of stakeholders in Total Maximum Daily Load development and implementation', Modelling for Environment's Sake: Proceedings of the 5th Biennial Conference of the International Environmental Modelling and Software Society, iEMSs 2010, pp. 530-538.
View description>>
Total Maximum Daily Loads (TMDLs) identify the maximum amount of pollution that a water body can receive while still supporting its designated uses, and allocate the maximum load to specific sources in the watershed. In the United States, the Clean Water Act requires public participation in the process of TMDL development. This requirement has been met through simple presentation of results at public meetings, strategic partnerships with key stakeholders, and/or advisory committees in which stakeholders participate in critical decisions about TMDL definition and implementation. These decisions include model selection and assumptions, selection of water quality endpoints, load allocations, TMDL review, and implementation planning. In this article, we discuss the benefits and challenges of early and targeted engagement of stakeholders in TMDL development through a participatory modelling process, based on our experience in Utah and Vermont.
Goldie, J, McGregor, C & Murphy, B 1970, 'Determining levels of arousal using electrocardiography: A study of HRV during transcranial magnetic stimulation', 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology, 2010 32nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC 2010), IEEE, Buenos Aires, ARGENTINA, pp. 1198-1201.
View/Download from: Publisher's site
Gong, L, Xu, Y, Liu, B, Gui, L, Rong, B, Wu, Y & Zhang, W 1970, 'A Modified Belief Propagation Algorithm Based on Attenuation of the Extrinsic LLR', 2010 IEEE 72nd Vehicular Technology Conference - Fall, 2010 IEEE Vehicular Technology Conference (VTC 2010-Fall), IEEE, Ottawa, CANADA.
View/Download from: Publisher's site
Grant, S, Dyson, LE & Robertson, T 1970, 'A participatory approach to the inclusion of indigenous Australians in information technology', Proceedings of the 11th Biennial Participatory Design Conference, PDC '10: The 11th Biennial Participatory Design Conference, ACM, University of Technology Sydney, Australia, pp. 207-210.
View/Download from: Publisher's site
View description>>
Improving Indigenous access to university education has been a major focus in Australia over the last four decades. However, despite success in several areas of recognised priority to the Indigenous community, participation in Information Technology (IT) degree programs remained very low throughout the 1980s and '90s. The University of Technology, Sydney began a project to address this very issue in 2001. The Indigenous Participation in IT Project was initiated by the Faculty of Information Technology in collaboration with Indigenous Australians and members of staff of the Faculty. This project culminated in the design of a participatory IT program that has successfully seen the numbers of Indigenous students and staff in the Faculty increase. A number of factors were identified as contributing to this success. These included an improvement to recruitment processes, the building of a personalised approach to student support and the growing acceptance of the program as part of the academic culture of the faculty. Additionally, of great importance has been the development of the program as a collaboration between Indigenous staff and students and non-Indigenous staff at all levels of decision making and implementation.
Gunsel, A & Cetindamar, D 2010, 'Technology Audit: An Empirical Study on SMEs of Istanbul', PROCEEDINGS OF THE 5TH EUROPEAN CONFERENCE ON INNOVATION AND ENTREPRENEURSHIP, 5th European Conference on Entrepreneurship and Innovation, ACAD CONFERENCES LTD, Natl & Kapodistrian Univ Athens, Athens, GREECE, pp. 263-272.
Guo, Y, Zhu, J, Wang, Y, Lu, H & Lin, Z 2010, 'Performance Analysis of a Permanent Magnet Claw Pole SMC Motor with a Nonlinear Inductance Model', Proceedings of Asia-Pacific Symposium on Applied Electromagnetics and Mechanics (APSAEM2010), Japan Society of Applied Electromagnetics and Mechanics, Kuala Lumpur, Malaysia, pp. 348-351.
Henderson-Sellers, B & Gonzalez-Perez, C 2010, 'Granularity in Conceptual Modelling: Application to Metamodels', CONCEPTUAL MODELING - ER 2010, International Conference on Conceptual Modelling, Springer-Verlag Berlin Heidelberg, Vancouver, Canada, pp. 219-232.
View/Download from: Publisher's site
View description>>
The granularity of conceptual models depends at least in part on the granularity of their underpinning metamodel. Here we investigate the theory of granularity as it can be applied to conceptual modelling and, especially, metamodelling for information systems development methodologies. With a background context of situational method engineering, this paper applies some theoretical works on granularity to the study of current metamodelling approaches. It also establishes some granularity-related best practices to take into account when adopting a metamodel, especially for its future use in developing method fragments for situational method engineering. Using these best practices will result in components of better quality and, consequently, better conceptual models and methodologies.
Hijikata, Y & Xu, G 2010, 'SNSMW 2010 Workshop Organizers’ Message', Database Systems For Advanced Applications, 15th International Conference on DASFAA 2010, Springer Berlin Heidelberg, Tsukuba, JAPAN, pp. 239-239.
View/Download from: Publisher's site
I-Jan Wang, Lun-De Liao, Yu-Te Wang, Chi-Yu Chen, Bor-Shyh Lin, Shao-Wei Lu & Chin-Teng Lin 2010, 'A Wearable Mobile Electrocardiogram measurement device with novel dry polymer-based electrodes', TENCON 2010 - 2010 IEEE Region 10 Conference, 2010 IEEE Region 10 Conference (TENCON 2010), IEEE, Fukuoka, JAPAN, pp. 379-384.
View/Download from: Publisher's site
Jansen, M, Qiao, Y & Sarma, J 2010, 'Deterministic Black-Box Identity Testing $π$-Ordered Algebraic Branching Programs', Leibniz International Proceedings in Informatics, LIPIcs, International Conference on Foundations of Software Technology and Theoretical Computer Science, Dagstuhl Publishing, Chennai, India, pp. 296-307.
View/Download from: Publisher's site
View description>>
In this paper we study algebraic branching programs (ABPs) with restrictions on the order and the number of reads of variables in the program. Given a permutation $\pi$ of $n$ variables, for a $\pi$-ordered ABP ($\pi$-OABP), for any directed path $p$ from source to sink, a variable can appear at most once on $p$, and the order in which variables appear on $p$ must respect $\pi$. An ABP $A$ is said to be of read $r$, if any variable appears at most $r$ times in $A$. Our main result pertains to the identity testing problem. Over any field $F$ and in the black-box model, i.e. given only query access to the polynomial, we have the following result: read $r$ $\pi$-OABP computable polynomials can be tested in $\DTIME[2^{O(r\log r \cdot \log^2 n \log\log n)}]$. Our next set of results investigates the computational limitations of OABPs. It is shown that any OABP computing the determinant or permanent requires size $\Omega(2^n/n)$ and read $\Omega(2^n/n^2)$. We give a multilinear polynomial $p$ in $2n+1$ variables over some specifically selected field $G$, such that any OABP computing $p$ must read some variable at least $2^n$ times. We show that the elementary symmetric polynomial of degree $r$ in $n$ variables can be computed by a size $O(rn)$ read $r$ OABP, but not by a read $(r-1)$ OABP, for any $0 < 2r-1 \leq n$. Finally, we give an example of a polynomial $p$ and two variable orders $\pi \neq \pi'$, such that $p$ can be computed by a read-once $\pi$-OABP, but where any $\pi'$-OABP computing $p$ must read some variable at least $2^n$ times.
Johnston, AJ & Humberstone, J 2010, 'Elective Music Students' Experiences with Jam2Jam', 7th Australian Conference on Interactive Entertainment, Australian Conference on Interactive Entertainment, Massey University, College of Creative Arts, Institute of Communication Design, Wellington, New Zealand, pp. 8-15.
View description>>
This paper presents findings from a trial of the interactive music software Jam2Jam in a classroom music setting. Jam2Jam is software which allows musical novices to control generative music in real time. It has an interface which enables users to control multiple audio-visual parameters with a single gesture, an approach intended to facilitate complex, conversational interaction. Examination of students' experiences with Jam2Jam indicates that students find Jam2Jam attractive and that it has considerable potential. However, a number of issues for improvement, particularly a need for increased transparency of operation, are identified. Extensions to Jam2Jam which would enable students to incorporate more of their own material into the music and visuals they create during jam sessions are also proposed.
Johnston, AJ & Johnson, C 2010, 'Extreme Programming in the University', Proceedings of Annual International Conference on Computer Science Education: Innovation and Technology (CSEIT 2010), Annual International Conference on Computer Science Education: Innovation and Technology, Global Science and Technology Forum, Phuket, Thailand, pp. 3-8.
View description>>
This paper summarises our experiences teaching Extreme Programming to undergraduate students over a period of 8 years. We describe an approach in which students learn about the Extreme Programming (XP) method by using it on real software development projects. This experiential learning technique has been effective in helping students understand how XP works in practice and has helped them to develop the skills to reflect on their current approaches to software development and critically evaluate agile methods. Problems, including a steep learning curve for some XP practices and difficulties scheduling pair-programming time in a university environment, are also identified.
Johnston, AJ, Beilharz, KA, Chen, Y & Ferguson, S 2010, 'Proceedings of the 2010 Conference on New Interfaces for Musical Expression (NIME 2010)', Proceedings of the 2010 Conference on New Interfaces for Musical Expression (NIME 2010), University of Technology Sydney, Sydney, Australia.
Juszczyszyn, K, Musial, A, Musial, K & Brodka, P 2010, 'Utilizing Dynamic Molecular Modelling Technique for Predicting Changes in Complex Social Networks', 2010 IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology, 2010 IEEE/ACM International Conference on Web Intelligence-Intelligent Agent Technology (WI-IAT), IEEE, pp. 1-4.
View/Download from: Publisher's site
View description>>
We present a method that utilises a dynamic molecular modelling technique to track the changes within a complex social network. The users forming a social network are interpreted as large sets of interacting particles. The data for the conducted research was obtained from e-mail communication within the Enron company. The social network of employees was extracted and used to evaluate the methodology of social network dynamics modelling. © 2010 IEEE.
Kaleem, Z, Lee, CK, Saqib, M, Mohsin, S & Salim, F 2010, 'The way towards amplifier design using CAD (ADS) tool', International Conference on Advanced Communication Technology, ICACT, pp. 962-967.
View description>>
This paper mainly focuses on the design and operational mechanism of the ADS tool. The ADS tool is characterized by reliable, efficient and controlled functioning as compared to conventional approaches. In this paper, we describe an ADS tool based interactive procedure that provides the students in electrical and computer engineering programs with an easy-to-use reference and overview of amplifier design. This multimedia-based system covers topics that start with introductory basic concepts in amplifier design and conclude with advanced and detailed concepts using the ADS tool.
Kamaleswaran, R, McGregor, C & Eklund, JM 2010, 'A method for clinical and physiological event stream processing', 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology, 2010 32nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC 2010), IEEE, Buenos Aires, ARGENTINA, pp. 1170-1173.
View/Download from: Publisher's site
Karimi, F & Poo, DCC 2010, 'IT investment decision making: A decision analysis cycle model', 16th Americas Conference on Information Systems 2010, AMCIS 2010, pp. 2348-2357.
View description>>
Organizations spend substantial capital budgets on IT investment in order to achieve potential competitive advantages. However, IT investments are characterized by complexity, uncertainty, and risk. These attributes make appropriate IT investment decision making a challenging task. The decision analysis cycle methodology is a conceptual framework that facilitates achieving clarity of action in difficult decision problems. In the present study, this methodology is explained and applied to an imaging system investment of a mortgage bank to investigate how it can tackle complexity, risk and uncertainty in IT investment decisions.
Kazienko, P, Brodka, P & Musial, K 2010, 'Individual Neighbourhood Exploration in Complex Multi-layered Social Network', 2010 IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology, 2010 IEEE/ACM International Conference on Web Intelligence-Intelligent Agent Technology (WI-IAT), IEEE, pp. 5-8.
View/Download from: Publisher's site
View description>>
Social networks can be extracted from different data about communication or common activities in organizations, companies or various Internet-based services. Different types of data processed may result in the creation of separate layers in the complex multi-layered social network. Analysis of neighbourhoods of network members and their utilization for social group discovery appears to be an interesting and important research domain. Since there is no measure to evaluate the structure of neighbourhoods in the multi-layered social network, a new measure called the cross-layered multi-layered clustering coefficient (CLMCC) is proposed in the paper. It enables analysis of the density of mutual connections between neighbours that occur in at least a given number of layers in a social network. Additionally, experimental studies on real-world data are presented. © 2010 IEEE.
Kazienko, P, Brodka, P, Musial, K & Gaworecki, J 2010, 'Multi-Layered Social Network Creation Based on Bibliographic Data', 2010 IEEE Second International Conference on Social Computing, 2010 IEEE Second International Conference on Social Computing (SocialCom), IEEE, pp. 407-412.
View/Download from: Publisher's site
View description>>
A method for extraction of the multi-layered social network based on the data about human collaborative achievements, in particular scientific papers, is presented in the paper. The objects linking people form a hierarchy, which is flattened in the pre-processing stage. Only one level of the hierarchy remains together with new activities moved from its other levels. Separate layers of the multi-layered social network are created based on these pre-processed activities. © 2010 IEEE.
Khoo, I-H, Reddy, HC, Van, L-D & Lin, C-T 2010, 'Generalized formulation of 2-D filter structures without global broadcast for VLSI implementation', 2010 53rd IEEE International Midwest Symposium on Circuits and Systems, 2010 53rd IEEE International Midwest Symposium on Circuits and Systems (MWSCAS), IEEE, Seattle, WA, pp. 426-429.
View/Download from: Publisher's site
Kocaballi, AB 2010, 'Wearable environments', Proceedings of the 28th Annual European Conference on Cognitive Ergonomics, ECCE '10: European Conference on Cognitive Ergonomics, ACM, pp. 315-318.
View/Download from: Publisher's site
View description>>
Motivation - The main motivation of this research is to gain a better understanding of dynamic agency between human, machine and environment relations mediated by a synthesis of wearable computing and smart environments technologies. Research approach - The study follows a research through design approach. There are two main stages of the study, involving a series of workshops with designed prototype systems with different configurations. The prototype systems are designed based on the idea of "Wearable Environments", combining the wearable computing and smart environments approaches to ubiquitous computing. The interactions between prototype systems and human participants are analysed from a post-phenomenological perspective. Findings/Design - The preliminary workshop study showed that the perception and interpretation of sonic and tactile feedback, and consequently the strategies of subjects, were highly dependent on the places where the wearable devices were attached. Research limitations/Implications - The study deals with only low-level cognitive actions and microperception shaping the machine-mediated human agency. Originality/Value - The research will clarify some critical dimensions and aspects of the complex phenomenon of agency in service of designing wearable environments by synthesizing the approaches of the fields of wearable computing and smart environments. Take away message - Wearable environments with enactive interfaces can provide unique opportunities for investigating and reconfiguring various forms of human-machine-environment relations.
Kocaballi, AB, Gemeinboeck, P & Saunders, R 2010, 'Enabling new forms of agency using wearable environments', Proceedings of the 8th ACM Conference on Designing Interactive Systems, DIS '10: Designing Interactive Systems Conference 2010, ACM, pp. 248-251.
View/Download from: Publisher's site
View description>>
Technological artefacts can mediate the relations between humans and the environment: mediation changes our agency, which can be defined as our capacity for action. There can be different types of technological mediation and each type shapes our agency differently. Our model of wearable environments, which combines wearable computing and smart environment approaches, is useful for exploring new types of relations and, by extension, new forms of agency. In this paper, we present the first stage of developing a wearable environment system involving a series of workshops using two prototype devices. We evaluated the workshop activities according to a post-phenomenological account: this has allowed us to analyse the transformation of machine-mediated agency vis-à-vis two dimensions: perception and praxis. Our findings showed that interpretations of sonic and tactile feedback were highly dependent upon the placement of the sensing and effecting capacities of the system. © 2010 ACM.
Li, C-H, Lin, C-T, Kuo, B-C & Chu, H-S 2010, 'An automatic method for selecting the parameter of the RBF kernel function to support vector machines', 2010 IEEE International Geoscience and Remote Sensing Symposium, IGARSS 2010 - 2010 IEEE International Geoscience and Remote Sensing Symposium, IEEE, Honolulu, HI, pp. 836-839.
View/Download from: Publisher's site
Li, C-H, Lin, C-T, Kuo, B-C & Ho, H-H 2010, 'An Automatic Method for Selecting the Parameter of the Normalized Kernel Function to Support Vector Machines', 2010 International Conference on Technologies and Applications of Artificial Intelligence, 2010 International Conference on Technologies and Applications of Artificial Intelligence (TAAI), IEEE, Hsinchu, TAIWAN, pp. 226-232.
View/Download from: Publisher's site
Li, J & Tao, D 2010, 'An Exponential Family Extension to Principal Component Analysis.', Aust. J. Intell. Inf. Process. Syst., International Conference on Neural Information Processing, Springer, Sydney, Australia, pp. 1-9.
View description>>
In this paper, we present a unified probabilistic model for constrained factorisation models, which employs exponential family distributions to represent the constrained factors. Our main objective is to provide a versatile framework, on which prototype models with various constraints can be implemented effortlessly. For learning the proposed stochastic model, Gibbs sampling is employed for model inference. We also demonstrate the utility and versatility of the model by experiments.
Li, J & Tao, D 2010, 'Boosted Dynamic Cognitive Activity Recognition from Brain Images.', ICMLA, International Conference on Machine Learning and Applications, IEEE Computer Society, Washington, D.C., USA, pp. 361-366.
View/Download from: Publisher's site
View description>>
Functional Magnetic Resonance Imaging (fMRI) has become an important diagnostic tool for measuring brain haemodynamics. Previous research on analysing fMRI data mainly focuses on detecting low-level neuron activation from the ensued haemodynamic activities. An important recent advance is to show that high-level cognitive status is recognisable from a period of fMRI records. Nevertheless, it would also be helpful to reveal the dynamics of cognitive activities during the period. In this paper, we tackle the problem of discovering the dynamic cognitive activities by proposing an algorithm of boosted structure learning. We employ a statistical model of random fields to represent the dynamics of the brain. To exploit the rich fMRI observations with reasonable model complexity, we build multiple models, where one model links the cognitive activities to only a fraction of the fMRI observations. We combine the simple models by using an altered AdaBoost scheme for multi-class structure learning and show theoretical justification of the proposed scheme. Empirical tests show the method effectively links the physiological and the psychological activities of the brain. © 2010 IEEE.
Li, J & Tao, D 2010, 'Simple Exponential Family PCA.', AISTATS, JMLR.org, pp. 453-460.
View description>>
Bayesian principal component analysis (BPCA), a probabilistic reformulation of PCA with Bayesian model selection, is a systematic approach to determining the number of essential principal components (PCs) for data representation. However, it assumes that data are Gaussian distributed and thus it cannot handle all types of practical observations, e.g. integers and binary values. In this paper, we propose simple exponential family PCA (SePCA), a generalised family of probabilistic principal component analysers. SePCA employs exponential family distributions to handle general types of observations. By using Bayesian inference, SePCA also automatically discovers the number of essential PCs. We discuss techniques for fitting the model, develop the corresponding mixture model, and show the effectiveness of the model based on experiments.
Liang, J & Huang, ML 2010, 'Highlighting in Information Visualization: A Survey', 2010 14th International Conference Information Visualisation, 2010 14th International Conference Information Visualisation (IV), IEEE, London, UK, pp. 79-85.
View/Download from: Publisher's site
View description>>
Highlighting is a basic viewing control mechanism in computer graphics and visualization used to guide users' attention in reading diagrams, images, graphs and digital texts. With the rapid growth of theory and practice in information visualization, highlighting has extended its role to act not only as a viewing control, but also as an interaction control and a graphic recommendation mechanism in knowledge visualization and visual analytics. In this work, we attempt to give a formal summarization and classification of the existing highlighting methods and techniques that can be applied in Information Visualization, Visual Analytics and Knowledge Visualization. We propose a new three-layer model of highlighting. We discuss the responsibilities of each layer in the different stages of visual information processing. © 2010 IEEE.
Litchfield, A, Dyson, LE, Wright, M, Pradhan, S & Courtille, B 2010, 'Student-Produced Vodcasts as Active Metacognitive Learning', 2010 10th IEEE International Conference on Advanced Learning Technologies, 2010 IEEE 10th International Conference on Advanced Learning Technologies (ICALT), IEEE, Sousse, Tunisia, pp. 560-564.
View/Download from: Publisher's site
View description>>
Pod and vodcasts are increasingly used in international higher education. Most are produced by faculty either to replace the traditional lecture or to provide an alternative source of lecture material for students to listen to at convenient times. In contrast, this paper examines the learning outcomes achieved when students produce vodcasts as an assigned task. When producing the vodcasts, students were no longer 'time poor', often aiming only for a Pass. They were highly motivated, involved in activities designed to address the learning objectives, and engaged in active metacognitive learning. The students were involved in peer learning, developing research, teamwork and communication understandings and skills, all desirable professional attributes. Evaluation of the pilot of the student-produced vodcast assignment indicates that there was very high engagement and that the learning outcomes achieved were outstanding. Student pre- and post-assignment self-assessment surveys indicate they learnt significantly in the stated assignment objectives of (1) improved awareness of IT careers - 29% to 70% good awareness - and (2) improved skills in multimedia communication - 27% to 49% good video recording and 16% to 51% good multimedia editing skills.
Liu, B, Gong, L, Xu, Y, Rong, B, Wu, Y, Gagnon, G, Gui, L & Zhang, W 2010, 'Designing LDPC Codes with Gated Noise Model for Terrestrial Mobile DTV Channels', 2010 IEEE 72nd Vehicular Technology Conference - Fall, 2010 IEEE Vehicular Technology Conference (VTC 2010-Fall), IEEE, Ottawa, CANADA.
View/Download from: Publisher's site
Liu, B, Wu, Y, Rong, B, Gagnon, G, Nadeau, C, El-Tanany, MS, Gui, L & Zhang, W 2010, 'Transmitter identification of ATSC DTV under mobile environment', 2010 IEEE International Symposium on Broadband Multimedia Systems and Broadcasting (BMSB), 2010 IEEE International Symposium on Broadband Multimedia Systems and Broadcasting (BMSB), IEEE.
View/Download from: Publisher's site
View description>>
The ATSC A/110 standard specifies a transmitter identification (TxID) system using an RF watermark. The system, using a 16-bit Kasami sequence, can identify up to 16 million DTV transmitters. With the recent development of ATSC Mobile DTV (M/H), the implementation of transmitter identification will enable new capabilities for receivers in a mobile environment. In this paper, we investigate TxID reception over mobile channels with Doppler fading, dynamic multipath, carrier offset and clock offset, etc. Our simulation results have confirmed the validity of the TxID scheme in a mobile environment.
Lopez-Lorca, A, Beydoun, G, Sterling, L & Miller, T 2010, 'Ontology-based Validation of Agent Oriented Modelling.', EKAW (Posters and Demos), CEUR-WS.org.
View description>>
Despite the potential of Multi-Agent Systems (MAS), this technology has not been widely adopted by industry yet. Due to its complexity, errors in modelling activities can be costly. Early validation of MAS models can prevent rework or building a system non-compliant with the client's specification. We propose a general ontology-based process to validate any kind of software models that can be adapted in a broad range of software development projects. We illustrate this for MAS development, as its complexity justifies the additional costs associated with applying our add-on validation process. This work provides early evidence of the soundness of our approach. We successfully validate and improve the quality of MAS models for a real-life development project, showing that our MAS models validation process can contribute to harnessing the commercial potential of MAS technology. Copyright 2010 ACM.
Lopez-Lorca, AA, Beydoun, G, Sterling, L & Miller, T 2010, 'Ontology-mediated Validation of Software Models.', ISD, International Conference on Information Systems Development, Springer, Prague, Czech Republic, pp. 455-467.
View/Download from: Publisher's site
View description>>
When errors in software modelling activities propagate to later phases of the software development lifecycle, they become costlier to fix and lower the quality of the final product. Early validation of software models can prevent rework and incorrect development non-compliant with the client's specification. In this paper we advocate the use of ontologies to validate and improve the quality of software models as they are being developed, at the same time bridging the traditional gap between developers and clients. We propose a general ontology-mediated process to validate software models that can be adapted in a broad range of software development projects. We illustrate this for Multi-Agent Systems (MAS) development, providing early evidence of the soundness of our approach. We successfully validate and improve the quality of MAS models for a real-life development project, illustrating ontology-mediated models validation in a commercial setting. © Springer Science+Business Media, LLC 2011.
Lun-De Liao, I-Jan Wang, Che-Jui Chang, Bor-Shyh Lin, Chin-Teng Lin & Tseng, KC 2010, 'Human cognitive application by using Wearable Mobile Brain Computer Interface', TENCON 2010 - 2010 IEEE Region 10 Conference, 2010 IEEE Region 10 Conference (TENCON 2010), IEEE, Fukuoka, JAPAN, pp. 346-351.
View/Download from: Publisher's site
Mairiza, D & Zowghi, D 2010, 'An ontological framework to manage the relative conflicts between security and usability requirements', 2010 Third International Workshop on Managing Requirements Knowledge, 2010 Third International Workshop on Managing Requirements Knowledge (MARK), IEEE, Sydney, Australia, pp. 1-6.
View/Download from: Publisher's site
View description>>
Non-Functional Requirements (NFRs) are relative, and so are the conflicts among them. In our previously developed catalogue of NFRs conflicts, it can be observed that a number of specific pairs of NFRs are claimed to be in conflict in some cases but are also claimed not to be in conflict in other cases. These relative conflicts occur because the positive or negative relationships among NFRs are not always clear and obvious. These relationships might change depending on the meaning of NFRs within the system being developed. This paper focuses on the application of ontology in managing the relative conflicts among NFRs, particularly the relative conflicts between security and usability requirements. The aim is to develop a framework to identify, characterize, and define corresponding resolution strategies for the security-usability conflicts. This paper thus describes the sureCM framework to manage these conflicts; summarizes the security-usability conflicts ontology; and demonstrates how the ontology will be used as a basis to assist analysts in managing conflicts between security and usability requirements.
Mairiza, D, Zowghi, D & Nurmuliani, N 2010, 'An investigation into the notion of non-functional requirements', Proceedings of the 2010 ACM Symposium on Applied Computing, SAC'10: The 2010 ACM Symposium on Applied Computing, ACM, Sierre, Switzerland, pp. 311-318.
View/Download from: Publisher's site
View description>>
Although Non-Functional Requirements (NFRs) are recognized as very important contributors to the success of software projects, studies to date indicate that there is still no general consensus in the software engineering community regarding the notion of NFRs. This paper presents the result of an extensive and systematic analysis of the extant literature over three NFRs dimensions: (1) definition and terminology; (2) types; and (3) relevant NFRs in various types of systems and application domains. Two different perspectives to consider NFRs are described. A comprehensive catalogue of NFRs types as well as the top five NFRs that are frequently considered are presented. This paper also offers a novel classification of NFRs based on types of systems and application domains. This classification could assist software developers in identifying which NFRs are important in a particular application domain and for specific systems.
Mairiza, D, Zowghi, D & Nurmuliani, N 2010, 'Towards a catalogue of conflicts among non-functional requirements', ENASE 2010 - Proceedings of the 5th International Conference on Evaluation of Novel Approaches to Software Engineering, International Conference on Evaluation of Novel Approaches to Software Engineering, SciTePress, Athens, Greece, pp. 20-29.
View description>>
Two of the most significant characteristics of non-functional requirements (NFRs) are 'interacting' and 'relative'. Interacting means NFRs tend to interfere, conflict, and contradict with one another, while relative means the interpretation of NFRs may vary depending on many factors, such as the context of the system being developed and the extent of stakeholder involvement. For the purpose of understanding the interacting characteristic of NFRs, several potential conflict models have been presented in the literature. These models represent the positive or negative inter-relationships among various types of NFRs. However, none of them deal with the relative characteristic of NFRs. In fact, multiple interpretations of NFRs in the system being developed may lead to positive or negative inter-relationships that are not always obvious. As a result, the existing potential conflict models remain in disagreement with one another. This paper presents the result of an extensive and systematic investigation of the extant literature over the notion of NFRs and conflicts among them. A catalogue of NFRs conflicts with respect to the NFRs relative characteristic is presented. The relativity of conflicts is characterized by three categories: absolute conflict; relative conflict; and never conflict. This catalogue could assist software developers in identifying the conflicts among NFRs, performing further conflict analysis, and identifying the potential strategies to resolve those conflicts.
Marshall, J & Zowghi, D 2010, 'Software and the Social Production of Disorder', PROCEEDINGS OF THE 2010 IEEE INTERNATIONAL SYMPOSIUM ON TECHNOLOGY AND SOCIETY: SOCIAL IMPLICATIONS OF EMERGING TECHNOLOGIES, IEEE International Symposium on Technology and Society, IEEE, Wollongong, Australia, pp. 284-291.
View/Download from: Publisher's site
View description>>
Software development is inherently an ordering process. When implemented in a workplace it orders the ways that people go about their work, the work they do, and the ways they interact and communicate with each other. This new mode of ordering may conflict with existing orders, existing distributions of power and knowledge, and arrangements of groups, and between groups. Ordering is almost always the subject of dispute, so software development can easily become enmeshed in the politicking between competing groups with deleterious effects. Removing all these conflicts may not be possible, as they can be an essential part of the ways relevant groups interact. Better communication, for example, may actually increase conflict, and not produce harmony. Rather than thinking of order and disorder as mutually exclusive polarities, it is more effective and realistic to think of them as constituting an âorder/disorder complexâ and to expect disorder to appear alongside the ordering. This paper explores the problems of ordering and disordering through a study of changes in the Australian Customsâ âIntegrated Cargo Systemâ. We suggest that acceptance of some untidied mess, or openness to both dispute and unclarity, may be useful in implementing functional software.
McGregor, C 2010, 'Stream computing and its application to critical care', 2010 IEEE 23rd International Symposium on Computer-Based Medical Systems (CBMS), 2010 IEEE 23rd International Symposium on Computer-Based Medical Systems (CBMS), IEEE.
View/Download from: Publisher's site
Mearns, H, Leaney, J & Verchere, D 2010, 'Critique of Network Management Systems and Their Practicality', 2010 Seventh IEEE International Conference and Workshops on Engineering of Autonomic and Autonomous Systems, 2010 7th IEEE International Conference and Workshops on Engineering of Autonomic and Autonomous Systems (EASe), IEEE, Oxford, UK, pp. 51-59.
View/Download from: Publisher's site
View description>>
Networks have become an integral part of the computing landscape, forming a global interconnection of a staggering number of heterogeneous systems and services. Current research focuses on policy based management and autonomous systems and involves the utilisation of very different languages and technologies in concert. This paper examines four current proposals for autonomous network management and analyses them using architectural modelling, against a measure of practicality, as expressed by scalability, reliability and maintainability. © 2010 IEEE.
Mearns, H, Leaney, J & Verchere, D 2010, 'The Architectural Evolution of Telecommunications Network Management Systems', 2010 17th IEEE International Conference and Workshops on Engineering of Computer Based Systems, 2010 17th IEEE International Conference and Workshops on Engineering of Computer Based Systems, IEEE, Oxford, England, pp. 281-285.
View/Download from: Publisher's site
View description>>
Telecommunications Network Management Systems (TNMSs) have had to respond to enormous change, as telecommunication networks have changed from the early (digital) timeframe of being ISDN-based (devoted dominantly to voice) to the current timeframe, providing diverse services of data, voice and video, along with a very different business environment. The goal of this paper is to understand the probable impact of changes required, in order to then understand how current TNMSs are able to respond to the changes, and thus evolve. To evaluate the nature, complexity and risks of changes to TNMSs, we have used an evolutionary taxonomy and methodology which classifies change according to complexity. The data for the evaluation is based on the changes to the architecture of networks and TNMSs over the past 20 years. We have identified the major dimensions of change, and evaluated how TNMSs have responded to those changes. © 2010 IEEE.
Meerbeek, B, Bingley, P, Rijnen, W & van den Hoven, E 2010, 'Pipet', Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries, NordiCHI '10: 6th Nordic Conference on Human-Computer Interaction, ACM, Reykjavik, Iceland, pp. 335-342.
View/Download from: Publisher's site
View description>>
To support reminiscing in the home, people collect an increasing amount of digital media on numerous devices. When sharing their media with other people, distribution of the media over different devices can be problematic. In this paper, we address this problem by designing an innovative interaction concept for cross-device interaction to support groups in sharing photos using multiple devices. We designed and implemented the Pipet concept. Results of a comparative study show that Pipet resulted in a pragmatic and hedonic user experience. © 2010 ACM.
Meng, H-D, Ma, J-H & Xu, G-D 2010, 'Experimental Research on Impacts of Dimensionality on Clustering Algorithms', 2010 International Conference on Computational Intelligence and Software Engineering, 2010 International Conference on Computational Intelligence and Software Engineering (CiSE), IEEE.
View/Download from: Publisher's site
View description>>
Experiments are carried out on datasets with different dimensions selected from the UCI datasets, using two classical clustering algorithms. The results of the experiments indicate that when the dimensionality of the real dataset is less than or equal to 30, the clustering algorithms based on distance are effective. For high-dimensional datasets (dimensionality greater than 30), the clustering algorithms show weaknesses, even if we use dimension-reduction methods such as Principal Component Analysis (PCA). ©2010 IEEE.
Meng, Y, Wang, Z, Xu, X & Wang, X 2010, 'A Generalized Service Resource Management Framework', 2010 International Conference on Service Sciences, 2010 International Conference on Service Sciences, IEEE, pp. 329-334.
View/Download from: Publisher's site
View description>>
This paper presents a service resource management system. It is oriented to both single and integrated service resources and efficiently manages every stage of the service resource lifecycle, including the registration, publication, usage and destruction processes. To support different specific service areas, the developer only needs to configure the template provided by the service resource management system, which greatly shortens the development cycle and increases development efficiency. The paper is organized as follows: firstly, it presents the concept of service resources and introduces how to classify and describe them. Secondly, it analyzes the requirements of service resource management using UML use cases. Thirdly, the design of the service resource management system is put forward. At the end, it briefly introduces how to apply the system to a specific service area. © 2010 IEEE.
Merigo, JM 2010, 'A METHOD FOR DECISION MAKING BASED ON PROBABILISTIC INFORMATION AND DISTANCE MEASURES', EDULEARN10: INTERNATIONAL CONFERENCE ON EDUCATION AND NEW LEARNING TECHNOLOGIES, 2nd International Conference on Education and New Learning Technologies (EDULEARN), IATED-INT ASSOC TECHNOLOGY EDUCATION & DEVELOPMENT, Barcelona, SPAIN.
Merigo, JM 2010, 'A METHOD FOR LINGUISTIC DECISION MAKING IN EDUCATIONAL MANAGEMENT', EDULEARN10: INTERNATIONAL CONFERENCE ON EDUCATION AND NEW LEARNING TECHNOLOGIES, 2nd International Conference on Education and New Learning Technologies (EDULEARN), IATED-INT ASSOC TECHNOLOGY EDUCATION & DEVELOPMENT, Barcelona, SPAIN.
Merigo, JM 2010, 'Fuzzy generalized aggregation operators in a unified model between the probability, the weighted average and the OWA operator', International Conference on Fuzzy Systems, 2010 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), IEEE, Barcelona, SPAIN.
View/Download from: Publisher's site
Merigó, JM 2010, 'A generalized model between the OWA operator, the weighted average and the probability', Proceedings of the 2010 Spring Simulation Multiconference, SpringSim '10: 2010 Spring Simulation Conference, Society for Computer Simulation International.
View/Download from: Publisher's site
View description>>
We introduce a new model that unifies the probability, the weighted average and the OWA operator in a general framework based on the use of generalized means. We present the generalized probabilistic ordered weighted averaging - weighted averaging (GPOWAWA) operator. The main advantage of this model is that it unifies these three concepts, considering the degree of importance that each one has in the aggregation. We study some of its main properties and particular cases such as the POWAWA, the quadratic POWAWA, the generalized probabilistic weighted average, the generalized OWAWA and the generalized probabilistic OWA operator. We end the paper by presenting a further generalization using quasi-arithmetic means, obtaining the Quasi-POWAWA operator. © 2010 SCS.
Merigó, JM 2010, 'INDUCED GENERALIZED PROBABILISTIC OWAWA OPERATOR', Computational Intelligence in Business and Economics, Proceedings of the MS'10 International Conference, WORLD SCIENTIFIC, Barcelona, SPAIN, pp. 73-82.
View/Download from: Publisher's site
Merigó, JM 2010, 'Using the probabilistic weighted average in decision making with distance measures', WCE 2010 - World Congress on Engineering 2010, World Congress on Engineering (WCE 2010), INT ASSOC ENGINEERS-IAENG, Imperial Coll London, London, UNITED KINGDOM, pp. 1-4.
View description>>
We develop a new decision making method based on distance measures that uses the probabilistic weighted averaging (PWA) operator. The PWA operator is an aggregation operator that unifies the weighted average and the probability in the same formulation, considering the degree of importance that each concept has in the aggregation. We introduce the probabilistic weighted averaging distance (PWAD) operator. It is a new aggregation operator that uses probabilities, weighted averages and distance measures. We study some of its main properties and particular cases such as the arithmetic weighted Hamming distance and the arithmetic probabilistic Hamming distance. We also develop an application in a decision making problem concerning the selection of investment strategies.
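The PWAD aggregation described in this abstract can be sketched in a few lines. This is an illustrative reading, not the authors' code: following the description above, the unified weight is assumed to be a convex combination of the probability p_i and the weighted-average weight v_i, applied to the individual Hamming distances; the function name and the numbers in the example are hypothetical.

```python
def pwad(x, y, probs, wa_weights, beta):
    """Probabilistic weighted averaging distance (PWAD) - a sketch.

    Each individual (Hamming) distance |x_i - y_i| is aggregated with a
    unified weight beta * p_i + (1 - beta) * v_i, which mixes the
    probability p_i and the weighted-average weight v_i according to the
    importance degree beta in [0, 1].  Both probs and wa_weights are
    assumed to sum to 1.
    """
    return sum((beta * p + (1.0 - beta) * v) * abs(a - b)
               for a, b, p, v in zip(x, y, probs, wa_weights))

# Hypothetical example: two alternatives described by three attributes.
d = pwad([0.8, 0.6, 0.9], [0.5, 0.9, 0.9],
         probs=[0.5, 0.3, 0.2], wa_weights=[0.3, 0.3, 0.4], beta=0.5)
```

With beta = 1 the expression reduces to a purely probabilistic Hamming distance, and with beta = 0 to a purely weighted Hamming distance, which matches the particular cases the abstract names.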
Merigo, JM & Casanovas, M 2010, 'A NEW DECISION MAKING METHOD BASED ON DISTANCE MEASURES AND ITS APPLICATION IN EDUCATIONAL MANAGEMENT', 4TH INTERNATIONAL TECHNOLOGY, EDUCATION AND DEVELOPMENT CONFERENCE (INTED 2010), 4th International Technology, Education and Development Conference (INTED), IATED-INT ASSOC TECHNOLOGY EDUCATION & DEVELOPMENT, Valencia, SPAIN, pp. 987-998.
Merigo, JM & Casanovas, M 2010, 'DEALING WITH UNCERTAIN INFORMATION IN THE INDUCED PROBABILISTIC OWA OPERATOR', INTELLIGENT DECISION MAKING SYSTEMS, VOL. 2, 4th International Conference on Intelligent Systems and Knowledge Engineering (ISKE 2009), WORLD SCIENTIFIC PUBL CO PTE LTD, BELGIUM, Hasselt Univ, Hasselt, pp. 607-612.
Merigo, JM & Casanovas, M 2010, 'FUZZY AGGREGATION OPERATORS AND ITS APPLICATION IN THE SELECTION OF PROFESSORS', 4TH INTERNATIONAL TECHNOLOGY, EDUCATION AND DEVELOPMENT CONFERENCE (INTED 2010), 4th International Technology, Education and Development Conference (INTED), IATED-INT ASSOC TECHNOLOGY EDUCATION & DEVELOPMENT, Valencia, SPAIN, pp. 975-986.
Merigó, JM & Casanovas, M 2010, 'DECISION MAKING WITH THE GENERALIZED PROBABILISTIC WEIGHTED AVERAGING DISTANCE OPERATOR', Computational Intelligence in Business and Economics, Proceedings of the MS'10 International Conference, WORLD SCIENTIFIC, Barcelona, SPAIN, pp. 541-548.
View/Download from: Publisher's site
Merigó, JM & Casanovas, M 2010, 'The induced probabilistic OWA distance and its application in decision making', Proceedings of the 2010 Spring Simulation Multiconference - Emerging M and S Applications in Industry and Academia Symposium, EAIA, pp. 180-185.
View description>>
We present the induced probabilistic ordered weighted averaging distance (IPOWAD) operator. It is a new distance measure that uses probabilistic information and induced aggregation operators. Thus, this model is able to assess problems where we have some kind of objective information and the attitudinal character of the decision-maker is very complex and can be assessed with order-inducing variables that represent this attitude. We study some of its main properties and a wide range of particular cases including the arithmetic probabilistic distance, the arithmetic induced OWAD, the probabilistic distance, the normalized probabilistic distance, the probabilistic OWAD and many others. We also develop an application of the IPOWAD in a decision-making model regarding investment selection. © 2010 Simulation Councils, Inc.
Merigó, JM & Casanovas, M 2010, 'The induced probabilistic OWA distance and its application in decision making', Proceedings of the 2010 Spring Simulation Multiconference, SpringSim '10: 2010 Spring Simulation Conference, Society for Computer Simulation International.
View/Download from: Publisher's site
View description>>
We present the induced probabilistic ordered weighted averaging distance (IPOWAD) operator. It is a new distance measure that uses probabilistic information and induced aggregation operators. Thus, this model is able to assess problems where we have some kind of objective information and the attitudinal character of the decision-maker is very complex and can be assessed with order-inducing variables that represent this attitude. We study some of its main properties and a wide range of particular cases including the arithmetic probabilistic distance, the arithmetic induced OWAD, the probabilistic distance, the normalized probabilistic distance, the probabilistic OWAD and many others. We also develop an application of the IPOWAD in a decision-making model regarding investment selection. © 2010 SCS.
Merigo, JM & Engemann, KJ 2010, 'Probabilistic aggregation operators with the induced generalized OWA operator', International Conference on Fuzzy Systems, 2010 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), IEEE, Barcelona, SPAIN.
View/Download from: Publisher's site
Merigó, JM & Engemann, KJ 2010, 'A unified model between the OWA operator and the weighted average in decision making with Dempster-Shafer theory', WCE 2010 - World Congress on Engineering 2010, World Congress on Engineering (WCE 2010), INT ASSOC ENGINEERS-IAENG, Imperial Coll London, London, UNITED KINGDOM, pp. 83-87.
View description>>
We present a new decision making model by using the Dempster-Shafer belief structure that uses probabilities, weighted averages and the ordered weighted averaging (OWA) operator. Thus, we are able to represent the decision making problem considering objective and subjective information and the attitudinal character of the decision maker. For doing so, we use the ordered weighted averaging - weighted average (OWAWA) operator. It is an aggregation operator that unifies the weighted average and the OWA in the same formulation. As a result, we form the belief structure - OWAWA (BS-OWAWA) aggregation. We study some of its main properties and particular cases. We also present an application of the new approach in a decision making problem concerning political management.
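The OWAWA aggregation at the core of this model admits a compact sketch. This is a minimal illustration, assuming (as is standard for OWAWA) that beta expresses the degree of importance of the OWA component and that both weight vectors sum to one; the function name and the sample figures are hypothetical, not taken from the paper.

```python
def owawa(values, wa_weights, owa_weights, beta):
    """Ordered weighted averaging - weighted average (OWAWA) aggregation.

    beta in [0, 1] weighs the OWA part, whose weights are applied to the
    values reordered from largest to smallest; (1 - beta) weighs the
    ordinary weighted average, whose weights stay attached to their
    original arguments.
    """
    ordered = sorted(values, reverse=True)                    # b_j: j-th largest value
    owa_part = sum(w * b for w, b in zip(owa_weights, ordered))
    wa_part = sum(v * a for v, a in zip(wa_weights, values))
    return beta * owa_part + (1.0 - beta) * wa_part

# Hypothetical example: three payoffs aggregated with beta = 0.4.
result = owawa([70, 40, 60],
               wa_weights=[0.5, 0.3, 0.2],
               owa_weights=[0.2, 0.3, 0.5],
               beta=0.4)  # 0.4 * 52 + 0.6 * 59 = 56.2
```

Setting beta = 1 recovers a pure OWA aggregation and beta = 0 a pure weighted average, which is the unification the abstract describes.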
Merigó, JM & Engemann, KJ 2010, 'FUZZY DECISION MAKING WITH PROBABILITIES AND INDUCED AGGREGATION OPERATORS', Computational Intelligence in Business and Economics, Proceedings of the MS'10 International Conference, WORLD SCIENTIFIC, Barcelona, SPAIN, pp. 323-332.
View/Download from: Publisher's site
Merigó, JM & Gil-Lafuente, AM 2010, 'DECISION MAKING TECHNIQUES IN A UNIFIED MODEL BETWEEN THE WEIGHTED AVERAGE AND THE OWA OPERATOR', Computational Intelligence in Business and Economics, Proceedings of the MS'10 International Conference, WORLD SCIENTIFIC, Barcelona, SPAIN, pp. 181-188.
View/Download from: Publisher's site
Merigó, JM & Gil-Lafuente, AM 2010, 'THE INDUCED GENERALIZED OWAWA DISTANCE OPERATOR', Computational Intelligence in Business and Economics, Proceedings of the MS'10 International Conference, WORLD SCIENTIFIC, Barcelona, SPAIN, pp. 11-18.
View/Download from: Publisher's site
Milton, J & Kennedy, PJ 2010, 'Entropy Profiles of Ranked and Random Populations', GECCO-2010 COMPANION PUBLICATION: PROCEEDINGS OF THE 12TH ANNUAL GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE, Annual Genetic and Evolutionary Computation Confere, ACM Inc., Portland, Oregon, USA, pp. 1843-1849.
View/Download from: Publisher's site
View description>>
The paper describes the concept of entropy profile, how it is derived, its relationship to the number partition problem and to the information extracted from an objective function. It is hoped that discussion and criticism of this idea may shed light on why some problem representations are NP-hard and other, very similar problems are relatively simple. Entropy profiles illustrate the difference between ranked and randomly ordered populations of individuals in a GA in a way which quantifies the information extracted from the objective function by the ranking process. The entropy profiles of random populations are shown to arise from the fact that there are many more of such 'paths' through the entropy coordinate space than periodic or ranked paths. Additionally, entropy profile provides a measurable difference between periodic low-frequency sequences, periodic high-frequency sequences, random sequences and those which are in some way structured, i.e. by an objective function or other signal. The entropy coordinate space provides a visualisation and explanation of why these profiles differ and perhaps, by way of the integer partition phase transition, also a means to understand why some problems are hard while other seemingly similar problems are straightforward to solve.
Han, M-F, Lin, C-T & Chang, J-Y 2010, 'A compensatory neurofuzzy system with online constructing and parameter learning', 2010 IEEE International Conference on Systems, Man and Cybernetics, 2010 IEEE International Conference on Systems, Man and Cybernetics - SMC, IEEE, pp. 552-556.
View/Download from: Publisher's site
View description>>
A compensatory neurofuzzy system (CNFS) with online learning ability is proposed in this paper. The proposed CNFS model uses a compensatory layer to raise the diversity of fuzzy rules through compensatory weights. The compensatory layer automatically compares the fuzzy rules and assigns more resources to the more important ones. An online learning algorithm, which consists of structure learning and parameter learning, is also presented. The structure learning depends on the fuzzy measure to determine the number of fuzzy rules. The parameter learning, based on the gradient descent method, can adjust the shape of the membership functions and the weights of the compensatory layer. To demonstrate the capability of the proposed CNFS, it is applied to the Iris and Wisconsin breast cancer classification datasets from the UCI Repository. Experimental results show that the proposed CNFS for pattern classification can achieve good classification performance. ©2010 IEEE.
Nadimi, V, Azadeh, A, Pazhoheshfar, P & Saberi, M 2010, 'An Adaptive-Network-Based Fuzzy Inference System for Long-Term Electric Consumption Forecasting (2008-2015): A Case Study of the Group of Seven (G7) Industrialized Nations: U.S.A., Canada, Germany, United Kingdom, Japan, France and Italy', 2010 Fourth UKSim European Symposium on Computer Modeling and Simulation, 2010 European Modelling Symposium (EMS), IEEE, pp. 301-305.
View/Download from: Publisher's site
View description>>
This paper presents an adaptive-network-based fuzzy inference system (ANFIS) for long-term electric consumption prediction. Six models are proposed to forecast annual electric demand. 104 ANFIS have been constructed and tested in order to find the best ANFIS for electric consumption. The proposed models consist of input variables such as Gross Domestic Product (GDP) and Population (POP). All of the trained ANFIS are compared with respect to mean absolute percentage error (MAPE). To achieve the best performance of the intelligence-based approaches, data are preprocessed (scaled) and finally outputs are post-processed (returned to their original scale). To show the applicability and superiority of the ANFIS, actual electric consumption is considered for industrialized nations including the U.S.A., Canada, Germany, United Kingdom, Japan, France and Italy from 1980 to 2007. With the aid of an autoregressive model, GDP and population are projected to 2015, and then, using these values with the best ANFIS model, electric consumption is predicted to 2015. © 2010 IEEE.
Nadimi, V, Azadeh, A, Rouzbahman, M, Saberi, M & Shabibi, SA 2010, 'An Adaptive Network Based Fuzzy Inference System algorithm for assessment and improvement of job security among operators with respect to HSE-Ergonomics program', 2010 IEEE International Conference on Computational Intelligence for Measurement Systems and Applications, 2010 IEEE International Conference on Computational Intelligence for Measurement Systems and Applications (CIMSA), IEEE, pp. 7-12.
View/Download from: Publisher's site
View description>>
Researchers have been continuously trying to improve human performance with respect to Health, Safety and Environment (HSE) and ergonomics (hence HSEE). Performance measurement and assessment of operators are fundamental to management planning and control activities, and accordingly, have received considerable attention from both management practitioners and theorists. There have been several efficiency frontier analysis methods reported in the literature. However, each of these methodologies has its strengths as well as major limitations. This study proposes a non-parametric efficiency frontier analysis method based on the Adaptive Network-Based Fuzzy Inference System (ANFIS) for measuring efficiency as a complementary tool for performance assessment and improvement of operators. The proposed ANFIS algorithm is able to find a stochastic frontier based on a set of input-output observational data and does not require explicit assumptions about the functional structure of the stochastic frontier. Furthermore, it uses a similar approach to econometric methods for calculating the efficiency scores. The proposed approach is applied to a set of operators in a petrochemical unit to show its applicability and superiority. In fact, this study proposes an adaptive intelligence algorithm for measuring and improving job security among operators with respect to HSE-Ergonomics in a petrochemical unit. To achieve the objectives of this study, standard questionnaires with respect to HSE-Ergonomics are completed by operators. The average results for each category of HSE-Ergonomics are used as inputs and job security is used as output for the algorithm. Moreover, this algorithm is used to rank operators' performance with respect to HSE-Ergonomics. Finally, the normal probability technique is used to identify outlier operators. This is the first study that introduces an integrated intelligence algorithm for assessment and improvement of human performance with respect to HSE-Ergonomics progr...
Nguyen, TTS, Lu, HY & Lu, J 2010, 'Ontology-style Web usage model for semantic Web applications', 2010 10th International Conference on Intelligent Systems Design and Applications, 2010 10th International Conference on Intelligent Systems Design and Applications (ISDA), IEEE, Egypt, pp. 784-789.
View/Download from: Publisher's site
View description>>
Current semantic recommender systems aim to exploit website ontologies to produce valuable web recommendations. However, Web usage knowledge for recommendation is represented separately and differently from the domain ontology, which leads to the complexity of using inconsistent knowledge resources. This paper aims to solve this problem by proposing a novel ontology-style model of Web usage to represent the non-taxonomic visiting relationships among the visited pages. The output of this model is an ontology-style document which enables the discovered web usage knowledge to be sharable and machine-understandable in semantic Web applications, such as recommender systems. A case study is presented to show how this model is used in conjunction with web usage mining and web recommendation. Two real-world datasets are used in the case study.
Nizami, S, Green, JR, Eklund, JM & McGregor, C 2010, 'Heart disease classification through HRV analysis using Parallel Cascade Identification and Fast Orthogonal Search', 2010 IEEE International Workshop on Medical Measurements and Applications, 2010 IEEE International Workshop on Medical Measurements and Applications (MeMeA), IEEE, pp. 134-139.
View/Download from: Publisher's site
View description>>
Heart rate variability (HRV) is an established indicator of cardiac health. Recent developments have shown the potential of nonlinear metrics for pattern classification of various heart conditions. Evidence indicates that the combination of multiple linear and nonlinear features leads to increased classification accuracy. In our paper, we demonstrate HRV classification using two dynamic nonlinear techniques called Parallel Cascade Identification (PCI) and Fast Orthogonal Search (FOS). We investigate the use of these two techniques for feature extraction from publicly available Physionet electrocardiogram (ECG) data to differentiate between normal sinus rhythm of the heart and 3 undesired conditions: arrhythmia, supraventricular arrhythmia, and congestive heart failure. Results compare well with previous studies which have used more features over the same dataset. We hypothesize that combining PCI and FOS features with traditional HRV features will show further improvement in classification accuracy and so can assist in real-time patient monitoring. ©2010 IEEE.
Othman, SH & Beydoun, G 2010, 'A Disaster Management Metamodel (DMM) Validated', KNOWLEDGE MANAGEMENT AND ACQUISITION FOR SMART SYSTEMS AND SERVICES, 11th International workshop on Knowledge Management and Acquisition for Smart Systems and Services, SPRINGER-VERLAG BERLIN, Daegu, SOUTH KOREA, pp. 111-125.
Othman, SH & Beydoun, G 2010, 'Metamodel-based Decision Support System for Disaster Management.', ICSOFT (2), 5th International Conference on Software and Data Technologies (ICSOFT 2010), SciTePress, Univ Piraeus, Athens, GREECE, pp. 403-412.
Othman, SH & Beydoun, G 2010, 'Metamodelling approach to support disaster management knowledge sharing', ACIS 2010 Proceedings - 21st Australasian Conference on Information Systems.
View description>>
Handling uncertain events that could happen anytime and anywhere, and dealing with many complex systems interconnected physically and socially, makes Disaster Management (DM) a multidisciplinary endeavour and a very difficult domain to model. In this paper we present the development and validation of a Disaster Management Metamodel (DMM), a language that we developed specifically for describing the DM domain. The metamodel, a precise definition of the constructs and rules needed for creating the semantic models of the DM domain, consists of four views based on the four DM phases: the Mitigation, Preparedness, Response and Recovery-phase classes of concepts. A Model Importance Factor (MIF) criterion is used to identify 10 existing disaster management models against which to evaluate the expressiveness and the completeness of DMM. The paper presents the synthesis process and the resulting metamodel, as a foundational component for creating a Disaster Management Decision Support System (DMDSS) to unify, facilitate and expedite access to DM expertise. © 2010 [Othman&Beydoun].
Othman, SH & Beydoun, G 2010, 'Metamodelling Approach towards a Disaster Management Decision Support System', ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING, PT II, 10th International Conference on Artificial Intelligence and Soft Computing (ICAISC 2010), SPRINGER-VERLAG BERLIN, Zakopane, POLAND, pp. 614-621.
Pan, R, Xu, G & Dolog, P 2010, 'User and document group approach of clustering in tagging systems', Proceedings ABIS 2010 - 18th Intl. Workshop on Personalization and Recommendation on the Web and Beyond.
View description>>
In this paper, we propose a spectral clustering approach to modelling groups of users and documents in order to capture the common preferences and relatedness of users and documents, and to reduce the time complexity of similarity calculations. In experiments, we investigate the selection of the optimal number of clusters. We also show a reduction in the time consumed in calculating similarity for recommender systems by selecting a centroid first, and then comparing the items inside it on behalf of each group. Keywords: User Profile, Document Profile, Spectral Clustering, Group Profile, Modularity Metric.
Peng, D, Long, W, Huang, T & Huo, H 2010, 'ESA: An Efficient and Stable Approach to Querying Reverse k-Nearest-Neighbor of Moving Objects', Springer Berlin Heidelberg, pp. 303-311.
View/Download from: Publisher's site
Percival, J, McGregor, C, Percival, N, Kamaleswaran, R & Tuuha, S 2010, 'A framework for nursing documentation enabling integration with EHR and real-time patient monitoring', 2010 IEEE 23rd International Symposium on Computer-Based Medical Systems (CBMS), 2010 IEEE 23rd International Symposium on Computer-Based Medical Systems (CBMS), IEEE, pp. 468-473.
View/Download from: Publisher's site
View description>>
This paper proposes a framework for mobile nursing documentation enabling the integration of clinical intervention data with both electronic health record systems and real-time intelligent decision support systems for patient monitoring. A brief discussion on the networking and information security concerns is presented in order to provide context for the mobile application design decisions surround data transmission and storage. The framework is demonstrated using an initial case study in a neonatal intensive care unit. © 2010 IEEE.
Pileggi, S, Palau, CE & Esteve, M 2010, 'Multimode WSN: Improving Robustness, Fault Tolerance and Performance of Randomly Deployed Wireless Sensor Network', 2010 2nd International Conference on Computational Intelligence, Communication Systems and Networks, 2010 2nd International Conference on Computational Intelligence, Communication Systems and Networks (CICSyN 2010), IEEE, Liverpool, ENGLAND, pp. 112-117.
View/Download from: Publisher's site
Pileggi, SF 2010, 'A Novel Domain Ontology for Sensor Networks', 2010 Second International Conference on Computational Intelligence, Modelling and Simulation, 2010 Second International Conference on Computational Intelligence, Modelling and Simulation (CIMSiM), IEEE, pp. 443-447.
View/Download from: Publisher's site
View description>>
Semantic Sensor Web is a progressive concept that would improve the current Sensor Web model by introducing a semantic layer in which the semantics, or meaning, of information is formally defined. A semantic environment allows a novel approach to interoperability among systems, as well as new perspectives for computation (e.g. Ontology-driven and Ontology-aware). One of the central and key issues for the concrete realization of the Semantic Sensor Web is the engineering of semantic knowledge for sensor systems. In this paper a novel modular, extensible ontology is proposed. This semantic schema is designed according to a methodology that assumes multiple knowledge levels as well as multiple capabilities in terms of both representation and analysis. © 2010 IEEE.
Pileggi, SF, Palau, CE & Esteve, M 2010, 'Building semantic sensor web: Knowledge and interoperability', Proceedings of the International Workshop on Semantic Sensor Web, SSW 2010, in Conjunction with IC3K 2010, pp. 15-22.
View description>>
Semantic Sensor Web would be an evolving extension of the Sensor Web that introduces a semantic layer in which the semantics, or meanings, of information are formally defined according to well-defined semantic schemas (ontologies). Semantics should improve the capabilities of collecting, retrieving, sharing, manipulating and analyzing sensor data (or associated phenomena) by providing a new interoperability model: semantic interoperability introduces the interpretation of the meaning of data, allowing the engineering of novel architectures based on standard reasoners.
Pileggi, SF, Palau, CE & Esteve, M 2010, 'On the convergence between Wireless Sensor Network and RFID: Industrial environment', WiOpt 2010 - 8th Intl. Symposium on Modeling and Optimization in Mobile, Ad Hoc, and Wireless Networks, pp. 430-436.
View description>>
The convergence between Wireless Sensor Network and RFID technology enables the development of flexible and integrated architectures that could currently represent a competitive solution for several application scenarios. This paper proposes an advanced heterogeneous wireless network designed for industrial environments: typical sensor applications (personal and environmental parameter monitoring), RFID-based services (e.g. object identification) and convergent applications (localization and tracking) are merged. Several research topics are addressed (resource optimization, low-power communication within hostile environments, etc.). The proposed model would be a generalized solution that assures high performance in terms of reliability, robustness and flexibility: the main architecture component, the Multi-Modal Wireless Sensor Node (MM-WSNode), is provided with multiple sensing, communication and data processing resources that allow several working modes as a function of the environmental conditions detected.
Purba, JH, Lu, J, Ruan, D & Zhang, G 2010, 'PROBABILISTIC SAFETY ASSESSMENT IN NUCLEAR POWER PLANTS BY FUZZY NUMBERS', COMPUTATIONAL INTELLIGENCE: FOUNDATIONS AND APPLICATIONS, International FLINS Conference on Computational Intelligence: Foundations and Applications, World Scientific Publishing Co. Pte. Ltd., Chengdu, China, pp. 256-262.
View description>>
Probabilistic safety assessment in nuclear power plants (NPPs) is central to plant safety and optimal plant design. Plant-specific data are usually recommended for analyzing safety in NPPs. However, such NPP-specific data are not always available in practice. This paper presents an approach that combines fuzzy numbers and expert justification to assess an NPP's probabilistic failure rate in the absence of statistical data. The proposed approach is illustrated with a case study of the high-pressure core spray systems of boiling water reactors.
Qumer, A & Henderson-Sellers, B 2010, 'Empirical Evaluation of the Agile Process Lifecycle Management Framework.', RCIS, International Conference on Research Challenges in Information Science, IEEE, Nice, France, pp. 213-222.
View/Download from: Publisher's site
View description>>
While many organisations are interested in adopting agile methods suitable to their local circumstances, there is little guidance available on how to do so. To address this important issue, we have developed an agile software process lifecycle management framework (APLM). The APLM framework is intended for use by agile coaches, managers and consultants to facilitate the integration of agile practices into software project development environments. This paper presents the outcomes of an empirical evaluation of the APLM framework, conducted with a practitioners' group of fourteen experienced agile experts from industry. The main objective of this empirical study is to determine to what extent each component of the APLM framework is relevant, valuable and sufficient to achieve its purpose from an industry practitioner's perspective. © 2010 IEEE.
Qumer, A & Henderson-Sellers, B 2010, 'Framework as Software Service (FaSS) - An Agile e-Toolkit to Support Agile Method Tailoring.', ICSOFT (2), International Conference on Software and Data Technologies, SciTePress, Athens, Greece, pp. 167-172.
View description>>
In a real software application development environment, a pre-defined or fixed methodology, whether plan-based or agile, is unlikely to be successfully adopted 'off-the-shelf'. Agile methods have recognised that a method should be tailored to each situation. The purpose of this paper is to present an agile e-toolkit software service to facilitate the tailoring of agile processes in the overall context of agile method adoption and improvement. The agile e-toolkit is a web-based tool to store and manage agile practices extracted from various agile methods and frameworks. The core component of the e-toolkit is the agile knowledge base or repository. The agile knowledge base contains agile process fragments. Agile consultants or teams can then use agile process fragments stored in the agile knowledge base for the tailoring of situation-specific agile processes by using a situational method engineering approach. The e-toolkit software service has been implemented using a service-oriented cloud computing technology platform (Software as a Service - SaaS). The agile e-toolkit specifications and software application details are summarized in this paper.
Rong, B, Liu, B, Wu, Y, Gagnon, G, Gui, L & Zhang, W 2010, 'Mobile Location Finding Using ATSC Mobile/Handheld Digital TV RF Watermark Signals', 2010 IEEE 72nd Vehicular Technology Conference - Fall, 2010 IEEE Vehicular Technology Conference (VTC 2010-Fall), IEEE.
View/Download from: Publisher's site
View description>>
This paper investigates the use of the ATSC M/H digital television (DTV) signal for location finding. In comparison to satellite-based location finding systems, DTV signals have higher field strength, wider bandwidth and a lower frequency band, and DTV transmission towers are pervasively available. They can be used for indoor and mobile location finding in major cities where satellite-based systems might not function well. The ATSC receiver can obtain the multiple transmitter impulse responses and signal arrival times using the embedded RF watermark (RFWM) signal, and then derive its geographic coordinates based on the positions of the ATSC transmitters. As a critical step of this process, transmitter identification in the mobile environment has a significant impact on the overall accuracy of location finding. In this paper, we present extensive analytical and simulation results to demonstrate the performance of RFWM technology over mobile channels. ©2010 IEEE.
Sajid, A, Nayyar, A & Mohsin, A 2010, 'Modern trends towards requirement elicitation', Proceedings of the 2010 National Software Engineering Conference, NSEC '10: National Software Engineering Conference 2010, ACM.
View/Download from: Publisher's site
Saqib, M & Lee, C 2010, 'Traffic control system using wireless sensor network', International Conference on Advanced Communication Technology, ICACT, pp. 352-357.
View description>>
A real-time locating system (RTLS) determines and tracks the location of assets and people. This paper presents a novel application to estimate the position and velocity of a vehicle using a wireless sensor network. Two anchor nodes are used as readers along the roadside, and the total distance between them is known. Whenever a moving vehicle with a tag comes into the overlapping part of the operating range of the two anchor nodes, information is exchanged using the symmetric double-sided two-way ranging algorithm, which gives the position information. Using position information at several intervals of time, the velocity can be easily obtained. Position and velocity are obtained and displayed at the base station. Kalman filtering is used to estimate the position and velocity from noisy measurements. Performance evaluation is done by comparing the true values of vehicle position and speed with experimental and estimated values.
Saunders, R, Gemeinboeck, P, Lombard, A, Bourke, D & Kocabali, B 2010, 'Curious whispers: An embodied artificial creative system', Proceedings of the International Conference on Computational Creativity, ICCC-10, pp. 100-109.
View description>>
Creativity, whether or not it is computational, doesn't occur in a vacuum; it is a situated, embodied activity that is connected with cultural, social, personal and physical contexts. Artificial creative systems are computational models that attempt to capture personal, social and cultural aspects of human creativity. The physical embodiment of artificial creative systems presents significant challenges and opportunities. This paper introduces the 'Curious Whispers' project, an attempt to embody an artificial creative system as a collection of autonomous mobile robots that communicate through simple 'songs'. The challenges of developing an autonomous robotic platform suitable for constructing artificial creative systems are discussed. We conclude by examining some of the opportunities of this embodied approach to computational creativity.
Shao-Hang Hung, Chih-Feng Chao, Shu-Kai Wang, Bor-Shyh Lin & Chin-Teng Lin 2010, 'VLSI implementation for Epileptic Seizure Prediction System based on wavelet and chaos theory', TENCON 2010 - 2010 IEEE Region 10 Conference, 2010 IEEE Region 10 Conference (TENCON 2010), IEEE, Fukuoka, JAPAN, pp. 364-368.
View/Download from: Publisher's site
Shao-Hang Hung, Chih-Feng Chao, Yu-Chun Yan, Bor-Shyh Lin & Chin-Teng Lin 2010, 'Independent Component Analysis Hard-IP integration system on programmable chip (SOPC) platform', TENCON 2010 - 2010 IEEE Region 10 Conference, 2010 IEEE Region 10 Conference (TENCON 2010), IEEE, Fukuoka, JAPAN, pp. 1705-1709.
View/Download from: Publisher's site
Singh, R, Verma, P & Singh, AK 2010, 'TC-GXML - A Transcoder for HTML to XML Grammar', 2010 International Conference on Data Storage and Data Engineering, 2010 International Conference on Data Storage and Data Engineering (DSDE), IEEE, pp. 34-38.
View/Download from: Publisher's site
Su, H, Chen, L, Ye, Y, Sun, Z & Wu, Q 2010, 'A Refinement Approach to Handling Model Misfit in Semi-supervised Learning', Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), Springer Berlin Heidelberg, Chongqing, China, pp. 75-86.
View/Download from: Publisher's site
View description>>
Semi-supervised learning has been the focus of machine learning and data mining research in the past few years. Various algorithms and techniques have been proposed, from generative models to graph-based algorithms. In this work, we focus on the Cluster-and-Label approaches for semi-supervised classification. Existing cluster-and-label algorithms are based on some underlying models and/or assumptions. When the data fits the model well, the classification accuracy will be high. Otherwise, the accuracy will be low. In this paper, we propose a refinement approach to address the model misfit problem in semi-supervised classification. We show that we do not need to change the cluster-and-label technique itself to make it more flexible. Instead, we propose to use successive refinement clustering of the dataset to correct the model misfit. A series of experiments on UCI benchmarking data sets have shown that the proposed approach outperforms existing cluster-and-label algorithms, as well as traditional semi-supervised classification techniques including Self-training and Tri-training. © 2010 Springer-Verlag.
ten Bhömer, M, Helmes, J, O'Hara, K & van den Hoven, E 2010, '4Photos', Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries, NordiCHI '10: 6th Nordic Conference on Human-Computer Interaction, ACM, pp. 52-61.
View/Download from: Publisher's site
View description>>
In this paper, we describe the iterative design and user study of "4Photos", a multi-screen table centrepiece allowing media content to be shared and enjoyed in a social setting. It was our intention to design an object with the purpose to gather qualitative data concerning the social effects of new ways of democratic, serendipitous and playful photo sharing. To facilitate this we used online photo repository content that most often gets experienced in an individual setting. Using 4Photos we positioned this content within a social setting and observed how the presentation of these images enabled new ways of "phototalk" to arise. We describe the design process, the final concept and reflect upon observed practices that emerged from people's usage of 4Photos. We then present several design implications and discuss future directions for continuation of this research. © 2010 ACM.
Tzyy-Ping Jung, Kuan-Chih Huang, Chun-Hsiang Chuang, Jian-Ann Chen, Li-Wei Ko, Tzai-Wen Chiu & Chin-Teng Lin 2010, 'Arousing feedback rectifies lapse in performance and corresponding EEG power spectrum', 2010 Annual International Conference of the IEEE Engineering in Medicine and Biology, 2010 32nd Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC 2010), IEEE, Buenos Aires, ARGENTINA, pp. 1792-1795.
View/Download from: Publisher's site
Verma, P, Singh, R, Singh, AK, Yadav, V & Pandey, A 2010, 'An enhanced speech-based Internet browsing system for visually challenged', 2010 International Conference on Computer and Communication Technology (ICCCT), 2010 International Conference on Computer and Communication Technology (ICCCT), IEEE, pp. 724-730.
View/Download from: Publisher's site
Voinov, A 2010, ''Integronsters' and the special role of data', Modelling for Environment's Sake: Proceedings of the 5th Biennial Conference of the International Environmental Modelling and Software Society, iEMSs 2010, pp. 1139-1149.
View description>>
In many cases model integration treats models as software components only, ignoring the fluid relationship between models and reality, the evolving nature of models and their constant modification and re-calibration. As a result, with integrated models we find increased complexity, where changes that used to impact only relatively contained models of subsystems, now propagate throughout the whole integrated system. This makes it harder to keep the overall complexity under control and, in a way, defeats the purpose of modularity, when efficiency is supposed to be gained from independent development of modules. Treating models only as software in solving the integration challenge may give birth to 'integronsters' - constructs that are perfectly valid as software products but ugly and useless as models. We argue that one possible remedy is to learn to use data as modules and integrate them into the models. Then the data that are available for module calibration can serve as an intermediate linkage tool, sitting between modules and providing a module-independent baseline dynamics, which is then incremented when scenarios are to be run. In this case it is not the model output that is directed into the next model input, but model output is presented as a variation around the baseline trajectory, and it is this variation that is then fed into the next module down the chain. The Chesapeake Bay Program suite of models is used to illustrate these problems and the possible solutions.
Wang, X, Wang, Z, Xu, X, Liu, A & Chu, D 2010, 'A Service Composition Approach for the Fulfillment of Temporally Sequential Requirements', 2010 6th World Congress on Services, 2010 IEEE Congress on Services (SERVICES), IEEE, pp. 559-565.
View/Download from: Publisher's site
View description>>
Traditional service composition approaches focus on selecting and composing multiple service components together to fulfill one single requirement. But in most real-world scenarios, there are multiple requirements raised by multiple consumers and they form a discrete and uneven flow (i.e., a temporal sequence). Due to the limited number of available services and their limited capacities, how to ensure the equilibrium between the satisfaction degrees of these temporally sequential requirements becomes an important issue to be addressed. This paper proposes an equilibrium-oriented service composition approach taking into account both the limitedness of service capacity and utilization of historical data. The temporal sequential requirements are divided gradually along with the formation of length-flexible time-segments one by one. Based on this segmentation, service capacity is preserved proportionally for the estimated future requirements, and multiple requirements within one segment are ensured to get relatively equal chances of being satisfied with relatively equal quality. Experiments reveal improved sustainability and superior temporal stability of service quality compared with applying traditional methods to this scenario. © 2010 IEEE.
Wang, Y-K, Pal, NR, Lin, C-T & Chen, S-A 2010, 'Analyzing effect of distraction caused by dual-tasks on sharing of brain resources using SOM', The 2010 International Joint Conference on Neural Networks (IJCNN), 2010 International Joint Conference on Neural Networks (IJCNN), IEEE.
View/Download from: Publisher's site
View description>>
Drivers' distraction is widely recognized as a leading cause of car accidents. To investigate the distracting effect of dual-tasks involving driving and answering mathematical equations under stimulus onset asynchrony (SOA) conditions, we design five different cases: two involving single-tasks and three involving dual-tasks. We have found that there is no statistically significant change in the behavioral data among the three dual-tasks. This raises an important question - is there any detectable effect of the dual tasks on the brain waves? To answer this, we use the Self-Organizing Map (SOM) to recognize the changes, if any, in the Electroencephalography (EEG) dynamics associated with such dual-tasks. Our SOM analysis, based on independent components corresponding to EEG signals extracted from the frontal and motor areas, revealed that single- and dual-tasks have distinguishable signatures in the EEG signals. Specifically, each of the two single-task conditions is clustered in a distinct spatial area of the map. Two of the dual-tasks also exhibit distinct spatial clusters; the third case, although it shows differences from the other two, has its corresponding neurons sub-clustered, reflecting the fact that different subjects may give different priorities to the tasks when confronted with two tasks simultaneously. SOM-based exploratory analysis reveals the existence of distinct EEG signatures among the distracting and non-distracting tasks, although there is no noticeable difference in the behavioral data among these cases. © 2010 IEEE.
Wei, B, Jin, Z & Zowghi, D 2010, 'Knowledge Merging under Multiple Attributes', KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, Knowledge Science, Engineering and Management, Springer-Verlag Berlin Heidelberg, Belfast, Northern Ireland, pp. 555-560.
View/Download from: Publisher's site
View description>>
Knowledge merging is the process of synthesizing multiple knowledge models into a common model. Available methods concentrate on resolving conflicting knowledge. However, we argue that besides inconsistency, other attributes may also affect the resulting knowledge model. This paper proposes an approach for knowledge merging under multiple attributes, i.e. consistency and relevance. This approach introduces the discrepancy between two knowledge models and defines different discrepancy functions for each attribute. An integrated distance function is used for assessing the candidate knowledge models.
Xiong, J, Gui, L, Liu, B & Ge, Y 2010, 'On digital TV broadcasting coverage scheme in carriages of Chinese High Speed Railway', 2010 IEEE International Symposium on Broadband Multimedia Systems and Broadcasting (BMSB), 2010 IEEE International Symposium on Broadband Multimedia Systems and Broadcasting (BMSB), IEEE.
View/Download from: Publisher's site
View description>>
This study introduces a wireless scheme to solve the coverage problem in carriages of Chinese High Speed Railway (CRH) trains. In the proposed scheme, digital TV programs can be retransmitted by CMMB (China Mobile Multimedia Broadcasting) transmitters after being transmitted from the base stations to the trains. The in-car channel profile is analyzed and presented, and the modified linear minimum mean-square error (LMMSE) algorithm proposed by Zheng is adopted to estimate the channel in the CMMB receiver with 2 MHz bandwidth. Simulations show that the CMMB receiver has good reception performance and that the proposed broadcasting coverage scheme is feasible.
Xiong, J, Gui, L, Liu, B & Miao, R 2010, 'A Simplified Method to Get the Optimal Coefficients of Lagrange-Constrained Polynomial Interpolators', 2010 International Conference on Communications and Mobile Computing, 2010 International Conference on Communications and Mobile Computing (CMC), IEEE, pp. 119-122.
View/Download from: Publisher's site
View description>>
This paper presents a simplified algorithm to obtain the optimal coefficients of Lagrange-constrained polynomial interpolators. Lagrange-constrained interpolators are widely used in communication systems: they are used to compensate timing offsets in the time domain and to recover the carrier frequency in the frequency domain. Methods exist to obtain the optimal coefficients of Lagrange-constrained polynomial interpolators, but all of them are very complicated. The proposed method exploits the correlations between the interpolators' optimal coefficients caused by the constraints and the symmetry characteristic. The number of undefined coefficients, including the optimal coefficients, is efficiently reduced without any degradation of the interpolators' performance. © 2010 IEEE.
Xu, G, Zong, Y, Dolog, P & Zhang, Y 2010, 'Co-clustering Analysis of Weblogs Using Bipartite Spectral Projection Approach', Knowledge-Based and Intelligent Information and Engineering Systems, Pt III, 14th International Conference on Knowledge-Based and Intelligent Information and Engineering Systems, Springer Berlin Heidelberg, Cardiff, WALES, pp. 398-407.
View/Download from: Publisher's site
View description>>
Web clustering is an approach for aggregating Web objects into various groups according to underlying relationships among them. Finding co-clusters of Web objects is an interesting topic in the context of Web usage mining.
Xu, M, Chen, L, He, X, Xu, C & Jin, JS 2010, 'Adaptive local hyperplanes for MTV affective analysis', Proceedings of the Second International Conference on Internet Multimedia Computing and Service, ICIMCS '10: The Second International Conference on Internet Multimedia Computing and Service, ACM, Harbin, China, pp. 167-170.
View/Download from: Publisher's site
View description>>
Affective analysis attracts increasing attention in the multimedia domain since affective factors directly reflect audiences' attention, evaluation and memory. Existing studies focus on mapping low-level affective features to high-level emotions by applying machine learning methods. Therefore, choosing effective features and developing efficient machine learning algorithms become vital for affective analysis. In this paper, we investigate the effectiveness of a novel classification approach, called Adaptive Local Hyperplanes (ALH), in affective analysis. The reason ALH is appealing in affective analysis is two-fold. Firstly, affective features are not equally important for emotion categories; ALH inherently assigns feature weights based on the discriminative ability of each feature. Secondly, ALH achieves competitive performance with state-of-the-art classifiers (e.g., SVM) while it is designed for multi-class classification. Consequently, it is worthwhile to explore the usage of ALH in affective analysis. MTV data are used in this study. As the first effort of applying ALH to affective analysis, the results presented in this paper provide a foundation for future research in affective analysis. Copyright 2010 ACM.
Ye, D, Zhang, M & Sutanto, D 2010, 'DGF: Decentralized Group Formation for Task Allocation in Complex Adaptive Systems', ADVANCES IN PRACTICAL MULTI-AGENT SYSTEMS, 12th International Conference on Principles of Practice in Multi-Agent Systems (PRIMA 2009), Springer Berlin Heidelberg, Nagoya, JAPAN, pp. 3-19.
View/Download from: Publisher's site
Ye, D, Zhang, M, Bai, Q & Ito, T 2010, 'Self-organisation in an Agent Network via Multiagent Q-Learning', KNOWLEDGE MANAGEMENT AND ACQUISITION FOR SMART SYSTEMS AND SERVICES, 11th International Workshop on Knowledge Management and Acquisition for Smart Systems and Services, Springer Berlin Heidelberg, Daegu, SOUTH KOREA, pp. 14-26.
View/Download from: Publisher's site
Yu, Y-H, Lai, P-C, Ko, L-W, Chuang, C-H, Kuo, B-C & Lin, C-T 2010, 'An EEG-based classification system of Passenger's motion sickness level by using feature extraction/selection technologies', The 2010 International Joint Conference on Neural Networks (IJCNN), 2010 International Joint Conference on Neural Networks (IJCNN), IEEE, Barcelona, SPAIN.
View/Download from: Publisher's site
Zawawi, RA, Akpolat, H & Bagia, R 2010, 'Managing Knowledge in Aircraft Engineering', Proceedings of The 2nd International Conference on Logistics and Transport & The 1st International Conference on Business and Economics, ICLT 2010 & ICBE 2010 'Managing Finance and Risk in Global Supply Chain', UP Organizer and Publication Co., Ltd., Queenstown, New Zealand, pp. 951-960.
View description>>
In this paper, the authors analyse knowledge management (KM) practices in the civil aviation industry and introduce a framework for better management of knowledge in aircraft engineering (AE). After a comprehensive review of the KM literature, this paper offers insights into the existing KM practices in AE using a case study in the Saudi Arabian Aviation industry (SAAI). The KM research data was collected through discussions and interviews as well as through observations during one author's employment as an aircraft engineer in the SAAI. Synthesis of these results with the KM literature was used to identify the gaps between KM theory and current practices in AE. Finally, an operations-based knowledge management (OBKM) system framework was developed to address these gaps and overcome ineffectiveness in current practices.
Zhang, G, Dillon, TS, Cai, K-Y, Ma, J & Lu, J 2010, 'delta-Equalities of Complex Fuzzy Relations', 2010 24TH IEEE INTERNATIONAL CONFERENCE ON ADVANCED INFORMATION NETWORKING AND APPLICATIONS (AINA), IEEE International Conference on Advanced Information Networking and Applications, IEEE Computer Society Conference Publishing Services (CPS), Perth, Australia, pp. 1218-1224.
View/Download from: Publisher's site
View description>>
A complex fuzzy relation is defined as a fuzzy relation whose membership function takes values in the unit circle on a complex plane. This paper first investigates various operation properties of a complex fuzzy relation. It then defines the distance measure of two complex fuzzy relations that can measure the differences between the grades as well as the phases of two complex fuzzy relations. This distance measure is used to define δ-equalities of complex fuzzy relations that coincide with those of fuzzy relations already defined in the literature if complex fuzzy relations reduce to real-valued fuzzy relations. Two complex fuzzy relations are said to be δ-equal if the distance between them is less than 1-δ. This paper shows how various operations between complex fuzzy relations, including T-norms and S-norms, affect given δ-equalities of complex fuzzy relations. Finally, fuzzy inference is examined in the framework of δ-equalities of complex fuzzy relations. © 2010 IEEE.
Zhang, G-L, Lu, H-Y & Zhang, G-Q 2010, 'A new hybrid evolutionary algorithm with quasi-simplex technique', 2010 International Conference on Machine Learning and Cybernetics, 2010 International Conference on Machine Learning and Cybernetics (ICMLC), IEEE, Qingdao, China, pp. 1811-1816.
View/Download from: Publisher's site
View description>>
This paper proposes a new parallel search algorithm using an evolutionary algorithm and quasi-simplex techniques (EAQST) for non-linear constrained function optimization. EAQST produces the offspring in parallel by using the Gaussian mutation and the Cauchy mutation.
Zhang, M, Ye, D, Bai, Q, Sutanto, D & Muttaqi, K 2010, 'A Hybrid Multi-Agent Framework for Load Management in Power Grid Systems', ADVANCES IN PRACTICAL MULTI-AGENT SYSTEMS, 12th International Conference on Principles of Practice in Multi-Agent Systems (PRIMA 2009), Springer Berlin Heidelberg, Nagoya, JAPAN, pp. 129-143.
View/Download from: Publisher's site
Zhang, R, Wei, J, Lu, J & Zhang, G 2010, 'A Decision Support System for Ore Blending Cost Optimization Problem of Blast Furnaces', Advances in Intelligent Decision Technologies - Proceedings of the Second KES International Symposium IDT 2010, The Second KES International Symposium IDT, Springer Berlin Heidelberg, Baltimore, USA, pp. 143-152.
View/Download from: Publisher's site
View description>>
In iron and steel enterprises, it is difficult to obtain the lowest-cost optimal solution to an ore blending problem for blast furnaces by using the traditional trial-fault-trial (TFT) method because of the complexity of materials and burden of workflow. Here, we develop a set of decision support systems (DSS) software to solve the problem. Using the basics of analyzing business flow and the working process of ore blending, we pre-process the data for materials and elements, abstract a non-linear model of ore blending for a blast furnace, design the architecture for ore blending cost optimization DSS which integrates a database, a model base and a knowledge base, and solve the problem. The system has made economic gains since it was implemented in Xiangtan Iron & Steel Group Co. Ltd., China, in September 2008.
Zhang, T, Lin, H, Liang, L, Lu, J & Zhang, G 2010, 'A knowledge-based efficiency assessment system for distribution network using data envelopment analysis', 2010 Fourth International Conference on Research Challenges in Information Science (RCIS), 2010 Fourth International Conference on Research Challenges in Information Science (RCIS), IEEE, Nice, France, pp. 331-336.
View/Download from: Publisher's site
View description>>
The distribution network is the most important asset of electric utilities; to increase its efficiency, there is a need to effectively assess the efficiency of the distribution network and provide solutions for improvement. This paper presents a knowledge-based efficiency assessment system for distribution networks using data envelopment analysis (DEA). From an input-output view, the DEA method is used to calculate the efficiency of distribution lines and obtain gap information. A knowledge base is used to store facts and rules; facts include the input-output data of the DEA and other information about the structure and operation of the distribution lines. The rules can be empirical rules from domain experts or extracted from industry guidelines or government standards by a knowledge worker. Considering the service requirements of power supply in the rules, DEA assessment results can be effectively used in the solution for improvement by reasoning. The integration of knowledge base technology and DEA represents a step towards a real challenge of the near future. In this system, the final decision is based on DEA assessment results and reasoning. The suggested solution can assist decision makers in planning to further strengthen the distribution network in an efficient and effective manner. © 2010 IEEE.
Zhang, T, Lu, J, Zhang, G & Ding, Q 2010, 'Fault diagnosis of transformer using association rule mining and knowledge base', 2010 10th International Conference on Intelligent Systems Design and Applications, 2010 10th International Conference on Intelligent Systems Design and Applications (ISDA), IEEE, Cairo, Egypt, pp. 737-742.
View/Download from: Publisher's site
View description>>
Association rule mining discovers interesting associations and/or correlations among large sets of data. These associations can be refined into decision rules to be used and stored in a knowledge base system. In this paper, an approach based on association rules and a knowledge base is proposed and implemented for the fault diagnosis of a transformer system. According to the features of association rules, the Apriori algorithm is adopted and modified to generate decision rules from power transformer information for building the knowledge base; the rules can then be refined to diagnose transformer faults through reasoning, and a prototype system is developed. This approach based on association rules is described in detail and the application is illustrated by an example. A comparison with the IEC (International Electrotechnical Commission) three-ratio method shows the proposed method can provide better accuracy in performance.
Zong, Y, Xu, G, Dolog, P, Zhang, Y & Liu, R 2010, 'Co-clustering for Weblogs in Semantic Space', Web Information Systems Engineering - WISE 2010, 11th International Conference on Web Information Systems Engineering, Springer Berlin Heidelberg, Hong Kong, PEOPLES R CHINA, pp. 120-127.
View/Download from: Publisher's site
View description>>
Web clustering is an approach for aggregating web objects into various groups according to underlying relationships among them. Finding co-clusters of web objects in semantic space is an interesting topic in the context of web usage mining.
Zowghi, D & Jin, Z 2010, 'A Framework for the Elicitation and Analysis of Information Technology Service Requirements and Their Alignment with Enterprise Business Goals', 2010 IEEE 34th Annual Computer Software and Applications Conference Workshops, 2010 IEEE 34th Annual Computer Software and Applications Conference Workshops (COMPSACW), IEEE, Seoul, South Korea, pp. 269-272.
View/Download from: Publisher's site
View description>>
As the economies of the world have become increasingly dependent on Information Technology (IT) services, there is a need for service designers and developers to focus on co-value creation between service providers and service consumers. Developers need to conduct a more rigorous and systematic identification, elicitation, and analysis of IT service requirements than ever before so that the resulting IT services are closely aligned with the enterprise business requirements. Research in Services Science from the business and management discipline has mostly focused on the delivery and management of the services experience from the business perspective. Much of the research focus in Service Oriented Computing (SOC) so far has been on the design and delivery of services (especially Web Services), but the engineering of IT service requirements has received much less attention. The overall aim of the proposed research is the design and development of an integrated framework and its supporting toolset for the systematic identification, elicitation, and analysis of IT service requirements that satisfy consumers' needs and are closely aligned with their enterprise business goals.