Bremner, MJ, Mora, C & Winter, A 2009, 'Are Random Pure States Useful for Quantum Computation?', Physical Review Letters, vol. 102, no. 19.
View/Download from: Publisher's site
View description>>
We show the following: a randomly chosen pure state as a resource for measurement-based quantum computation is, with overwhelming probability, of no greater help to a polynomially bounded classical control computer than a string of random bits. Thus, unlike the familiar "cluster states", the computing power of a classical control device is not increased from P to BQP (bounded-error, quantum polynomial time), but only to BPP (bounded-error, probabilistic polynomial time). The same holds if the task is to sample from a distribution rather than to perform a bounded-error computation. Furthermore, we show that our results can be extended to states with significantly less entanglement than random states.
Chitambar, E & Duan, R 2009, 'Nonlocal Entanglement Transformations Achievable by Separable Operations', Physical Review Letters, vol. 103, no. 11, pp. 1-4.
View/Download from: Publisher's site
View description>>
The weird phenomenon of "quantum nonlocality without entanglement" means that local quantum operations assisted by classical communication constitute a proper subset of the class of separable quantum operations. Despite considerable recent advances, little is known about the extent to which the class of separable operations differs from local quantum operations and classical communication. In this Letter we show that separable operations are generally stronger than local quantum operations and classical communication when distilling a mixed state into a pure entangled state, and thus confirm the existence of entanglement monotones that can increase under separable operations. Our finding can also be interpreted as confirming the ability of separable operations to enhance the entanglement of mixed states relative to certain measures, an intuitive but important fact that had never been rigorously proven before.
Chitambar, E, Duan, R & Shi, Y 2009, 'Tripartite to Bipartite Entanglement Transformations and Polynomial Identity Testing', Phys. Rev. A, vol. 81, p. 052310.
View/Download from: Publisher's site
View description>>
We consider the problem of deciding if a given three-party entangled pure state can be converted, with a non-zero success probability, into a given two-party pure state through local quantum operations and classical communication. We show that this question is equivalent to the well-known computational problem of deciding if a multivariate polynomial is identically zero. Efficient randomized algorithms developed to study the latter can thus be applied to the question of tripartite to bipartite entanglement transformations.
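The latter problem is the classic target of the Schwartz-Zippel randomized test: evaluate the polynomial at random points drawn from a set much larger than its degree, and a nonzero polynomial is caught with high probability. A minimal sketch (the example polynomials and names below are illustrative, not taken from the paper):

```python
import random

def is_identically_zero(poly, num_vars, degree, trials=20):
    """Randomized polynomial identity test (Schwartz-Zippel lemma).

    A nonzero polynomial of total degree d vanishes at a point drawn
    uniformly from S^num_vars with probability at most d / |S|, so a
    handful of random evaluations suffices.
    """
    field_size = 100 * degree  # |S| >> d keeps the per-trial error small
    for _ in range(trials):
        point = [random.randint(0, field_size) for _ in range(num_vars)]
        if poly(*point) != 0:
            return False  # witness found: definitely not identically zero
    return True  # identically zero, except with negligible probability

# (x + y)^2 - x^2 - 2xy - y^2 is identically zero; x*y - 1 is not.
zero_poly = lambda x, y: (x + y) ** 2 - x ** 2 - 2 * x * y - y ** 2
print(is_identically_zero(zero_poly, 2, 2))
print(is_identically_zero(lambda x, y: x * y - 1, 2, 2))
```

The one-sided error is the point of the reduction: a "not zero" answer is always correct, while a "zero" answer fails only with probability exponentially small in the number of trials.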
Devitt, SJ, Fowler, AG, Tilma, T, Munro, WJ & Nemoto, K 2009, 'Classical Processing Requirements for a Topological Quantum Computing System', International Journal of Quantum Information, vol. 8, no. 1-2, pp. 121-147.
View/Download from: Publisher's site
View description>>
Dedicated research into the design and construction of a large scale Quantum Information Processing (QIP) system is a complicated task. The design of an experimentally feasible quantum processor must draw upon results in multiple fields; from experimental efforts in system control and fabrication through to far more abstract areas such as quantum algorithms and error correction. Recently, the adaptation of topological coding models to physical systems in optics has illustrated a possible long term pathway to truly large scale QIP. As the topological model has well defined protocols for Quantum Error Correction (QEC) built in as part of its construction, a more grounded analysis of the classical processing requirements is possible. In this paper we analyze the requirements for a classical processing system, designed specifically for the topological cluster state model. We demonstrate that via extensive parallelization, the construction of a classical 'front-end' system capable of processing error correction data for a large topological computer is possible today.
Devitt, SJ, Nemoto, K & Munro, WJ 2009, 'Quantum Error Correction for Beginners', Rep. Prog. Phys., vol. 76, no. 7, p. 076001.
View/Download from: Publisher's site
View description>>
Quantum error correction (QEC) and fault-tolerant quantum computation represent one of the most vital theoretical aspects of quantum information processing. It was well known from the early developments of this exciting field that the fragility of coherent quantum systems would be a catastrophic obstacle to the development of large scale quantum computers. The introduction of quantum error correction in 1995 showed that active techniques could be employed to mitigate this fatal problem. However, quantum error correction and fault-tolerant computation is now a much larger field and many new codes, techniques, and methodologies have been developed to implement error correction for large scale quantum algorithms. In response, we have attempted to summarize the basic aspects of quantum error correction and fault-tolerance, not as a detailed guide, but rather as a basic introduction. Development in this area has been so pronounced that many in the field of quantum information, specifically researchers who are new to quantum information or people focused on the many other important issues in quantum computation, have found it difficult to keep up with the general formalisms and methodologies employed in this area. Rather than introducing these concepts from a rigorous mathematical and computer science framework, we instead examine error correction and fault-tolerance largely through detailed examples, which are more relevant to experimentalists today and in the near future.
Duan, R, Feng, Y & Ying, M 2009, 'Perfect Distinguishability of Quantum Operations', Physical Review Letters, vol. 103, no. 21.
View/Download from: Publisher's site
Duan, R, Feng, Y, Xin, Y & Ying, M 2009, 'Distinguishability of Quantum States by Separable Operations', IEEE Transactions on Information Theory, vol. 55, no. 3, pp. 1320-1330.
View/Download from: Publisher's site
View description>>
In this paper, we study the distinguishability of multipartite quantum states by separable operations. We first present a necessary and sufficient condition for a finite set of orthogonal quantum states to be distinguishable by separable operations. An analytical version of this condition is derived for the case of (D - 1) pure states, where D is the total dimension of the state space under consideration. A number of interesting consequences of this result are then carefully investigated. Remarkably, we show there exists a large class of 2 ⊗ 2 separable operations that cannot be realized by local operations and classical communication. Before our work, only a class of 3 ⊗ 3 nonlocal separable operations was known [Bennett et al, Phys. Rev. A 59, 1070 (1999)]. We also show that any basis of the orthogonal complement of a multipartite pure state is indistinguishable by separable operations if and only if this state cannot be a superposition of one or two orthogonal product states, i.e., has an orthogonal Schmidt number not less than three, thus generalizing the recent work on indistinguishable bipartite subspaces [Watrous, Phys. Rev. Lett. 95, 080505 (2005)]. Notably, we obtain an explicit construction of indistinguishable subspaces of dimension 7 (or 6) by considering a composite quantum system consisting of two qutrits (resp., three qubits), which is slightly better than the previously known indistinguishable bipartite subspace with dimension 8. © 2009 IEEE.
Feng, Y, Duan, R & Ying, M 2009, 'Locally undetermined states, generalized Schmidt decomposition, and application in distributed computing', Quantum Information & Computation, vol. 9, no. 11-12, pp. 997-1012.
View description>>
Multipartite quantum states that cannot be uniquely determined by their reduced states of all proper subsets of the parties exhibit some inherent 'high-order' correlation. This paper elaborates this issue by giving necessary and sufficient conditions for a pure multipartite state to be locally undetermined, and moreover, characterizing precisely all the pure states sharing the same set of reduced states with it. Interestingly, local determinability of pure states is closely related to a generalized notion of Schmidt decomposition. Furthermore, we find that locally undetermined states have some applications to the well-known consensus problem in distributed computation. To be specific, given some physically separated agents, when communication between them, either classical or quantum, is unreliable and they are not allowed to use local ancillary quantum systems, then there exists a totally correct and completely fault-tolerant protocol for them to reach a consensus if and only if they share a priori a locally undetermined quantum state.
Ferrie, C & Emerson, J 2009, 'Framed Hilbert space: hanging the quasi-probability pictures of quantum theory', New J. Phys., vol. 11, p. 063040.
View/Download from: Publisher's site
View description>>
Building on earlier work, we further develop a formalism based on the mathematical theory of frames that defines a set of possible phase-space or quasi-probability representations of finite-dimensional quantum systems. We prove that an alternate approach to defining a set of quasi-probability representations, based on a more natural generalization of a classical representation, is equivalent to our earlier approach based on frames, and therefore is also subject to our no-go theorem for a non-negative representation. Furthermore, we clarify the relationship between the contextuality of quantum theory and the necessity of negativity in quasi-probability representations and discuss their relevance as criteria for non-classicality. We also provide a comprehensive overview of known quasi-probability representations and their expression within the frame formalism.
Ferrie, C, Morris, R & Emerson, J 2009, 'Necessity of negativity in quantum theory', Phys. Rev. A, vol. 82, no. 4, p. 044103.
View/Download from: Publisher's site
View description>>
A unification of the set of quasiprobability representations using the mathematical theory of frames was recently developed for quantum systems with finite-dimensional Hilbert spaces, in which it was proven that such representations require negative probability in either the states or the effects. In this article we extend those results to Hilbert spaces of infinite dimension, for which the celebrated Wigner function is a special case. Hence, this article presents a unified framework for describing the set of possible quasiprobability representations of quantum theory, and a proof that the presence of negativity is a necessary feature of such representations.
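As a concrete finite-dimensional illustration of the negativity these papers prove necessary, one can evaluate a discrete Wigner function for a single qubit. The sketch below uses Wootters' standard phase-point operators and an example Bloch vector; both are textbook choices for illustration, not constructions taken from the paper:

```python
import numpy as np

# Pauli matrices and Wootters' qubit phase-point operators
# A(q, p) = (I + (-1)^q Z + (-1)^p X + (-1)^(q+p) Y) / 2.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def discrete_wigner(rho):
    """Return the 2x2 array W(q, p) = Tr(rho A(q, p)) / 2."""
    W = np.zeros((2, 2))
    for q in (0, 1):
        for p in (0, 1):
            A = (I + (-1) ** q * Z + (-1) ** p * X + (-1) ** (q + p) * Y) / 2
            W[q, p] = np.real(np.trace(rho @ A)) / 2
    return W

# Pure state with Bloch vector (1, 1, -1)/sqrt(3): it lies outside the
# octahedron of states whose Wootters representation is non-negative.
x, y, z = 1 / np.sqrt(3), 1 / np.sqrt(3), -1 / np.sqrt(3)
rho = (I + x * X + y * Y + z * Z) / 2
W = discrete_wigner(rho)
print(np.isclose(W.sum(), 1.0))  # the quasi-probabilities sum to one
print(W.min() < 0)               # yet one of them is negative
```

Here W(0, 1) = (1 - sqrt(3))/4 is negative even though the four values still sum to one, which is exactly the sense in which a quasi-probability representation fails to be a classical probability distribution.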
Hsieh, M-H & Wilde, MM 2009, 'Public and private communication with a quantum channel and a secret key', Physical Review A, vol. 80, no. 2.
View/Download from: Publisher's site
Hsieh, M-H & Wilde, MM 2009, 'Trading classical communication, quantum communication, and entanglement in quantum Shannon theory', IEEE Transactions on Information Theory, vol. 56, no. 9, pp. 4705-4730.
View/Download from: Publisher's site
View description>>
We give trade-offs between classical communication, quantum communication, and entanglement for processing information in the Shannon-theoretic setting. We first prove a unit-resource capacity theorem that applies to the scenario where only the above three noiseless resources are available for consumption or generation. The optimal strategy mixes the three fundamental protocols of teleportation, super-dense coding, and entanglement distribution. We then provide an achievable rate region and a matching multi-letter converse for the direct static capacity theorem. This theorem applies to the scenario where a large number of copies of a noisy bipartite state are available (in addition to consumption or generation of the above three noiseless resources). Our coding strategy involves a protocol that we name the classically-assisted state redistribution protocol and the three fundamental protocols. We finally provide an achievable rate region and a matching multi-letter converse for the direct dynamic capacity theorem. This theorem applies to the scenario where a large number of uses of a noisy quantum channel are available in addition to the consumption or generation of the three noiseless resources. Our coding strategy combines the classically-enhanced father protocol with the three fundamental unit protocols.
Hsieh, M-H, Yen, W-T & Hsu, L-Y 2009, 'High performance entanglement-assisted quantum LDPC codes need little entanglement', IEEE Transactions on Information Theory, vol. 57, no. 3, pp. 1761-1769.
View/Download from: Publisher's site
View description>>
Though the entanglement-assisted formalism provides a universal connection between a classical linear code and an entanglement-assisted quantum error-correcting code (EAQECC), the issue of maintaining a large amount of pure maximally entangled states in constructing EAQECCs is a practical obstacle to its use. It is also conjectured that the power of the entanglement-assisted formalism to convert those good classical codes comes from massive consumption of maximally entangled states. We show that this conjecture is wrong by providing families of EAQECCs with an entanglement consumption rate that diminishes linearly as a function of the code length. Notably, two families of EAQECCs constructed in the paper require only one copy of a maximally entangled state no matter how large the code length is. These families of EAQECCs, constructed from classical finite geometric LDPC codes, perform very well according to our numerical simulations. Our work indicates that EAQECCs are not only theoretically interesting, but also physically implementable. Finally, these high performance entanglement-assisted LDPC codes with low entanglement consumption rates allow one to construct high-performance standard QECCs with very similar parameters.
Lanyon, BP & Langford, NK 2009, 'Experimentally generating and tuning robust entanglement between photonic qubits', New Journal of Physics, vol. 11, no. 1, pp. 013008-013008.
View/Download from: Publisher's site
Lee, T 2009, 'A note on the sign degree of formulas'.
View description>>
Recent breakthroughs in quantum query complexity have shown that any formula of size n can be evaluated with O(sqrt(n)log(n)/log log(n)) many quantum queries in the bounded-error setting [FGG08, ACRSZ07, RS08b, Rei09]. In particular, this gives an upper bound on the approximate polynomial degree of formulas of the same magnitude, as approximate polynomial degree is a lower bound on quantum query complexity [BBCMW01]. These results essentially answer in the affirmative a conjecture of O'Donnell and Servedio [O'DS03] that the sign degree--the minimal degree of a polynomial that agrees in sign with a function on the Boolean cube--of every formula of size n is O(sqrt(n)). In this note, we show that sign degree is super-multiplicative under function composition. Combining this result with the above-mentioned upper bounds on the quantum query complexity of formulas allows the removal of logarithmic factors to show that the sign degree of every size n formula is at most sqrt(n).
Leedham, G, Ma, Y & Blumenstein, M 2009, 'Handwritten shorthand and its future potential for fast mobile text entry', International Journal of Pattern Recognition and Artificial Intelligence, vol. 23, no. 5, pp. 1031-1051.
View/Download from: Publisher's site
View description>>
Handwritten shorthand systems were devised to enable writers to record information on paper at fast speeds, ideally at the speed of speech. While they have been in existence for many years, it is only since the 17th century that widespread usage appeared. Several shorthand systems flourished in the first half of the 20th century until the introduction and widespread use of electronic recording and dictation machines in the 1970s. Since then, shorthand usage has been in rapid decline, but has not yet become a lost skill. Pitman shorthand has been shown to possess unique advantages as a means of fast text entry which is particularly applicable to hand-held devices in mobile environments. This paper presents progress and critical research issues for a Pitman/Renqun Shorthand Online Recognition System. Recognition and transcription experiments are reported which indicate that a correct recognition and transcription rate of around 90% is currently possible.
Ramelow, S, Ratschbacher, L, Fedrizzi, A, Langford, NK & Zeilinger, A 2009, 'Discrete Tunable Color Entanglement', Physical Review Letters, vol. 103, no. 25.
View/Download from: Publisher's site
Saito, S, Tilma, T, Devitt, SJ, Nemoto, K & Semba, K 2009, 'Experimentally realizable controlled NOT gate in a flux qubit/resonator system', Physical Review B, vol. 80, no. 22.
View/Download from: Publisher's site
Shepherd, D & Bremner, MJ 2009, 'Temporally unstructured quantum computation', Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, vol. 465, no. 2105, pp. 1413-1439.
View/Download from: Publisher's site
View description>>
We examine theoretic architectures and an abstract model for a restricted class of quantum computation, called here temporally unstructured ('instantaneous') quantum computation because it allows for essentially no temporal structure within the quantum dynamics. Using the theory of binary matroids, we argue that the paradigm is rich enough to enable sampling from probability distributions that cannot, classically, be sampled efficiently and accurately. This paradigm also admits simple interactive proof games that may convince a sceptic of the existence of truly quantum effects. Furthermore, these effects can be created using significantly fewer qubits than are required for running Shor's algorithm. This journal is © 2009 The Royal Society.
Tomamichel, M, Colbeck, R & Renner, R 2009, 'Duality Between Smooth Min- and Max-Entropies', IEEE Transactions on Information Theory, vol. 56, no. 9, pp. 4674-4681.
View/Download from: Publisher's site
View description>>
In classical and quantum information theory, operational quantities such as the amount of randomness that can be extracted from a given source or the amount of space needed to store given data are normally characterized by one of two entropy measures, called smooth min-entropy and smooth max-entropy, respectively. While both entropies are equal to the von Neumann entropy in certain special cases (e.g., asymptotically, for many independent repetitions of the given data), their values can differ arbitrarily in the general case. In this work, a recently discovered duality relation between (non-smooth) min- and max-entropies is extended to the smooth case. More precisely, it is shown that the smooth min-entropy of a system A conditioned on a system B equals the negative of the smooth max-entropy of A conditioned on a purifying system C. This result immediately implies that certain operational quantities (such as the amount of compression and the amount of randomness that can be extracted from given data) are related. Such relations may, for example, have applications in cryptographic security proofs.
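For intuition about the two measures, consider the simplest setting: non-smooth, unconditional entropies of a classical distribution. There the min-entropy is set by the largest probability and the max-entropy is the Rényi entropy of order 1/2, and the two always bracket the Shannon (von Neumann) entropy. The check below is only an illustration under those simplifying assumptions, not the conditional smooth quantities the paper analyses:

```python
import math

def min_entropy(p):
    """H_min = -log2(max_i p_i): the randomness of a single worst-case guess."""
    return -math.log2(max(p))

def max_entropy(p):
    """H_max = 2 * log2(sum_i sqrt(p_i)): Renyi entropy of order 1/2."""
    return 2 * math.log2(sum(math.sqrt(pi) for pi in p))

def shannon_entropy(p):
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

dist = [0.5, 0.25, 0.125, 0.125]
hmin, h, hmax = min_entropy(dist), shannon_entropy(dist), max_entropy(dist)
# Renyi entropies are non-increasing in their order, so H_min <= H <= H_max.
print(hmin <= h <= hmax)
```

For this distribution hmin = 1, h = 1.75 and hmax is roughly 1.87; the three coincide only for a uniform distribution, matching the abstract's remark that the smooth quantities converge to the von Neumann entropy in special cases.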
Ying, M, Feng, Y, Duan, R & Ji, Z 2009, 'An Algebra of Quantum Processes', ACM Transactions on Computational Logic, vol. 10, no. 3, pp. 1-36.
View/Download from: Publisher's site
View description>>
We introduce an algebra qCCS of pure quantum processes in which communications by moving quantum states physically are allowed and computations are modeled by super-operators, but no classical data is explicitly involved. An operational semantics of qCCS is presented in terms of (nonprobabilistic) labeled transition systems. Strong bisimulation between processes modeled in qCCS is defined, and its fundamental algebraic properties are established, including uniqueness of the solutions of recursive equations. To model sequential computation in qCCS, a reduction relation between processes is defined. By combining reduction relation and strong bisimulation we introduce the notion of strong reduction-bisimulation, which is a device for observing interaction of computation and communication in quantum systems. Finally, a notion of strong approximate bisimulation (equivalently, strong bisimulation distance) and its reduction counterpart are introduced. It is proved that both approximate bisimilarity and approximate reduction-bisimilarity are preserved by various constructors of quantum processes. This provides us with a formal tool for observing robustness of quantum processes against inaccuracy in the implementation of its elementary gates. © 2009 ACM.
Alavi, A, Cavanagh, B, Tuxworth, G, Meedeniya, A, Mackay-Sim, A & Blumenstein, M 2009, 'Automated classification of dopaminergic neurons in the rodent brain', 2009 International Joint Conference on Neural Networks, 2009 International Joint Conference on Neural Networks (IJCNN 2009 - Atlanta), IEEE, pp. 81-88.
View/Download from: Publisher's site
View description>>
Accurate morphological characterization of the multiple neuronal classes of the brain would facilitate the elucidation of brain function and the functional changes that underlie neurological disorders such as Parkinson's disease or Schizophrenia. Manual morphological analysis is very time-consuming and suffers from a lack of accuracy because some cell characteristics are not readily quantified. This paper presents an investigation in automating the classification of dopaminergic neurons located in the brainstem of the rodent, a region critical to the regulation of motor behaviour and implicated in multiple neurological disorders including Parkinson's disease. Using a Carl Zeiss Axioimager Z1 microscope with Apotome, salient information was obtained from images of dopaminergic neurons using a structural feature extraction technique. A data set of 100 images of neurons was generated and a set of 17 features was used to describe their morphology. In order to identify differences between neurons, 2-dimensional and 3-dimensional image representations were analyzed. This paper compares the performance of three popular classification methods in bioimage classification (Support Vector Machines (SVMs), Back Propagation Neural Networks (BPNNs) and Multinomial Logistic Regression (MLR)), and the results show a significant difference between machine classification (with 97% accuracy) and human expert based classification (72% accuracy). © 2009 IEEE.
Bogdanov, A & Qiao, Y 2009, 'On the Security of Goldreich’s One-Way Function', Approximation, Randomization, and Combinatorial Optimization, 12th International Workshop on Approximation Algorithms for Combinatorial Optimization Problems/13th International Workshop on Randomization and Computation, Springer Berlin Heidelberg, Berkeley, CA, pp. 392-405.
View/Download from: Publisher's site
Bremner, MJ 2009, 'Are random pure states useful for quantum computation?', Institute of Mathematics and its Applications Conference on Quantum Computing and Complexity of Quantum Simulation, London.
View description>>
Quantum computing represents the prodigiously fertile union of quantum physics with the theory of computation and especially issues of computational complexity. It is known that quantum processes can offer solutions to some information processing tasks that are exponentially more efficient than any known classical methods. Perhaps the most celebrated example is Shor's 1994 quantum algorithm for integer factorisation. In recent years there has been a surge of activity in our understanding of quantum computational power and its prospective applicability and limitations. A variety of problems in diverse areas of mathematics has been identified (so-called BQP-complete problems) that have efficient quantum algorithms and also embody the full power of efficient quantum computation. In quantum many body physics (including the study of quantum circuits, of local Hamiltonians, and of further formalisms such as measurement based computing) some problems have been shown, surprisingly, to admit efficient classical solution while others (e.g. certain ground state properties of local Hamiltonians) are likely to be computationally intractable, having been shown to be so-called QMA-complete. Quantum entanglement is often regarded as an essential ingredient in these considerations and there has been considerable development in understanding its scaling behaviour in many body systems. This conference is devoted to recent theoretical developments in these areas and related issues. Invited speakers will be requested to include overview material (in addition to recent research) with the aim of making the essential ideas of these important developments accessible to a broader audience of QIP researchers.
Bremner, MJ 2009, 'Instantaneous quantum computation', Institute of Mathematics and its Applications Conference on Quantum Computing and Complexity of Quantum Simulation, London.
Bremner, MJ 2009, 'Instantaneous quantum computation', Twelfth Workshop on Quantum Information Processing, Santa Fe, New Mexico, USA.
View description>>
Fifteen years ago, Shor's efficient quantum algorithms for factoring integers and evaluating discrete logarithms launched the field of quantum information processing (QIP) into the public consciousness. QIP is now one of the most active and fastest-growing research areas in computer science and physics, spanning topics such as quantum computation, quantum communication, and quantum cryptography. QIP 2009 is the twelfth in a series of international workshops dedicated to disseminating recent theoretical advances in this field.
Bremner, MJ 2009, 'Most quantum states are useless for measurement-based quantum computation', Twelfth Workshop on Quantum Information Processing, Santa Fe, New Mexico, USA.
Devetak, I, Brun, TA & Hsieh, M-H 2009, 'Entanglement-Assisted Quantum Error-Correcting Codes', New Trends in Mathematical Physics: Selected Contributions of the XVth International Congress on Mathematical Physics, International Congress on Mathematical Physics, Springer Netherlands, Rio de Janeiro, Brazil, pp. 161-172.
View/Download from: Publisher's site
View description>>
We develop the theory of entanglement-assisted quantum error correcting codes (EAQECCs), a generalization of the stabilizer formalism to the setting in which the sender and receiver have access to pre-shared entanglement. Conventional stabilizer codes are equivalent to self-orthogonal symplectic codes. In contrast, EAQECCs do not require self-orthogonality, which greatly simplifies their construction. We show how any classical quaternary block code can be made into an EAQECC. Furthermore, the error-correcting power of the quantum codes follows directly from the power of the classical codes.
Hsieh, M, Yen, W & Hsu, L 2009, 'Performance of Entanglement-assisted Quantum LDPC Codes', The 21st Quantum Information Technology Symposium (QIT21), The University of Electro-Communications, Japan.
Hsieh, M-H & Wilde, MM 2009, 'Optimal Trading of Classical Communication, Quantum Communication, and Entanglement', Theory of Quantum Computation, Communication, and Cryptography, 4th Workshop on Theory of Quantum Computation, Communication and Cryptography, Springer Berlin Heidelberg, Waterloo, Canada, pp. 85-93.
View/Download from: Publisher's site
Lee, T & Shraibman, A 2009, 'Disjointness is Hard in the Multiparty Number-on-the-Forehead Model', computational complexity, Springer Science and Business Media LLC, pp. 309-336.
View/Download from: Publisher's site
Lee, T, Schechtman, G & Shraibman, A 2009, 'Lower Bounds on Quantum Multiparty Communication Complexity', 2009 24th Annual IEEE Conference on Computational Complexity, 2009 24th Annual IEEE Conference on Computational Complexity (CCC), IEEE.
View/Download from: Publisher's site
Liu, W, Li, S & Renz, J 2009, 'Combining RCC-8 with qualitative direction calculi: Algorithms and complexity', IJCAI International Joint Conference on Artificial Intelligence, International Joint Conference on Artificial Intelligence, AAAI Press, Pasadena, California, USA, pp. 854-859.
View description>>
Increasing the expressiveness of qualitative spatial calculi is an essential step towards meeting the requirements of applications. This can be achieved by combining existing calculi in a way that we can express spatial information using relations from both calculi. The great challenge is to develop reasoning algorithms that are correct and complete when reasoning over the combined information. Previous work has mainly studied cases where the interaction between the combined calculi was small, or where one of the two calculi was very simple. In this paper we tackle the important combination of topological and directional information for extended spatial objects. We combine some of the best known calculi in qualitative spatial reasoning (QSR), the RCC8 algebra for representing topological information, and the Rectangle Algebra (RA) and the Cardinal Direction Calculus (CDC) for directional information. Although CDC is more expressive than RA, reasoning with CDC is of the same order as reasoning with RA. We show that reasoning with basic RCC8 and basic RA relations is in P, but reasoning with basic RCC8 and basic CDC relations is NP-Complete.
Nguyen, V, Blumenstein, M & Leedham, G 2009, 'Global Features for the Off-Line Signature Verification Problem', 2009 10th International Conference on Document Analysis and Recognition, 2009 10th International Conference on Document Analysis and Recognition, IEEE, pp. 1300-1304.
View/Download from: Publisher's site
View description>>
Global features based on the boundary of a signature and its projections are described for enhancing the process of automated signature verification. The first global feature is derived from the total 'energy' a writer uses to create their signature. The second feature employs information from the vertical and horizontal projections of a signature, focusing on the proportion of the distance between key strokes in the image and the height/width of the signature. The combination of these features with the Modified Direction Feature (MDF) and the ratio feature showed promising results for the off-line signature verification problem. When trained using 12 genuine specimens and 400 random forgeries taken from a publicly available database, the Support Vector Machine (SVM) classifier obtained an average error rate (AER) of 17.25%. The false acceptance rate (FAR) for random forgeries was also kept as low as 0.08%. © 2009 IEEE.
Son, JB, Lee, J, Blumenstein, M, Loo, Y-C, Guan, H & Panuwatwanich, K 2009, 'Generating Historical Condition Ratings for the Reliable Prediction of Bridge Deteriorations', IABSE Symposium, Bangkok 2009: Sustainable Infrastructure - Environment Friendly, Safe and Resource Efficient, IABSE Symposium, Bangkok 2009: Sustainable Infrastructure - Environment Friendly, Safe and Resource Efficient, International Association for Bridge and Structural Engineering (IABSE).
View/Download from: Publisher's site
View description>>
Bridge Management Systems (BMSs) have been developed since the early 1990s as a decision support system (DSS) for effective Maintenance, Repair and Rehabilitation (MR&R) activities in a large bridge network. Historical condition ratings obtained from biennial bridge inspections are major resources for predicting future bridge deteriorations via BMSs. However, available historical condition ratings are very limited in all bridge agencies. This constitutes the major barrier to reliably predicting future structural performance. To alleviate this problem, the Backward Prediction Model (BPM) technique for generating the missing historical condition ratings has been developed, and its reliability has been verified using existing condition ratings available from the Maryland Department of Transportation, USA. The function of the BPM is to establish the correlations between the known condition ratings and non-bridge factors including climate, traffic volumes and population growth. Such correlations can then be used to obtain the bridge condition ratings of the missing years. Based on these generated datasets, the currently available bridge deterioration model can be used to predict future bridge conditions. The existing 4 National Bridge Inventory and 9 BPM-generated historical condition ratings are used as input data to compare the prediction accuracy using deterministic bridge deterioration models. The comparison results show that prediction error decreases as more historical data become available. This suggests that the BPM can be used to generate additional historical condition ratings, which is essential for bridge deterioration models to achieve more accurate prediction results. However, there are still significant limitations identified in the current bridge deterioration models. Hence, further research is necessary to improve the prediction accuracy of bridge deterioration models.
Wilde, MM & Hsieh, M 2009, 'Superactive entanglement generation with a zero-capacity quantum channel and a non-distillable shared state', The 9th Asian Conference on Quantum Information Science (AQIS09), Nanjing, China.
Wilde, MM & Hsieh, M-H 2010, 'Entanglement generation with a quantum channel and a shared state', Proceedings of the 2010 IEEE International Symposium on Information Theory, IEEE, Austin, Texas, USA, pp. 2713-2717.
View/Download from: Publisher's site
View description>>
We introduce a new protocol, the channel-state coding protocol, to quantum Shannon theory. This protocol generates entanglement between a sender and receiver by coding for a noisy quantum channel with the aid of a noisy shared state. The mother and father protocols arise as special cases of the channel-state coding protocol, where the channel is noiseless or the state is a noiseless maximally entangled state, respectively. The channel-state coding protocol paves the way for formulating entanglement-assisted quantum error-correcting codes that are robust to noise in shared entanglement. Finally, the channel-state coding protocol leads to a Smith-Yard superactivation, where we can generate entanglement using a zero-capacity erasure channel and a non-distillable bound entangled state.