Agius, A, Morelato, M, Moret, S, Chadwick, S, Jones, K, Epple, R, Brown, J & Roux, C 2018, 'Dataset of coded handwriting features for use in statistical modelling', Data in Brief, vol. 16, pp. 1010-1024.
View/Download from: Publisher's site
View description>>
© 2017 The Authors. The data presented here relate to the article titled “Using handwriting to infer a writer's country of origin for forensic intelligence purposes” (Agius et al., 2017) [1]. This article reports original writer, spatial and construction characteristic data for thirty-seven English Australian writers and thirty-seven Vietnamese writers. All of these characteristics were coded and recorded in Microsoft Excel 2013 (version 15.31). The construction characteristics were extracted from only seven characters: ‘g’, ‘h’, ‘th’, ‘M’, ‘0’, ‘7’ and ‘9’. The coded format of the writer, spatial and construction characteristics is made available in this Data in Brief to allow others to perform statistical analyses and modelling to investigate whether there is a relationship between the handwriting features and the nationality of the writer, whether the two nationalities can be differentiated, and to employ mathematical techniques capable of characterising the features extracted from each participant.
Agius, A, Morelato, M, Moret, S, Chadwick, S, Jones, K, Epple, R, Brown, J & Roux, C 2018, 'Using handwriting to infer a writer’s country of origin for forensic intelligence purposes', Forensic Science International, vol. 282, pp. 144-156.
View/Download from: Publisher's site
View description>>
© 2017 Elsevier B.V. Forensic science has traditionally focused the majority of its resources and objectives towards addressing Court-related questions. However, this view restricts the contribution of forensic science to one process and results in a loss of information as the investigative and intelligence roles are largely neglected. A forensic science discipline suffering from this imbalance is handwriting examination, which may be characterised as a time consuming and subjective process that is mostly carried out towards the end of the investigation for the purpose of judicial proceedings. Individual and habitual characteristics are the major handwriting features exploited, however alternate information concerning the author's native language could potentially be used as a key element in an intelligence framework. This research focussed on the detection of characteristics that differentiate Vietnamese and English Australian writers based on their English handwriting. The study began with the extraction of handwriting characteristics from the writing of people from the two populations. The data was analysed using a logistic regression model and a classification and regression tree (CRT). Each recognised four class characteristics that were capable of distinguishing between the two nationalities. The logistic regression and CRT models were both capable of correctly predicting 93% of cases. Their predictive capabilities were then tested and supported using blind exemplars in order to mirror casework settings. It appeared that when using their respective class characteristics, the two models were capable of differentiating English Australians from Vietnamese in the data set. This proof of concept research demonstrated the plausibility of exploiting this additional information from a handwriting trace and taking advantage of it in an intelligence-led framework.
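The two-model comparison described in this abstract rests on standard binary classification. As a purely illustrative sketch (using synthetic binary "class characteristic" data, not the paper's actual handwriting features or fitted coefficients), a minimal logistic regression trained by gradient descent looks like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical binary class characteristics (0/1) for two writer groups:
# group 0 tends to lack each feature, group 1 tends to exhibit it.
X = np.vstack([rng.binomial(1, 0.2, (40, 4)),   # class 0
               rng.binomial(1, 0.8, (40, 4))])  # class 1
y = np.array([0] * 40 + [1] * 40)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Fit weights by simple gradient descent on the logistic log-loss.
Xb = np.hstack([np.ones((X.shape[0], 1)), X])   # prepend intercept column
w = np.zeros(Xb.shape[1])
for _ in range(2000):
    p = sigmoid(Xb @ w)
    w -= 0.1 * Xb.T @ (p - y) / len(y)

pred = (sigmoid(Xb @ w) >= 0.5).astype(int)
accuracy = (pred == y).mean()
print(round(accuracy, 2))
```

The paper's reported 93% correct prediction refers to its own feature set and models; the sketch above only shows the mechanics of fitting and scoring such a classifier.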
Al-Asfi, M, McNevin, D, Mehta, B, Power, D, Gahan, ME & Daniel, R 2018, 'Assessment of the Precision ID Ancestry panel', International Journal of Legal Medicine, vol. 132, no. 6, pp. 1581-1594.
View/Download from: Publisher's site
View description>>
© 2018 Springer-Verlag GmbH Germany, part of Springer Nature. The ability to provide accurate DNA-based forensic intelligence requires analysis of multiple DNA markers to predict the biogeographical ancestry (BGA) and externally visible characteristics (EVCs) of the donor of biological evidence. Massively parallel sequencing (MPS) enables the analysis of hundreds of DNA markers in multiple samples simultaneously, increasing the value of the intelligence provided to forensic investigators while reducing the depletion of evidential material resulting from multiple analyses. The Precision ID Ancestry Panel (formerly the HID Ion AmpliSeq™ Ancestry Panel; Thermo Fisher Scientific, TFS) consists of 165 autosomal SNPs selected to infer BGA. Forensic validation criteria were applied to 95 samples using this panel to assess sensitivity (1 ng–15 pg), reproducibility (inter- and intra-run variability) and effects of compromised and forensic casework type samples (artificially degraded and inhibited, mixed source and aged blood and bone samples). BGA prediction accuracy was assessed using samples from individuals who self-declared their ancestry as being from single populations of origin (n = 36) or from multiple populations of origin (n = 14). Sequencing was conducted on Ion 318™ chips (TFS) on the Ion PGM™ System (TFS). HID SNP Genotyper v4.3.1 software (TFS) was used to perform BGA predictions based on admixture proportions (continental level) and likelihood estimates (sub-population level). BGA prediction was accurate at DNA template amounts of 125 pg and 30 pg using 21 and 25 PCR cycles respectively. HID SNP Genotyper continental level BGA assignments were concordant with BGAs for self-declared East Asian, African, European and South Asian individuals. Compromised, mixed source and admixed samples, in addition to sub-population level prediction, require more extensive analysis.
Benson, N, Oliveria Dos Santos, R, Griffiths, K, Cole, N, Doble, P, Roux, C & Blanes, L 2018, 'Erratum to “The development of a stabbing machine for forensic textile damage analysis” [FSI (2017) 132–139]', Forensic Science International, vol. 285, p. 161.
View/Download from: Publisher's site
Berger, B, Berger, C, Heinrich, J, Niederstätter, H, Hecht, W, Hellmann, A, Rohleder, U, Schleenbecker, U, Morf, N, Freire-Aradas, A, McNevin, D, Phillips, C & Parson, W 2018, 'Dog breed affiliation with a forensically validated canine STR set', Forensic Science International: Genetics, vol. 37, pp. 126-134.
View/Download from: Publisher's site
View description>>
We tested a panel of 13 highly polymorphic canine short tandem repeat (STR) markers for dog breed assignment using 392 dog samples from the 23 most popular breeds in Austria, Germany, and Switzerland. This STR panel had originally been selected for canine identification. The dog breeds sampled in this study featured a population frequency ≥1% and accounted for nearly 57% of the entire pedigree dog population in these three countries. Breed selection was based on a survey comprising records for nearly 1.9 million purebred dogs belonging to more than 500 different breeds. To derive breed membership from STR genotypes, a range of algorithms were used. These methods included discriminant analysis of principal components (DAPC), STRUCTURE, GeneClass2, and the adegenet package for R. STRUCTURE analyses suggested 21 distinct genetic clusters. Differentiation between most breeds was clearly discernable. Fourteen of 23 breeds (61%) exhibited maximum mean cluster membership proportions of more than 0.70 with a highest value of 0.90 found for Cavalier King Charles Spaniels. Dogs of only 6 breeds (26%) failed to consistently show only one major cluster. The DAPC method yielded the best assignment results in all 23 declared breeds with 97.5% assignment success. The frequency-based assignment test also provided a high success rate of 87%. These results indicate the potential viability of dog breed prediction using a well-established and sensitive set of 13 canine STR markers intended for forensic routine use.
Braun, M, Kirkup, L & Chadwick, S 2018, 'The impact of inquiry orientation and other elements of cultural framework on student engagement in first year laboratory programs', International Journal of Innovation in Science and Mathematics Education, vol. 26, no. 4, pp. 30-48.
View description>>
Inquiry-oriented approaches to learning have gradually entered science laboratory programs, aiming to deliver an authentic experience of doing science, enhance student engagement with the material, and bring greater emphasis on generic skills underpinning graduate attributes. Although such approaches have demonstrated pedagogical advantages and improved student engagement, it is not clear how the advantages should be weighted against other elements of what may be regarded as the laboratory program's cultural framework. We analysed two large-enrolment introductory tertiary programs: physics and chemistry at the University of Technology Sydney. The programs differed in the level of inquiry orientation but also in approaches to design, logistics and relevancy. We found that, based on student survey responses, the putative advantages of a deeper inquiry orientation in the physics laboratory were insufficient to compensate for the apparent advantages arising from the other elements of the cultural framework in the chemistry laboratory.
Casey, E, Ribaux, O & Roux, C 2018, 'Digital transformations and the viability of forensic science laboratories: Crisis-opportunity through decentralisation', Forensic Science International, vol. 289, pp. e24-e25.
View/Download from: Publisher's site
Chadwick, S, de la Hunty, M & Baker, A 2018, 'Developing Awareness of Professional Behaviors and Skills in the First-Year Chemistry Laboratory', Journal of Chemical Education, vol. 95, no. 6, pp. 947-953.
View/Download from: Publisher's site
View description>>
Copyright © 2018 American Chemical Society and Division of Chemical Education, Inc. Students in first-year chemistry classes come from a variety of backgrounds, with many students unaware of the qualities and behaviors of a professional scientist. Throughout their degree, students will gradually develop their cognitive skills, but they may not be adequately taught or assessed on their professional behavioral skills as a scientist until late in the undergraduate course. By assessing the professional skills of students in first-year chemistry practical classes, this innovation commenced the development of students' professional identity from the beginning of their university experience. The skills that were assessed included preparedness, cooperation in the group activities, working safely in the laboratory, and time management. By engaging students with professional behaviors and what it means to be a scientist during their first semester, students can potentially carry this through their whole undergraduate degree. This task was received positively by students and staff with over 50% of students believing it increased their confidence in the laboratory. Staff also saw a significant improvement in student behavior and engagement because of this task.
Chadwick, S, Moret, S, Jayashanka, N, Lennard, C, Spindler, X & Roux, C 2018, 'Investigation of some of the factors influencing fingermark detection', Forensic Science International, vol. 289, pp. 381-389.
View/Download from: Publisher's site
View description>>
© 2018 Elsevier B.V. The primary aims of fingermark detection research are to improve the quality and increase the rate of detection of identifiable impressions. This is usually performed through the development of new methods and technologies to provide alternatives to or improve current procedures. While research of this nature is important to pursue, it fails to address the underlying question related to the factors that affect the detection of a latent fingermark. There has been significant research that has examined the differences between techniques, donors and fingermark age, as well as the composition of latent fingermarks. However, these studies tend not to focus on determining how these factors influence the quality of the developed mark. This study involved the development and evaluation of over 14,000 natural fingermarks deposited on a variety of surfaces to examine the effect of substrate, age, donor variability (both inter- and intra-), depletions and type of finger on fingermark development. Fingermarks were deposited on four substrates (two non-porous and two porous) and developed with either indanedione-zinc (IND-Zn) or cyanoacrylate followed by rhodamine 6G staining (CA + R6G). Three independent assessors graded each mark on the quality of development using an absolute scale proposed by the UK Centre for Applied Science and Technology (CAST). The data generated from these assessments were then analysed for trends or other useful insights. The results from this work reaffirm that individual substrate characteristics (and the choice of development technique) play a significant role in determining the number and quality of marks developed. It was found that fingermarks were more likely to be detected on porous substrates and to also be of a higher quality than on non-porous substrates. The effect of fingermark donor variability was also explored, with significant differences observed between donors and within donors. This research shows that current detectio...
Cheung, EYY, Gahan, ME & McNevin, D 2018, 'Prediction of biogeographical ancestry in admixed individuals', Forensic Science International: Genetics, vol. 36, pp. 104-111.
View/Download from: Publisher's site
Cheung, EYY, Gahan, ME & McNevin, D 2018, 'Predictive DNA analysis for biogeographical ancestry', Australian Journal of Forensic Sciences, vol. 50, no. 6, pp. 1-8.
View/Download from: Publisher's site
View description>>
© 2018, © 2018 Australian Academy of Forensic Sciences. Establishment of national DNA databases in Australia and overseas has increased the number of criminal convictions, yet a high volume of serious crime cases remain with no suspect profile nor any DNA database matches. In these circumstances prediction of biogeographical ancestry (BGA) and externally visible characteristics can assist by providing forensic intelligence in conjunction with, or in place of, eyewitness testimonies. To predict the BGA of an individual requires: genetic markers selected for their ability to differentiate between BGAs; representative BGA reference populations; and a prediction algorithm (‘classifier’) that predicts the BGA of an unknown individual based on genetic markers in the reference populations. The human genome contains autosomal ancestry informative markers that are easily harvested from publicly accessible collections of genotypes with associated ancestry information. A number of classification methods are available including Bayesian approaches and distance-based algorithms. BGA is likely to be continuous rather than discrete and some methods are inappropriate for the prediction of admixed BGA. As predictive services become available to the public and private sectors, there is a risk of results being misinterpreted if an inappropriate tool is applied. Understanding the underlying marker sets, reference populations and classification algorithms is required to prevent ill-informed predictions.
de la Hunty, M, Moret, S, Chadwick, S, Lennard, C, Spindler, X & Roux, C 2018, 'An effective Physical Developer (PD) method for use in Australian laboratories', Australian Journal of Forensic Sciences, vol. 50, no. 6, pp. 1-6.
View/Download from: Publisher's site
View description>>
© 2018, © 2018 Australian Academy of Forensic Sciences. Physical Developer (PD) is an underutilized technique for the development of latent marks on porous surfaces that have been wet, or as a subsequent technique in a development sequence. It is a multistep technique that works by selectively reducing silver ions to silver metal at nucleating sites in fingermark residue. Its use is associated with a plethora of issues, largely surrounding the inherent instability of the working solution. Recently, one of the components of the working solution, Synperonic N, has ceased production, and the recommended replacement is Tween 20. This article addresses factors during PD processing using Tween 20, other than reagent formulations that should be considered when using the technique.
El-Sayed, H, Sankar, S, Daraghmi, Y-A, Tiwari, P, Rattagan, E, Mohanty, M, Puthal, D & Prasad, M 2018, 'Accurate Traffic Flow Prediction in Heterogeneous Vehicular Networks in an Intelligent Transport System Using a Supervised Non-Parametric Classifier', Sensors, vol. 18, no. 6, pp. 1696-1696.
View/Download from: Publisher's site
View description>>
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. Heterogeneous vehicular networks (HETVNETs) evolve from vehicular ad hoc networks (VANETs), which allow vehicles to always be connected so as to obtain safety services within intelligent transportation systems (ITSs). The services and data provided by HETVNETs should be neither interrupted nor delayed. Therefore, Quality of Service (QoS) improvement of HETVNETs is one of the topics attracting the attention of researchers and the manufacturing community. Several methodologies and frameworks have been devised by researchers to address QoS-prediction service issues. In this paper, to improve QoS, we evaluate various traffic characteristics of HETVNETs and propose a new supervised learning model to capture knowledge on all possible traffic patterns. This model is a refinement of support vector machine (SVM) kernels with a radial basis function (RBF). The proposed model produces better results than SVMs, and outperforms other prediction methods used in a traffic context, as it has lower computational complexity and higher prediction accuracy.
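The model described here refines SVM kernels based on the Gaussian radial basis function. A minimal sketch of how an RBF kernel matrix is computed (illustrative points and gamma value only, not the paper's traffic features or tuned parameters):

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian RBF kernel matrix: K[i, j] = exp(-gamma * ||A[i] - B[j]||^2)."""
    sq_dist = (np.sum(A**2, axis=1)[:, None]
               + np.sum(B**2, axis=1)[None, :]
               - 2 * A @ B.T)
    return np.exp(-gamma * sq_dist)

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
K = rbf_kernel(X, X, gamma=0.5)
print(K.round(3))
# Diagonal entries are 1 (each point compared with itself);
# off-diagonal entries decay with squared Euclidean distance.
```

The kernel maps traffic-pattern feature vectors into a space where similarity decays smoothly with distance, which is what lets the SVM fit non-linear decision boundaries.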
El-Sayed, H, Sankar, S, Prasad, M, Puthal, D, Gupta, A, Mohanty, M & Lin, C-T 2018, 'Edge of Things: The Big Picture on the Integration of Edge, IoT and the Cloud in a Distributed Computing Environment', IEEE Access, vol. 6, pp. 1706-1717.
View/Download from: Publisher's site
View description>>
© 2013 IEEE. A centralized infrastructure system carries out existing data analytics and decision-making processes from our current highly virtualized platform of wireless networks and the Internet of Things (IoT) applications. There is a high possibility that these existing methods will encounter more challenges and issues in relation to network dynamics, resulting in a high overhead in the network response time, leading to latency and traffic. In order to avoid these problems in the network and achieve an optimum level of resource utilization, a new paradigm called edge computing (EC) is proposed to pave the way for the evolution of new age applications and services. With the integration of EC, the processing capabilities are pushed to the edge of network devices such as smart phones, sensor nodes, wearables, and on-board units, where data analytics and knowledge generation are performed, which removes the necessity for a centralized system. Many IoT applications, such as smart cities, the smart grid, smart traffic lights, and smart vehicles, are rapidly upgrading their applications with EC, significantly improving response time as well as conserving network resources. Irrespective of the fact that EC shifts the workload from a centralized cloud to the edge, the analogy between EC and the cloud pertaining to factors such as resource management and computation optimization is still open to research studies. Hence, this paper aims to validate the efficiency and resourcefulness of EC. We extensively survey the edge systems and present a comparative study of cloud computing systems. After analyzing the different network properties in the system, the results show that EC systems perform better than cloud computing systems. Finally, the research challenges in implementing an EC system and future research directions are discussed.
French, J, Arscott, E, Morgan, R & Meakin, G 2018, 'Reply to letter to the editor: Response to “A study of the perception of verbal expressions of the strength of evidence”', Science & Justice, vol. 58, no. 4, pp. 299-299.
View/Download from: Publisher's site
Goodwin, C, Higgins, D, Tobe, SS, Austin, J, Wotherspoon, A, Gahan, ME & McNevin, D 2018, 'Singleplex quantitative real-time PCR for the assessment of human mitochondrial DNA quantity and quality', Forensic Science, Medicine and Pathology, vol. 14, no. 1, pp. 70-75.
View/Download from: Publisher's site
View description>>
© 2018, Springer Science+Business Media, LLC, part of Springer Nature. Mitochondrial DNA (mtDNA) can provide a means for forensic identity testing when genotyping of nuclear DNA (nuDNA) targets is not possible due to degradation or lack of template. For degraded samples, an indication of the quantity and quality of mtDNA is essential to allow selection of appropriately sized targets for hypervariable region (HVR) analysis, which may conserve sample and resources. Three human-specific mtDNA targets of increasing length (86, 190 and 452 base pairs) were amplified by singleplex quantitative real-time PCR (qPCR), capable of providing an index of mtDNA degradation from fragment length information. Quantification was achieved by preparation of a standard curve for each target, using a purified mtDNA standard containing all three targets of interest, which produced a linear, accurate and precise result from 1 × 10⁸ to 10 copies. These novel assays demonstrated excellent sensitivity, specificity and reproducibility in line with the minimum information for qPCR experiments (MIQE) guidelines. Further, a separate inhibition control reaction was included to guide sample clean-up and ensure the validity of degradation assays. This protocol assists the selection and analysis of appropriately sized targets to maximize the chance of obtaining an informative result in downstream assays like sequencing.
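Standard-curve quantification of the kind described here reduces to a linear regression of Cq against log₁₀ copy number; the slope also gives the amplification efficiency (a slope near −3.32 corresponds to 100% efficiency, i.e. perfect doubling per cycle). A sketch with hypothetical dilution-series values (not the paper's calibration data):

```python
import numpy as np

# Hypothetical dilution series: known copy numbers and measured Cq values.
copies = np.array([1e8, 1e7, 1e6, 1e5, 1e4, 1e3, 1e2, 1e1])
cq     = np.array([10.0, 13.3, 16.6, 19.9, 23.2, 26.5, 29.8, 33.1])

# Linear fit: Cq = slope * log10(copies) + intercept
slope, intercept = np.polyfit(np.log10(copies), cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0   # 1.0 == 100% efficiency

def quantify(sample_cq):
    """Estimate the copy number of an unknown sample from its Cq."""
    return 10 ** ((sample_cq - intercept) / slope)

print(round(slope, 2), round(efficiency, 2))
print(f"{quantify(18.0):.3g}")
```

Running the same regression per target length is what yields the degradation index: a disproportionate copy-number drop for the longer amplicons indicates fragmentation.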
Khuu, A, Chadwick, S, Moret, S, Spindler, X, Gunn, P & Roux, C 2018, 'Impact of one-step luminescent cyanoacrylate treatment on subsequent DNA analysis', Forensic Science International, vol. 286, pp. 1-7.
View/Download from: Publisher's site
View description>>
© 2018 Elsevier B.V. Fingermarks can be exploited for both their ridge detail and touch DNA. One-step luminescent cyanoacrylate (CA) fuming techniques used for fingermark enhancement, such as PolyCyano UV (Foster + Freeman Ltd) and Lumicyano™ (Crime Science Technology), claim to be compatible with DNA analysis as they reduce the need for post-staining to increase contrast of the developed fingermark. The aim of this study was to determine the impact that these one-step luminescent cyanoacrylates have on DNA analysis and how they compare to conventional CA techniques. Four donors each deposited five sets of natural fingermarks; a known amount of washed saliva cells was then dispensed onto half of each set. Each set was treated with either a conventional CA technique or a one-step luminescent CA technique prior to collection and processing of DNA, with one set left as a non-fumed control. It was found that DNA was still recoverable and detectable following each of the treatments. Lumicyano™ had a similar impact on DNA profiles as conventional CA fuming with post-stain; however, the degradation effect of PolyCyano UV on DNA was greater than that of the conventional treatments. For small quantities of DNA, such as touch DNA, the use of PolyCyano UV to enhance fingermarks may impact subsequent DNA analysis by causing allele drop-out at larger fragment sizes.
Liu, T, Zhang, W, McLean, P, Ueland, M, Forbes, SL & Su, SW 2018, 'Electronic Nose-Based Odor Classification using Genetic Algorithms and Fuzzy Support Vector Machines', International Journal of Fuzzy Systems, vol. 20, no. 4, pp. 1309-1320.
View/Download from: Publisher's site
View description>>
© 2018, Taiwan Fuzzy Systems Association and Springer-Verlag GmbH Germany, part of Springer Nature. Electronic nose devices consisting of a matrix of sensors to sense the smell of various target gases have received considerable attention during the past two decades. This paper presents an efficient classification algorithm for a self-designed electronic nose, which integrates both genetic algorithms (GAs) and fuzzy support vector machines (FSVMs) to detect the target odor. GAs are applied to select the informative features and the optimal model parameters of FSVMs. FSVMs are adopted as the fitness evaluation criterion and the subsequent odor classifier, which can reduce outlier effects and provide a robust and accurate classification. This proposed algorithm has been compared with some commonly used learning algorithms, such as the support vector machine, the k-nearest neighbors algorithm and other combination algorithms. This study is based on experimental data collected from the response of the UTS NOS.E, the electronic nose system developed by the University of Technology Sydney NOS.E team. In comparison with other approaches, the experimental results show that the proposed odor classification algorithm can significantly improve classification performance by selecting high-quality features, reaching 92.05% classification accuracy.
Maitre, M, Horder, M, Kirkbride, KP, Gassner, A-L, Weyermann, C, Roux, C & Beavis, A 2018, 'A forensic investigation on the persistence of organic gunshot residues', Forensic Science International, vol. 292, pp. 1-10.
View/Download from: Publisher's site
View description>>
© 2018 Elsevier B.V. Gunshot residues (GSR) are a potential form of forensic traces in firearm-related events. In most forensic laboratories, GSR analyses focus on the detection and characterisation of the inorganic components (IGSR), which are mainly particles containing mixtures of lead, barium and antimony originating from the primer. The increasing prevalence of heavy metal-free ammunition challenges the current protocols used for IGSR analysis. To provide complementary information to IGSR particles, the current study concentrated on the organic components (OGSR) arising from the combustion of the propellant. The study focused on four compounds well known as being part of OGSR: ethylcentralite (EC), methylcentralite (MC), diphenylamine (DPA) and N-nitrosodiphenylamine (N-nDPA). This study assessed the retention of these OGSR traces on a shooter's hands. The overall project aim was to provide appropriate information regarding OGSR persistence, suitable for integration into the interpretation framework of OGSR as recommended by the recent ENFSI Guideline for Evaluative Reporting in Forensic Science. The persistence was studied through several intervals ranging from immediately after discharge to four hours, and two ammunition calibres were chosen: .40 S&W, used by the NSW Police Force; and .357 Magnum, which is frequently encountered in Australian casework. This study successfully detected the compounds of interest up to four hours after discharge. The trends displayed a large decrease in the amount detected during the first hour. A large variability was also observed due to the numerous factors involved in the production, deposition and collection of OGSR.
Maitre, M, Kirkbride, KP, Horder, M, Roux, C & Beavis, A 2018, 'Thinking beyond the lab: organic gunshot residues in an investigative perspective', Australian Journal of Forensic Sciences, vol. 50, no. 6, pp. 1-7.
View/Download from: Publisher's site
View description>>
© 2018, © 2018 Australian Academy of Forensic Sciences. Gunshot residues (GSR) are a common form of evidence in cases involving questions related to the association of a person of interest (POI) to a firearm-related event. GSR analyses currently focus on the detection and characterisation of the inorganic components of GSR (IGSR), which are typically particles composed of lead, barium and antimony originating from the primer. However, certain particles cannot be assigned to IGSR with a high degree of confidence due to the possibility that they derive from industrial or domestic sources. Moreover, the increasing prevalence of heavy metal-free ammunition challenges the current protocols used for IGSR analysis. In order to provide complementary evidence to IGSR particles, the current study focused on detecting the organic components (OGSR) arising from ammunition propellant. As the study focuses on the persistence of OGSR, three compounds well known as being part of OGSR were selected: ethyl centralite (EC), diphenylamine (DPA) and N-nitrosodiphenylamine (NnDPA). The study assessed the retention of OGSR traces on a person's hands up to 1 h after they had discharged a firearm.
McNevin, D 2018, 'Bayesian interpretation of discrete class characteristics', Forensic Science International, vol. 292, pp. 125-130.
View/Download from: Publisher's site
View description>>
Bayesian interpretation of forensic evidence has become dominated by the likelihood ratio (LR) with a large LR generally considered favourable to the prosecution hypothesis, HP, over the defence hypothesis, HD. However, the LR simply quantifies by how much the prior odds ratio of the probability of HP relative to HD has been improved by the forensic evidence to provide a posterior ratio. Because the prior ratio is mostly neglected, the posterior ratio is largely unknown, regardless of the LR used to improve it. In fact, we show that the posterior ratio will only favour HP when LR is at least as large as the number of things that could possibly be the source of that evidence, all being equally able to contribute. This restriction severely limits the value of evidence to the prosecution when only a single, discrete class characteristic is used to match a subset of these things to the evidence. The limitation can be overcome by examining more than one individual characteristic, as long as they are independent of each other, as they are for the genotypes at multiple loci combined for DNA evidence. We present a criterion for determining how many such characteristics are required. Finally, we conclude that a frequentist interpretation is inappropriate as a measure of the strength of forensic evidence precisely because it only estimates the denominator of the LR.
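The abstract's criterion can be reproduced numerically: with a uniform prior over N equally plausible sources, the prior odds of HP are 1/(N − 1), so the posterior odds are LR/(N − 1) and only favour HP once the LR approaches the size of the source pool. A short sketch with hypothetical numbers:

```python
def posterior_odds(lr, n_sources):
    """Posterior odds of HP, assuming a uniform prior over n_sources
    equally plausible sources (prior odds = 1 / (n_sources - 1))."""
    prior_odds = 1.0 / (n_sources - 1)
    return lr * prior_odds

# A seemingly large LR of 1000 against a pool of 1,000,000 possible
# sources still leaves the posterior odds well below 1:
print(posterior_odds(1000, 1_000_000))

# Only when the LR reaches the size of the source pool do the
# posterior odds exceed 1 and favour HP:
print(posterior_odds(2_000_000, 1_000_000) > 1)   # True
```

Combining independent characteristics multiplies their individual LRs, which is why adding characteristics (as with genotypes at multiple DNA loci) can push the combined LR past this threshold.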
Morelato, M, Broséus, J, De Grazia, A, Tahtouh, M, Esseiva, P & Roux, C 2018, 'Forensic drug intelligence and the rise of cryptomarkets. Part II: Combination of data from the physical and virtual markets', Forensic Science International, vol. 288, pp. 201-210.
View/Download from: Publisher's site
View description>>
© 2018 Elsevier B.V. Technology provides new ways to access customers and suppliers while enhancing the security of off-line criminal activity. Since the first cryptomarket, Silk Road, in 2011, cryptomarkets have transformed the traditional drug sale by facilitating the creation of a global network of vendors and buyers. Due to the fragmented nature of traces that result from illegal activities, combining the results of concurrent processes based on traces of different nature should provide supplementary benefit to understand the drug market. This article compares the data of the Australian virtual market (in particular data extracted from cryptomarkets) to the data related to traditional market descriptors, namely national seizures and arrests, prevalence data, shipping countries of seized post shipments as well as outcomes of specific surveys targeting users’ behaviour online. Results revealed the domestic nature of the online illicit drug trade in Australia which is dominated by amphetamine-type substances (ATS), in particular methylamphetamine and cannabis. These illicit drugs were also the most seized drugs on the physical market. This article shows that the combination of different information offers a broader perspective of the illicit drug market in Australia and thus provides stronger arguments for policy makers. It also highlights the links between the virtual and physical markets.
Moret, S, Scott, E, Barone, A, Liang, K, Lennard, C, Roux, C & Spindler, X 2018, 'Metal-Organic Frameworks for fingermark detection — A feasibility study', Forensic Science International, vol. 291, pp. 83-93.
View/Download from: Publisher's site
View description>>
© 2018 Elsevier B.V. Metal-Organic Frameworks (MOFs) are porous crystalline structures currently used as sensors, separation membranes and catalysts. Due to their physicochemical and optical properties, they have recently been proposed for fingermark detection; this study further explored that potential. Natural fingermarks, as well as charged and protein-enriched marks, were used to test the efficiency of the technique. Various parameters, such as precursor concentration, pH, immersion time and detection protocols, were investigated and optimised. The performance of the optimised MOF-based method was then compared to that of routinely used techniques. The results indicated that MOFs can effectively detect fingermarks, especially protein-rich marks such as marks contaminated with body fluids. However, when evaluated against benchmark techniques, the results were judged inferior to those from currently employed detection methods. With further research and optimisation, MOFs may nonetheless prove a promising alternative to current powder suspension techniques.
Robertson, J & Roux, C 2018, 'The forensic scientist of the future – are universities prepared?', Australian Journal of Forensic Sciences, vol. 50, no. 4, pp. 305-306.
View/Download from: Publisher's site
Roux, C, Ribaux, O & Crispino, F 2018, 'Forensic science 2020 – the end of the crossroads?', Australian Journal of Forensic Sciences, vol. 50, no. 6, pp. 1-12.
View/Download from: Publisher's site
View description>>
© 2018, © 2018 Informa UK Limited, trading as Taylor & Francis Group. Forensic science has been at the crossroads for over a decade. While this situation is a fertile ground for discussion, security problem solving and the sound administration of justice cannot be put on hold until solutions pleasing everyone emerge. In all practical reality, forensic science will continue to be applied because it is simply the most reliable way to reconstruct the past through the exploitation of relics of criminal activities and by logical treatment of the collected information. In this paper, it is argued that instead of exclusively focusing on error management and processes, we should also question the very ontological nature of forensic science. Not only should the dominant conception of forensic science as a patchwork of disciplines assisting the criminal justice system be challenged, but forensic science's own fundamental principles should also be better enunciated and promoted so they can be more broadly accepted and understood. Such changes invite operations, education and research to become more collective and interdisciplinary. This is necessary to fully exploit the investigative, epidemiological, court and social functions of forensic science. We ought to ask the question: will forensic science reach the end of the crossroads soon?
Scudder, N, McNevin, D, Kelty, SF, Walsh, SJ & Robertson, J 2018, 'Forensic DNA phenotyping: Developing a model privacy impact assessment', Forensic Science International: Genetics, vol. 34, pp. 222-230.
View/Download from: Publisher's site
View description>>
Forensic scientists around the world are adopting new technology platforms capable of efficiently analysing a larger proportion of the human genome. Undertaking this analysis could provide significant operational benefits, particularly in giving investigators more information about the donor of genetic material, a valuable investigative lead. Such information could include predicting externally visible characteristics such as eye and hair colour, as well as biogeographical ancestry. This article looks at the adoption of this new technology from a privacy perspective, using this to inform and critique the application of a Privacy Impact Assessment to this emerging technology. Noting the benefits and limitations, the article develops a number of themes that would influence a model Privacy Impact Assessment as a contextual framework for forensic laboratories and law enforcement agencies considering implementing forensic DNA phenotyping for operational use.
Scudder, N, McNevin, D, Kelty, SF, Walsh, SJ & Robertson, J 2018, 'Massively parallel sequencing and the emergence of forensic genomics: Defining the policy and legal issues for law enforcement', Science & Justice, vol. 58, no. 2, pp. 153-158.
View/Download from: Publisher's site
View description>>
© 2017 The Chartered Society of Forensic Sciences Use of DNA in forensic science will be significantly influenced by new technology in coming years. Massively parallel sequencing and forensic genomics will hasten the broadening of forensic DNA analysis beyond short tandem repeats for identity towards a wider array of genetic markers, in applications as diverse as predictive phenotyping, ancestry assignment, and full mitochondrial genome analysis. With these new applications come a range of legal and policy implications, as forensic science touches on areas as diverse as ‘big data’ privacy and protected health information. Although these applications have the potential to make a more immediate and decisive forensic intelligence contribution to criminal investigations, they raise policy issues that will require detailed consideration if this potential is to be realised. The purpose of this paper is to identify the scope of the issues that will confront forensic and user communities.
Seckiner, D, Mallett, X, Roux, C, Meuwly, D & Maynard, P 2018, 'Forensic image analysis – CCTV distortion and artefacts', Forensic Science International, vol. 285, pp. 77-85.
View/Download from: Publisher's site
View description>>
© 2018 Elsevier B.V. As a result of the worldwide deployment of surveillance cameras, authorities have gained a powerful tool that captures footage of activities of people in public areas. Surveillance cameras allow continuous monitoring of the area and allow footage to be obtained for later use, if a criminal or other act of interest occurs. Following this, a forensic practitioner, or expert witness, can be required to analyse the footage of the Person of Interest. The examination ultimately aims at evaluating the strength of evidence at source and activity levels. In this paper, both source and activity levels are inferred from the trace, obtained in the form of CCTV footage. The source level alludes to features observed within the anatomy and gait of an individual, whilst the activity level relates to the activity undertaken by the individual within the footage. The strength of evidence depends on the value of the information recorded: the activity level is robust, yet the source level requires further development. It is therefore suggested that the camera and the associated distortions should be assessed first and foremost and, where possible, quantified, to determine the level of each type of distortion present within the footage. A review of 'forensic image analysis' is presented here. It outlines the image distortion types and details the limitations of differing surveillance camera systems. The aim is to highlight the various types of distortion present, particularly in surveillance footage, as well as to address gaps in the current literature in relation to the assessment of CCTV distortions in tandem with gait analysis. Future work will consider the anatomical assessment from surveillance footage.
Ueland, M, Forbes, SL & Stuart, BH 2018, 'Seasonal variation of fatty acid profiles from textiles associated with decomposing pig remains in a temperate Australian environment', Forensic Chemistry, vol. 11, pp. 120-127.
View/Download from: Publisher's site
View description>>
© 2018 Elsevier B.V. A methodology to examine the human post-mortem decomposition process has been developed through the monitoring of chemical changes to decomposition fluids absorbed by clothing. Model surface burials using clothed pigs were established during summer and winter seasons in a temperate region of Australia. Three clothing materials were investigated: cotton, polyester and cotton-polyester. Lipid decomposition products were extracted from the textiles and the fatty acid composition measured as a function of burial time using gas chromatography-mass spectrometry (GC-MS). Two derivatisation methods for the fatty acids were compared, and trimethylsilylation was established as the most effective preparation technique. The summer trials revealed two rates of transformation of fatty acids from unsaturated to saturated forms, with a faster rate of change occurring earlier in the trials. A different pattern of behaviour was observed for the fatty acids detected during the winter trial, with a decrease in saturated fatty acids initially observed, followed by the conversion of unsaturated to saturated fatty acids until the end of the trial. The initial change observed during the winter trial was attributed to a dehydrogenation process caused by microbiological enzymatic activity. The study has demonstrated the feasibility of examining lipid decomposition products collected in clothing from burials to provide insight into the conditions and length of burial.
Ward, J 2018, 'The past, present and future state of missing persons investigations in Australia', Australian Journal of Forensic Sciences, vol. 50, no. 6, pp. 1-15.
View/Download from: Publisher's site
Watherston, J, McNevin, D, Gahan, ME, Bruce, D & Ward, J 2018, 'Current and emerging tools for the recovery of genetic information from post mortem samples: New directions for disaster victim identification', Forensic Science International: Genetics, vol. 37, pp. 270-282.
View/Download from: Publisher's site
View description>>
© 2018 Elsevier B.V. DNA profiling has emerged as the gold standard for the identification of victims in mass disaster events, providing an ability to identify victims, reassociate remains and provide investigative leads at a relatively low cost and with a high degree of discrimination. For the majority of samples, DNA-based identification can be achieved in a fast, streamlined and high-throughput manner. However, as is characteristic of mass disasters, a large number of remains will be extremely compromised. Advances in technology and in the field of forensic biology have increased the options for the collection, sampling, preservation and processing of samples for DNA profiling. Furthermore, recent developments now allow a vast array of new genetic markers and genotyping techniques to extract as much genetic information from a sample as possible, ensuring that identification is not only accurate but also possible where material is degraded or limited. Where DNA profiling has historically involved comparison with ante mortem samples or relatives, it can now direct investigators towards putative victims or relatives for comparison, through the determination of externally visible characteristics or biogeographical ancestry. This paper reviews the current and emerging tools available for maximising the recovery of genetic information from post mortem samples in a disaster victim identification context.
Zhan, X, Adnan, A, Zhou, Y, Khan, A, Kasim, K & McNevin, D 2018, 'Forensic characterization of 15 autosomal STRs in four populations from Xinjiang, China, and genetic relationships with neighboring populations', Scientific Reports, vol. 8, no. 1.
View/Download from: Publisher's site
View description>>
The Xinjiang Uyghur Autonomous Region of China (XUARC) harbors 47 ethnic groups including the Manchu (MCH: 0.11%), Mongols (MGL: 0.81%), Kyrgyz (KGZ: 0.86%) and Uzbek (UZK: 0.066%). To establish DNA databases for these populations, allele frequency distributions for 15 autosomal short tandem repeat (STR) loci were determined using the AmpFlSTR Identifiler PCR amplification kit. There was no evidence of departures from Hardy–Weinberg equilibrium (HWE) in any of the four populations and minimal departure from linkage equilibrium (LE) for a very small number of pairwise combinations of loci. The probabilities of identity for the different populations ranged from 1 in 1.51 × 10^17 (MCH) to 1 in 9.94 × 10^18 (MGL), the combined powers of discrimination ranged from 0.99999999999999999824 (UZK) to 0.9999999999999999848 (MCH) and the combined probabilities of paternal exclusion ranged from 0.9999979323 (UZK) to 0.9999994839 (MCH). Genetic distances, a phylogenetic tree and principal component analysis (PCA) revealed that the MCH, KGZ and UZK are genetically closer to the Han population of Liaoning and the Mongol population of Mongolia while the MGL are closer to Han, Japanese, Korean, Malaysian, Hong Kong Han and Russians living in China.
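The combined probabilities reported in this abstract follow from the product rule: with independent loci (HWE and LE, as the study verified), per-locus probabilities of identity multiply. A minimal sketch, with hypothetical per-locus values rather than the paper's data:

```python
from functools import reduce
from operator import mul

def combined_probability_of_identity(per_locus_pi):
    """Combined probability of identity across independent STR loci
    (the product rule), assuming Hardy-Weinberg and linkage equilibrium."""
    return reduce(mul, per_locus_pi, 1.0)

# Hypothetical per-locus values for 15 autosomal loci (illustrative only):
combined = combined_probability_of_identity([0.05] * 15)
print(combined)  # ~3.05e-20, i.e. about 1 in 3.3 x 10^19
```

This is why adding loci is so powerful: even modestly discriminating loci, multiplied across a 15-locus panel, yield combined match probabilities of the order reported above.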