PhD training in Data Science
- Vision of the PhD training in Data Science for future world leaders
- PhD training by years : 2023 - 2024 | 2022 - 2023 | 2021 - 2022 | 2020 - 2021 | 2019 - 2020
- Case studies : Daniel Widdowson (2020 - 2024) and Jonathan Balasingham (2021 - 2025)
- History : doctoral network AI for Future Digital Health and its leadership team
- PhD projects of 17 students : 2019 cohort | 2020 cohort | 2021 cohort
Vision of the PhD training in Data Science
- The PhD training covers the foundational skills needed to work with real big datasets.
- Most topics are motivated by the latest developments in Geometric Data Science.
- Our sessions are joint with the Data Science Theory and Applications group seminar.
- We discuss cases when publications and real databases go embarrassingly wrong.
- More advanced results are presented at the interdisciplinary MIF++ seminar.
- In the last 30 minutes, PhD students are invited to practise their conference talks.
Cohort-based training for PhD students
- Vitaliy Kurlin has led the following cohort-based PhD training programme in the doctoral network since October 2019 : every year, 20 two-hour weekly sessions cover the fundamental concepts of Data Science for justified applications.
- Students with complementary skills can be paired to encourage peer learning, e.g. PhD students from the networks EPSRC CDT in Distributed Algorithms, DTC Materials Chemistry, and LIV.INNO CDT for Innovation in Data Intensive Science.
- Anonymous student feedback (pdf, 1M) on all lectures and tutorials by Vitaliy Kurlin and Olga Anosova in the first year 2019-2020. Many students gave talks at the BCS Specialist Group on Artificial Intelligence workshops in 2020 and 2021.
- Most sessions in 2023-2024 will be delivered by Dr Olga Anosova. The sessions from February 2024 will cover recent advances in Geometric Data Science, some of which were published at NeurIPS 2022, CVPR 2023, and ICML 2023.
- From 9th February 2024, the PhD training is hybrid on Fridays, 11.00-12.30 (UK time), in Computer Science (Ashton 2.08) and on zoom, jointly with the DSTA group seminar. If you would like to join, please e-mail Vitaliy Kurlin.
Geometric Data Science in Spring 2024 (Fridays, 11.00-12.30, in Ashton 2.08)
If you would like to join us on zoom, please e-mail Vitaliy Kurlin. Tutorial talks on Geometric Data Science were delivered at
- January 2024 : 30 min at the JMM (Joint Mathematics Meetings), San Francisco (US)
- March 2024 : 90 min at Foundational Aspects of Neuro-Symbolic Computing, Santiago (Chile)
- March 2024 : 60 min at the Materials Informatics workshop in IMSI, Chicago (US)
Mini-symposia covering this and related topics are planned at the following conferences:
- May 2024 : 4 hours at the SIAM Mathematical Aspects of Materials Science, Pittsburgh (US)
- July 2024 : 2 hours at the ECM9 (European Congress of Mathematicians), Sevilla (Spain)
- 10 May 2024 : reduce dimensions or not to reduce? [dimensionality reduction] (Vitaliy Kurlin) Video (54 min)
Outline. Many algorithms for dimensionality reduction, such as t-SNE and UMAP, are stochastic, meaning that their outputs differ between runs and machines for the same input. The output of many deterministic algorithms, such as PCA (Principal Component Analysis), can change discontinuously under small perturbations of the input data. Mathematicians proved that any dimensionality reduction (a map R^m -> R^n for m > n) is either discontinuous (maps some close points to distant ones) or collapses an unbounded region to a single point (loses an infinite amount of data). Hence any justified comparison of high-dimensional data can be done only in the original high-dimensional space, not via a lower-dimensional projection. Low-dimensional maps can still be useful for visualisation if their coordinates are explicitly defined. (A short Python sketch contrasting PCA and t-SNE follows this schedule.)
- 3 May 2024 : Richard Feynman's visual hint [the Crystal Isometry Principle] (Vitaliy Kurlin) Video (87 min)
Outline. The first chapter "Atoms in motion" of Feynman's lectures on physics included Fig. 1-7 with a table of 7 cubic crystals and their smallest inter-atomic distances. The Eureka moment happened in May 2021, when comparing these 7 distances inspired much larger-scale comparisons of all crystals by geometric invariants without chemical elements. For all 660+ thousand periodic crystals in the Cambridge Structural Database, 200+ billion pairwise comparisons by the AMD and PDD invariants were completed within two days on a modest desktop computer. The main conclusion is the Crystal Isometry Principle saying that any real periodic crystal is uniquely determined by its atomic geometry without chemistry.
- 26 April 2024 : can we sense rigid shapes? [isometry classification of clouds] (Vitaliy Kurlin) Video (86 min)
Outline. The SSS theorem from school geometry says that two triangles (clouds of 3 unordered points) are congruent (isometric) if and only if they have the same three sides ordered by length. An extension of this theorem to more points in higher dimensions was practical only for clouds of m ordered points, which are uniquely determined up to isometry by the m x m matrix of pairwise distances. If the points are unordered, comparing m! matrices under all permutations of m points is impractical. We will introduce a complete (under rigid motion) and Lipschitz continuous invariant of all clouds of m unordered points, computable in time polynomial in m in any fixed Euclidean space and published in CVPR 2023. (A simpler distance-based invariant is sketched after this schedule.)
- 19 April 2024 : how different are rigid objects? [isometries and invariants] (Olga Anosova)
Outline. When we navigate a street around fixed or moving obstacles, we unconsciously check whether an object is the same despite looking different in our moving coordinate system. This pattern recognition answers the fundamental question "same or different" by using invariants. An invariant is an intrinsic property of an object that is preserved under the equivalences, such as rigid motion, that keep the object the same. The strongest (complete) invariant is a DNA-style code that completely and unambiguously identifies any object of a certain type under a given equivalence.
- 22 March 2024 : why do we bother with matrices? [linear maps and bases] (Olga Anosova)
Outline. We'll look at the connection between linear transformations and matrices and discuss the geometric meaning of linear dependence. Then we'll see how a change of basis helps in Natural Language Processing (NLP). Finally, we'll review the key concepts and the main engine behind Google's PageRank search algorithm: eigenvectors and eigenvalues. (A short PageRank sketch follows this schedule.)
- 8 March 2024 : how is everything related? [correlation and regression] (Olga Anosova)
Outline. We will start by investigating correlation between random variables or their samples as a measure of relationship. Then we will discuss various methods of regression analysis, from simple linear regression to a smart kernel trick. (A short correlation and regression sketch follows this schedule.)
- 1 March 2024 : dreams and pitfalls of clustering [k-means and HDBSCAN] (Olga Anosova)
Outline. We will look at different types of clustering: hierarchical, centroid-based (k-means), density-based DBSCAN (density-based spatial clustering of applications with noise), and their combination HDBSCAN (hierarchical DBSCAN). (A short sketch contrasting k-means and DBSCAN follows this schedule.)
- 23 February 2024 : rules in a world of coordinates [vector operations] (Vitaliy Kurlin)
Outline. What is possible and impossible to do with vectors? Simpson's paradox of a reversing trend in statistics requires understanding the geometry of vector addition. The scalar product explains complicated inequalities. What does the determinant determine? The vector product can be written as one 3x3 determinant and extended to higher dimensions. (A short Simpson's paradox sketch follows this schedule.)
- 16 February 2024 : how to measure and not to measure [metric axioms] (Olga Anosova)
Outline. All real data objects differ at least slightly because of noise, apart from the copy-pasted duplicates in datasets discussed last time. Similarities and differences between data objects can be rigorously quantified by a distance metric that should satisfy three metric axioms. If one of these axioms fails, e.g. the triangle inequality, it was recently proved that the results of clustering algorithms such as k-means and DBSCAN may not be trustworthy. (A short sketch checking the metric axioms follows this schedule.)
- 9 February 2024 : a scientific approach to big data [equivalence relations] (Vitaliy Kurlin)
Outline. The key question is "same or different, and how different?" Ignoring this question allows (near-)duplicates that can skew any good data with digital waste. The unfortunate examples include well-known databases in computer vision, graphics, chemistry, and biology. After an important equivalence on data objects is agreed upon, any finite dataset can be considered a finite discrete sample in a (usually continuously infinite) moduli space of equivalence classes of data objects.
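The sketches below illustrate a few of the sessions above. They are minimal illustrations on invented toy data, assuming only that numpy and scikit-learn are installed, and are not reproductions of the session material. For the 10 May session on dimensionality reduction, the first sketch contrasts deterministic PCA with stochastic t-SNE: two PCA runs on the same cloud coincide, while two t-SNE runs with different seeds do not.

    # Minimal sketch (assumes numpy and scikit-learn): deterministic PCA vs stochastic t-SNE.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.manifold import TSNE

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))        # an invented 10-dimensional point cloud

    # PCA is deterministic: repeated runs give identical 2-dimensional projections.
    p1 = PCA(n_components=2).fit_transform(X)
    p2 = PCA(n_components=2).fit_transform(X)
    print("PCA runs identical:", np.allclose(p1, p2))        # expected: True

    # t-SNE is stochastic: different seeds give different embeddings of the same input.
    t1 = TSNE(n_components=2, random_state=0, perplexity=30).fit_transform(X)
    t2 = TSNE(n_components=2, random_state=1, perplexity=30).fit_transform(X)
    print("t-SNE runs identical:", np.allclose(t1, t2))      # expected: False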
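For the 26 April session on isometry classification of clouds, the sketch below (numpy only, random toy points) computes the sorted list of pairwise distances of an unordered cloud. This list is unchanged by rotations, translations and permutations of points, so it is an easy isometry invariant, though it is not the complete and continuous invariant of the CVPR 2023 paper.

    # Minimal sketch (numpy only): sorted pairwise distances as a simple isometry invariant.
    import numpy as np

    def sorted_pairwise_distances(cloud):
        """Return all pairwise distances of an unordered point cloud, sorted by size."""
        diff = cloud[:, None, :] - cloud[None, :, :]
        dist = np.sqrt((diff ** 2).sum(-1))
        i, j = np.triu_indices(len(cloud), k=1)
        return np.sort(dist[i, j])

    rng = np.random.default_rng(1)
    A = rng.normal(size=(6, 3))                      # 6 unordered points in 3D (toy data)

    theta = 0.7                                      # rotate about the z-axis ...
    R = np.array([[np.cos(theta), -np.sin(theta), 0],
                  [np.sin(theta),  np.cos(theta), 0],
                  [0, 0, 1]])
    B = (A @ R.T)[rng.permutation(6)] + np.array([5.0, -2.0, 1.0])   # ... permute and translate

    print(np.allclose(sorted_pairwise_distances(A), sorted_pairwise_distances(B)))  # True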
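For the 22 March session, the sketch below (numpy only, a hand-made link graph of four pages, all numbers illustrative) computes PageRank scores as the dominant eigenvector of a damped column-stochastic link matrix by power iteration.

    # Minimal sketch (numpy only): PageRank as the dominant eigenvector of a damped link matrix.
    import numpy as np

    # Column j of L lists the out-links of page j, normalised to sum to 1.
    links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}   # a tiny illustrative web of 4 pages
    n = 4
    L = np.zeros((n, n))
    for j, outs in links.items():
        L[outs, j] = 1.0 / len(outs)

    d = 0.85                                         # damping factor
    G = d * L + (1 - d) / n                          # the damped "Google matrix"

    r = np.full(n, 1.0 / n)
    for _ in range(200):                             # power iteration converges to the eigenvector
        r = G @ r
    print("PageRank scores:", np.round(r, 3), "eigenvalue check:", np.allclose(G @ r, r))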
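For the 8 March session on correlation and regression, the sketch below (numpy only; the slope, intercept and noise level are arbitrary) computes the Pearson correlation coefficient and fits a least-squares line.

    # Minimal sketch (numpy only): Pearson correlation and simple linear regression.
    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.uniform(0, 10, size=100)
    y = 3.0 * x + 1.0 + rng.normal(scale=2.0, size=100)    # a noisy linear relationship

    r = np.corrcoef(x, y)[0, 1]                            # Pearson correlation coefficient
    slope, intercept = np.polyfit(x, y, deg=1)             # least-squares line y = slope*x + intercept
    print(f"correlation r = {r:.3f}, fitted line y = {slope:.2f}*x + {intercept:.2f}")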
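For the 1 March session on clustering, the sketch below (assumes scikit-learn) runs k-means and DBSCAN on the classic two-moons toy data: k-means cuts the moons with a straight boundary, while density-based DBSCAN follows the curved clusters; the hierarchical variant HDBSCAN is available in recent scikit-learn versions.

    # Minimal sketch (assumes scikit-learn): k-means vs density-based DBSCAN on two half-moons.
    from sklearn.datasets import make_moons
    from sklearn.cluster import KMeans, DBSCAN

    X, true_labels = make_moons(n_samples=300, noise=0.05, random_state=0)

    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    db = DBSCAN(eps=0.2, min_samples=5).fit_predict(X)       # label -1 marks noise points

    # k-means splits the moons by a straight line, DBSCAN recovers the two curved clusters.
    print("k-means cluster sizes:", [int((km == c).sum()) for c in set(km)])
    print("DBSCAN cluster labels:", sorted(set(db)))
    # In scikit-learn >= 1.3 the hierarchical variant is available as sklearn.cluster.HDBSCAN.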
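For the 23 February session, the sketch below (plain Python) reproduces Simpson's paradox with the often-quoted kidney-stone treatment counts: treatment A has the higher success rate in each group, yet the lower rate after the two groups, viewed as vectors (successes, trials), are added.

    # Minimal sketch: Simpson's paradox via vector addition of (successes, trials) pairs.
    # Counts follow the often-quoted kidney-stone treatment example.
    groups = {
        "small stones": {"A": (81, 87),   "B": (234, 270)},
        "large stones": {"A": (192, 263), "B": (55, 80)},
    }

    totals = {"A": [0, 0], "B": [0, 0]}
    for name, data in groups.items():
        for t in ("A", "B"):
            s, n = data[t]
            totals[t][0] += s
            totals[t][1] += n
        rates = {t: data[t][0] / data[t][1] for t in ("A", "B")}
        print(f"{name}: A {rates['A']:.1%} vs B {rates['B']:.1%}")    # A wins in each group

    # Adding the vectors (successes, trials) reverses the trend for the combined data.
    for t in ("A", "B"):
        s, n = totals[t]
        print(f"overall {t}: {s}/{n} = {s / n:.1%}")                  # B wins overall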
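For the 16 February session on metric axioms, the sketch below (numpy only, with two invented distance matrices) mechanically checks non-negativity, symmetry and the triangle inequality for a square matrix of pairwise distances.

    # Minimal sketch (numpy only): checking metric axioms for a matrix of pairwise distances.
    import numpy as np

    def check_metric(D, tol=1e-12):
        """Return which metric axioms hold for a square matrix D of pairwise distances."""
        D = np.asarray(D, dtype=float)
        n = D.shape[0]
        non_negative = (D >= -tol).all() and np.allclose(np.diag(D), 0)
        symmetric = np.allclose(D, D.T)
        triangle = all(D[i, k] <= D[i, j] + D[j, k] + tol
                       for i in range(n) for j in range(n) for k in range(n))
        return {"non-negative": non_negative, "symmetric": symmetric, "triangle": triangle}

    good = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]      # distances of 3 points on a line: a genuine metric
    bad  = [[0, 1, 5], [1, 0, 1], [5, 1, 0]]      # 5 > 1 + 1 breaks the triangle inequality
    print(check_metric(good))
    print(check_metric(bad))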
Fundamentals of Data Science in Autumn 2023
- 7 December 2023 : from normality to reality (Olga Anosova)
- 30 November 2023 : hypotheses and significance (Olga Anosova)
- 23 November 2023 : probability distributions in ML (Olga Anosova)
- 9 November 2023 : missing or incomplete data (Gabriela Czanner) video (48 min)
- 2 November 2023 : probabilistic paradoxes (Olga Anosova)
- 26 October 2023 : introduction to probability (Olga Anosova)
- 19 October 2023 : descriptive vs inferential statistics (Olga Anosova)
- 12 October 2023 : introduction to Data Science (Olga Anosova)
- 5 October 2023 : Crystal Isometry Principle (Vitaliy's seminar at Leicester)
- 3 October 2023 : introduction to persistence (Vitaliy Kurlin) video (55 min)
Advanced topics in Data Science in Spring 2023
- 18 April 2023 : state-of-the-art machine learning (Adam Coxson)
- 28 March 2023 : hyperparameter optimisation (Adam Coxson)
- 21 March 2023 : unsupervised machine learning (Adam Coxson)
- 14 March 2023 : neural network fundamentals (Adam Coxson)
- 7 March 2023 : Principal Component Analysis (Phil Smith)
- 28 February 2023 : eigenvalues and eigenvectors (Phil Smith)
- 21 February 2023 : isometries and orthogonal maps (Phil Smith)
- 14 February 2023 : fundamentals of linear algebra (Phil Smith)
Fundamentals of Data Science in Autumn 2022
- 13 December 2022 : clustering algorithms (Phil Smith)
- 6 December 2022 : correlation and regression (Phil Smith)
- 29 November 2022 : vector spaces and metrics (Phil Smith)
- 22 November 2022 : equivalence relations (Phil Smith)
- 15 November 2022 : continuous probability (Phil Smith)
- 1 November 2022 : perils of intuition (Phil Smith)
- 25 October 2022 : discrete probability (Phil Smith)
- 18 October 2022 : descriptive statistics (Phil Smith)
- 11 October 2022 : introduction to Data Science (Vitaliy Kurlin)
Advanced topics in Data Science in Spring 2022
- 24 May 2022 : introduction to Machine Learning, part 2 (Adam Coxson)
- 10 May 2022 : introduction to Machine Learning, part 1 (Adam Coxson)
- 26 April 2022 : The Crystal Isometry Principle (Vitaliy Kurlin's virtual LIV.DAT seminar at Liverpool Physics)
- 29 March 2022 : The Crystal Isometry Principle (Vitaliy Kurlin's online colloquium at Open University)
- 22 March 2022 : Principal Component Analysis and Singular Value Decomposition (Matt Bright)
- 15 March 2022 : orthonormalisation and the covariance matrix (Matt Bright)
- 8 March 2022 : how to change a linear basis of a vector space (Matt Bright)
- 1 March 2022 : the determinant and other isometry invariants (Matt Bright)
Talk Grouping bacteria using data obtained from antibiotic laboratory testing by PhD student Alessandro Gerada
- 22 February 2022 : isometries and orthogonal maps (Matt Bright)
- 15 February 2022 : matrices of linear maps (Matt Bright)
- 8 February 2022 : clustering algorithms (Matt Bright)
- 1 February 2022 : correlation and regression (Matt Bright)
Introductory topics in Data Science in Autumn 2021
- 7 December 2021 : introduction to Bayesian networks and Monte-Carlo methods (Olga Anosova)
Talk Predicting Influenza A Viral Host Using PSSM and Word Embeddings by PhD student Yanhua Xu
- 30 November 2021 : from frequentist statistics to the Bayesian approach (Olga Anosova)
Talk Data Science-based approach to solid crystalline materials by PhD student Daniel Widdowson
- 23 November 2021 : statistical hypotheses (Matt Bright)
- 16 November 2021 : probability distributions (Matt Bright)
- 9 November 2021 : probabilistic paradoxes (Olga Anosova)
- 3 November 2021 : introduction to probability (Olga Anosova)
- 27 October 2021 : important structures on mathematical objects (Matt Bright)
Talk Earth Mover's Distance on chemical compositions by PhD student Cameron Hargreaves
- 20 October 2021 : thinking like a mathematician (Matt Bright)
- 13 October 2021 : descriptive statistics (Matt Bright)
- 6 October 2021 : introduction to Data Science (Vitaliy Kurlin)
Advanced topics in Data Science in Spring 2021
- 20 May 2021 : logistic regression as a link between statistics and machine learning (Olga Anosova)
- 13 May 2021 : introduction to logistic regression (Olga Anosova)
Talk Average Minimum Distances of periodic point sets by PhD student Marco Mosca
- 6 May 2021 : three talks by PhD students at the PGR workshop
- 29 April 2021 : skeletons of point clouds (Vitaliy Kurlin)
- 22 April 2021 : single-edge clustering (Vitaliy Kurlin)
- 15 April 2021 : graph classifications (Vitaliy Kurlin)
- 18 March 2021 : Talk 12-lead ECG Classification Using Time Series Motifs by PhD student Hanadi Aldosari
- 11 March 2021 : Principal Component Analysis (Matt Bright)
- 4 March 2021 : how to change a linear basis (Matt Bright)
Talk Understanding ethnic inequalities in gastrointestinal infection by PhD student Iram Zahair
- 26 February 2021 : invariants of linear maps (Matt Bright)
- 18 February 2021 : matrices of linear maps (Matt Bright)
- 11 February 2021 : correlation and regression (Matt Bright)
Introductory topics in Data Science in Autumn 2020
- 17 December 2020 : frequentist vs Bayesian approaches (Olga Anosova)
- 10 December 2020 : introduction to Bayesian statistics (Olga Anosova)
- 3 December 2020 : talk Biomarkers-based detection of liver cancer by PhD student Mohamed Elhalwagy
- 26 November 2020 : Earth Mover's distance (PhD student Cameron Hargreaves)
Talk Using k-modes clustering to identify different types of cyclists by PhD student Aidan Watmuff
- 19 November 2020 : equivalences and metrics (Vitaliy Kurlin)
Talk Machine learning for mass cytometry data of chronic lymphocytic leukemia by PhD student Muizdeen Raji
- 12 November 2020 : statistical hypotheses (Olga Anosova)
Talk Machine learning for influenza A viral host classification by PhD student Yanhua Xu
- 5 November 2020 : probability distributions (Olga Anosova)
Talk The Maintenance of Trials Methodology Research Using Machine Learning by PhD student Iqra Muhammad
- 29 October 2020 : probabilistic paradoxes (Olga Anosova)
Talk Learning to Prioritise Pathology Data in the Absence of a Ground Truth by PhD student Jing Qi
Talk Wearable Sensing for Non-invasive Human Pose Estimation during Sleep by PhD student Omar Elnaggar
- 22 October 2020 : introduction to probability (Vitaliy Kurlin)
- 15 October 2020 : descriptive statistics (Vitaliy Kurlin)
- 8 October 2020 : introduction to Data Science (Vitaliy Kurlin)
Advanced topics in Data Science in Spring 2020
- 14 May 2020 : skeletons of point clouds (Vitaliy Kurlin)
- 7 May 2020 : Voronoi diagrams of point clouds (Vitaliy Kurlin)
- 30 April 2020 : single-edge clustering of point clouds (Vitaliy Kurlin)
- 26 March 2020 : graph visualisations (Vitaliy Kurlin)
- 19 March 2020 : graph classifications (Vitaliy Kurlin)
- 12 March 2020 : graph representations (with a tutorial by Olga Anosova)
- 27 February 2020 : frequentist vs Bayesian (Olga Anosova), student presentations by Jing Qi and Theofilos Triommatis
- 20 February 2020 : conditional probabilities (Olga Anosova)
- 13 February 2020 : AI for Health (Frans Coenen), student presentations by Matthew Carter and Vincent Beraud
- 6 February 2020 : the Bayes theorem with examples (Vitaliy Kurlin)
Introductory topics in Data Science in Autumn 2019
- 10 December 2019 : Principal Component Analysis (Vitaliy Kurlin)
- 3 December 2019 : how to change a linear basis (Vitaliy Kurlin)
- 26 November 2019 : invariants of linear maps (Vitaliy Kurlin)
- 19 November 2019 : matrices of linear maps (Olga Anosova)
- 12 November 2019 : equivalence relations and vectors (Vitaliy Kurlin)
- 5 November 2019 : clustering problems and k-means (Vitaliy Kurlin)
- 29 October 2019 : correlation and regression (Vitaliy Kurlin)
- 22 October 2019 : statistical hypotheses (Vitaliy Kurlin)
- 15 October 2019 : probability theory (Vitaliy Kurlin)
- 8 October 2019 : descriptive statistics (Vitaliy Kurlin)
Cases of PhD students Daniel Widdowson (2020-2024) and Jonathan Balasingham (2021-2025)
Daniel Widdowson has a BSc in Mathematics (Warwick) and an MSc in Computer Science (Liverpool). Daniel's MSc thesis, supervised by Vitaliy Kurlin in summer 2020, led to the high-profile MATCH paper introducing ultra-fast isometry invariants (Average Minimum Distances) for mapping all periodic crystals, and to the NeurIPS 2022 paper establishing the Crystal Isometry Principle for solid crystalline materials. Daniel's PhD has been supervised since October 2020 by Vitaliy Kurlin, Andy Cooper, and Jason Cole.
Daniel's research in his own words: Crystal Structure Prediction (CSP) is a set of methods for predicting new crystalline materials given a molecule. The way crystals are stored by a computer is ambiguous, i.e., one crystal can be represented in many ways, so during CSP it is not possible to automatically detect and remove duplicates. Currently this is handled manually in a time-consuming filtering process.
Our work uses mathematical tools called isometry invariants to tackle this problem of ambiguity. Every crystal has an invariant which will not change if the crystal is represented differently, and similar crystals have similar invariants to account for atomic vibrations and measurement errors.
As part of the Materials Innovation Factory at the University of Liverpool, co-supervised by Professor Andy Cooper and in collaboration with the Cambridge Crystallographic Data Centre (CCDC), this work has shown impact and promise even outside of crystal structure prediction. The CCDC curates the Cambridge Structural Database (CSD), a collection of over one million crystals gathered from research all over the world. Our tools searched the database for duplicates in a process totalling over 200 billion comparisons, leading to 5 pairs of crystals currently being investigated.
These comparisons demonstrated the Crystal Isometry Principle stating that any crystal is determined uniquely by the geometry of its atomic centres.
So all crystals live in a common landscape parametrised by invariants, the ‘Crystal Isometry Space’.
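A minimal sketch of the Average Minimum Distance (AMD) invariant mentioned above, using numpy only: the square lattice with a two-point motif is an invented toy example rather than CSD data, and the infinite periodic set is approximated by a few layers of lattice translates; for each motif atom the k nearest-neighbour distances are found and then averaged over the motif.

    # Minimal sketch (numpy only) of the Average Minimum Distance invariant AMD_k
    # for a toy 2D periodic set: a square lattice with a two-point motif (illustrative only).
    import numpy as np

    cell = np.array([[1.0, 0.0], [0.0, 1.0]])            # lattice basis as rows
    motif = np.array([[0.0, 0.0], [0.5, 0.5]])           # atomic centres in one unit cell
    k = 6                                                # number of nearest neighbours per atom

    # Approximate the infinite periodic set by translating the motif over nearby cells.
    shifts = range(-3, 4)
    translates = np.array([m + i * cell[0] + j * cell[1]
                           for i in shifts for j in shifts for m in motif])

    rows = []
    for p in motif:
        d = np.sort(np.linalg.norm(translates - p, axis=1))
        rows.append(d[1:k + 1])                          # drop d[0] = 0 (the point itself)

    amd = np.mean(rows, axis=0)                          # AMD_k: average the k columns over the motif
    print("AMD_%d =" % k, np.round(amd, 4))
    # Each row of `rows` is one row of the Pointwise Distance Distribution (PDD) before weighting.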
The paper Recognizing Rigid Patterns of Unlabeled Point Clouds by Complete and Continuous Isometry Invariants with no False Negatives and no False Positives, published in the top Computer Science venue CVPR, extended the isometry classification of triangles from school geometry to arbitrary clouds of unlabeled points in any Euclidean space.
All papers co-authored by Dan: CGD 2024, DiD 2023, CVPR 2023, NeurIPS 2022, JACS 2022, DiD 2022, MATCH 2022.
Jonathan Balasingham holds an MSc in Scientific and Data-Intensive Computing (University College London), an MSc in Industrial Engineering (San Jose State University), and a BSc in Computer Science (University of California, Santa Cruz). Jonathan's PhD has been supervised since October 2021 by Viktor Zamaraev, Vitaliy Kurlin, and Andy Cooper.
Jonathan's research in his own words:
Working with crystalline materials presents inherent difficulty due to their periodic nature. Until recently, there have not been rigorous ways to classify and compare crystal structures. Research from the Data Science Theory and Applications group has provided two means to accomplish this: Average Minimum Distances and Pointwise Distance Distributions.
These mathematical tools give us the capability to compare large numbers of crystals quickly and precisely. Because of this, we’ve been able to build tools such as a search engine and visualization software to explore crystal databases such as the Cambridge Structural Database by the CCDC.
Work from the research group has also extended from pure mathematics to other domains such as machine learning, where an unambiguous representation of periodic crystals can improve the efficiency and accuracy of algorithms for tasks like property prediction. More generally, taking a geometric view of data science applications can help reduce data requirements and make for more robust and effective models. The PhD project has been successful because:
- It allowed the development of open-source crystal comparison and visualization software that crystallographers and chemists can use to explore crystals and their structure.
- It produced a new Transformer-based model for crystal property prediction that uses the Pointwise Distance Distribution as the crystal descriptor instead of incomplete and discontinuous descriptors. This work was published in Scientific Reports.
- It improved upon previous graph representations for periodic crystals by decreasing the number of vertices and edges required to represent a material, and provided theoretical guarantees for unit-cell and periodic invariance as well as completeness. This work was recently published in Integrating Materials and Manufacturing Innovation.
History : the Doctoral Network in Artificial Intelligence for Future Digital Health was a doctoral training centre funded by the University of Liverpool in 2019 - 2021 to train the next generation of world-leading experts in Data Science and AI to solve data-intensive problems in healthcare.
Leadership team of the network
- Vitaliy Kurlin (director) leads the Data Science Theory and Applications group in the MIF.
- Frans Coenen (co-director) supervised 40+ successful PhDs in Computer Science.
- Marta Garcia-Finana (equality, diversity, inclusion) is an expert in Biostatistics.
- Roberto Ferrero works in Electrical Engineering and Electronics.
- Paolo Paoletti works in the School of Engineering.
PhD projects: first cohort from Autumn 2019 (six students)
- Student : Jing Qi. Supervisors : Frans Coenen, Girvan Burnside, Paul Charnley. Project : An Intelligent Assistant for Medical Doctors when Prioritising Pathology Results. Partner : Wirral University Teaching Hospital.
- Student : Omar Elnaggar. Supervisors : Paolo Paoletti, Frans Coenen, Andrew Hopkinson. Project : Deep Feature Learning for Distributed Rehabilitation Robots with Wearable Tactile Sensing. Partner : Sensor City, Liverpool.
- Student : Weiqiang Chen. Supervisors : Yaochun Shen, Yalin Zheng, Stephen Kaye. Project : An intelligent imaging technology for automatic characterisation of the refractive power of the human eye. Partner : Royal Liverpool and Broadgreen University Hospital Trust.
- Student : Yanhua Xu. Supervisors : Dominik Wojtczak, Neil French, Roberto Vivancos. Project : Machine Learning of Epidemic Models. Partner : Public Health England.
- Student : Mohamed Elhalwagy. Supervisors : Frans Coenen, Philip Johnson, Vinzent Rolny. Project : Application of AI systems to develop novel diagnostic and prognostic tools for Hepatocellular Carcinoma. Partner : Roche Diagnostics.
- Student : Muizdeen Raji. Supervisors : Nagesh Kalakonda, Vitaliy Kurlin, Joseph Slupsky. Project : A study of cellular diversity in health and disease using mass cytometry. Partner : Clatterbridge Cancer Centre, Wirral.
PhD projects: second cohort from Autumn 2020 (five students)
- Student : Hanadi Aldosari. Supervisors : Frans Coenen, Yalin Zheng, Gregory Lip. Project : The Integration of Knowledge Sources to Support Cardiovascular Disease Diagnosis. Partner : Centre for Cardiovascular Science at the Liverpool Heart and Chest Hospital.
- Student : Ramandeep Kang. Supervisors : Martin Gairing, Andy Jones, Shivaram Avula. Project : Application and development of machine learning approaches for automated analysis of brain tumours. Partner : Alder Hey Children's Hospital, Liverpool.
- Student : Andrew Tibbles. Supervisors : Farnaz Nickpour, Paolo Paoletti, Sebastiano Fichera. Project : Towards 'Hospices Without Walls'; How Could Human Centred Design and Robotics facilitate 'Enhanced Independence' and 'Alternative Access' in Future Palliative Care Scenarios? Partner : Marie Curie Hospice, Liverpool.
- Student : Hannah Kockelbergh. Supervisors : Anna Fowler, Peter Green, Elizabeth Soilleux. Project : Machine learning approaches for the clinical diagnosis of autoimmune disease. Partner : Addenbrooke's Hospital, Cambridge.
- Student : Daniel Widdowson. Supervisors : Vitaliy Kurlin, Andy Cooper, Jason Cole. Project : Isometry invariants of periodic point sets for an inverse design of crystalline drugs. Partner : Cambridge Crystallographic Data Centre.
PhD projects: third cohort from Autumn 2021 (six students)
- Student : Tudor Jianu. Supervisors : Anh Nguyen, Sebastiano Fichera, Pierre Berthet-Rayne. Project : Toward Autonomous Cannulation in Endovascular Intervention. Partner : Caranx Medical.
- Student : Remi Hernandez. Supervisors : Wahbi El-Bouri, Yalin Zheng, Pierre Berthet-Rayne. Project : AI and the Eye – Integrating deep learning and in-silico simulations to optimise diagnosis and treatment of wet macular degeneration. Partner : St Paul’s Eye Unit and Liverpool Ophthalmic Reading Centre.
- Student : Anthony Nzegbuna. Supervisors : Jiafeng Zhou, Xiaowei Huang, Mark Turner. Project : Diagnosis of Baby Neurodevelopmental Disorders Using Millimetre-waves. Partner : Liverpool Women’s Hospital.
- Student : Alessandro Gerada. Supervisors : William Hope, Vitaliy Kurlin, Steve Patterson. Project : Understanding bacterial resistance by machine learning from genetic data. Partner : Liverpool Clinical Laboratories.
- Student : Jonathan Balasingham. Supervisors : Viktor Zamaraev, Vitaliy Kurlin, Andy Cooper. Project : AI-based exploration of crystal spaces to accelerate drug discovery. Partner : Cambridge Crystallographic Data Centre.
- Student : Nandini Gadhia. Supervisors : Anh Nguyen, Vitaliy Kurlin, Dominic Richards. Project : Developing deep graph neural networks for prediction of drug toxicity. Partner : STFC Hartree Centre.