external supervisors

Professor Fergus Gleeson, Department of Oncology, University of Oxford

James Grist, Department of Physiology, Anatomy and Genetics, University of Oxford

background

An increasing number of longitudinal multimodal studies (containing both imaging and non-imaging data) related to pulmonary diseases are becoming available to researchers, including UK Biobank (UKB), MIMIC, and local Oxford trials (e.g. EXPLAIN), to name a few. Research on AI-driven data representations is therefore well placed to engineer novel representations that capture the temporal dimension of multimodal data. These AI-engineered representations hold the potential to go beyond static cross-sectional assessments: they can encapsulate the dynamic nature of biological processes over time. By quantifying the progression of pathological processes through time, these representations can offer more comprehensive insights than single, isolated observations. This aligns well with the broader research context, particularly the ongoing research project into the temporal changes within organs and tissues that contribute to the onset and progression of diseases.

The project will aim to create a data-agnostic framework that integrates multimodal imaging and non-imaging data from large-scale studies. The objective is the creation of new representations (AI-derived features) of high-dimensional, multimodal data. Given the scale and complexity of the underlying data, research on data representation promises to bridge the computational efficiency of AI and the biological insights that we seek to extract. Such new features could also be investigated for other applications, such as integration into personalized interventions to optimize treatment strategies, reduce adverse effects, and enhance patient outcomes. Likewise, new representations of the morphology and function of organs, and of the way they are altered in human disease, could lay the foundations for innovative discoveries in medicine.
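As a purely illustrative sketch (not the project's actual framework), the PyTorch snippet below shows one way a temporal multimodal representation could be learned: a lightweight 3D CNN encodes each imaging volume, a small MLP encodes the non-imaging covariates, and a GRU summarises the sequence of per-visit fused embeddings into a single learned representation per subject. All module names, dimensions, and architectural choices here are assumptions made for illustration only.

# Illustrative only: one possible multimodal temporal encoder, not the project's method.
import torch
import torch.nn as nn

class MultimodalTemporalEncoder(nn.Module):
    def __init__(self, tabular_dim: int, embed_dim: int = 128):
        super().__init__()
        # Lightweight 3D CNN for an imaging volume (e.g. a CT scan).
        self.image_encoder = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
            nn.Linear(32, embed_dim),
        )
        # MLP for non-imaging covariates (demographics, lab values, ...).
        self.tabular_encoder = nn.Sequential(
            nn.Linear(tabular_dim, 64), nn.ReLU(), nn.Linear(64, embed_dim),
        )
        # GRU over per-visit fused embeddings captures the temporal dimension.
        self.temporal = nn.GRU(2 * embed_dim, embed_dim, batch_first=True)

    def forward(self, images: torch.Tensor, tabular: torch.Tensor) -> torch.Tensor:
        # images: (batch, visits, 1, D, H, W); tabular: (batch, visits, tabular_dim)
        b, t = images.shape[:2]
        img = self.image_encoder(images.flatten(0, 1)).view(b, t, -1)
        tab = self.tabular_encoder(tabular)
        fused = torch.cat([img, tab], dim=-1)   # per-visit multimodal embedding
        _, last_hidden = self.temporal(fused)   # summarise the visit sequence
        return last_hidden.squeeze(0)           # (batch, embed_dim) representation

# Example: 2 subjects, 3 visits each, 32^3 volumes and 10 tabular covariates.
model = MultimodalTemporalEncoder(tabular_dim=10)
z = model(torch.randn(2, 3, 1, 32, 32, 32), torch.randn(2, 3, 10))
print(z.shape)  # torch.Size([2, 128])

The resulting per-subject vector z is the kind of AI-derived feature that could then be related to disease onset or progression; in practice the encoder, fusion strategy, and training objective would all be subjects of the research itself.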

The project will be offered in collaboration with one of the group's existing collaborators (to be discussed with interested candidates).

RESEARCH EXPERIENCE, RESEARCH METHODS AND TRAINING

The student will develop skills in AI/ML techniques applied to biomedical imaging, write scientific reports and journal articles, and present findings at research conferences.

FIELD WORK, SECONDMENTS, INDUSTRY PLACEMENTS AND TRAINING

The student will also have the opportunity to attend the research seminars offered by the department, the Big Data Institute, and the Institute of Biomedical Engineering (IBME), as the primary supervisor is a member of the Imaging Hub at the IBME. The student will be expected to attend relevant seminars within the department and in the wider University. Subject-specific training will be provided through our group's weekly supervision meetings. Students will also attend external scientific conferences, where they will be expected to present their research findings.

PROSPECTIVE STUDENT

The ideal candidate will have a degree in Computer Science, Statistics, Engineering or a related discipline, and strong programming skills, preferably in Python, or in Matlab/C++ with a willingness to learn Python. They will have experience in, or an interest in, machine learning (deep learning) and medical image analysis, and experience of, or enthusiasm for, working on clinically relevant problems.

Supervisor

  • Bartek Papiez

    Associate Professor, Medical Image Analysis and Machine Learning