Fusion of Talent: Celebrating the Many Roles of Women in Computing
We are thrilled to invite you to a one-day celebration of the diverse contributions of women in computing. The event will highlight the plurality of roles available across the sector, from academia to industry, and will spotlight a variety of career stages. There will also be open conversations about the challenges that remain in increasing diversity.
The event is open to all! We welcome champions of diversity: allies and members of minority and underrepresented groups alike. Whether you are studying or working in academia, government, or industry, your perspective matters.
Additionally, there will be networking opportunities specifically for underrepresented genders in computing (e.g. women, non-binary and genderfluid people).
What to expect:
- Networking breakfast
- Keynote speaker
- Early careers speakers
- Poster session
- Panel discussion
Thank you to our sponsors

08:30 → 09:00  Early Arrival (Networking Breakfast), Reception (Culham Campus, Abingdon, Oxfordshire, OX14 3DB)
09:00 → 10:30  Exhibition Set-up, Zeta Room (Conference Centre)
09:00 → 10:30  Networking Breakfast (limited spaces), Abingdon Room 1.26 (C7)
09:30 → 10:30  Standard Arrival, Reception (Culham Campus, Abingdon, Oxfordshire, OX14 3DB)
10:00 → 11:00  Welcome Coffee (1h), Zeta Room (Conference Centre)
11:00 → 12:35  Morning Session, John Adams Lecture Theatre (Conference Centre)
  11:00  Welcome (15m). Speaker: Helen Brooks (Advanced Engineering Simulation)
  11:15  Keynote talk: Fusion and Confusion - the impact of diversity in computing (50m). Speaker: Alison Kennedy (STFC)
  12:05  Family as Anchor, Tech as Mission: What Grounds My Leadership (15m). Speaker: Christine Aramunde (UKAEA)
  12:20  Women who pivot: Unlocking new talent in Computing (15m). Speaker: Brodie Goring (UKAEA)
12:35 → 12:45  Group Photo (10m), John Adams Lecture Theatre (Conference Centre)
12:45 → 14:45  Lunch, Phoenix Room (Conference Centre). Collect your lunch!
12:45 → 14:45  Poster Session, Balcony (Conference Centre)
  12:45  In Code We Trust: A Software-First Framework for Satellite Operations (5m)
Thousands of satellites now orbit the Earth, forming an increasingly interconnected ecosystem of space-based services. Yet the infrastructure to coordinate these systems remains largely centralised and outdated. This research explores how software, not hardware, can transform satellite operations to enable safe, sustainable operations in orbit.
Space is a zero-trust environment: spacecraft are managed by a wide range of organisations, nations, and commercial actors that do not inherently trust one another and may actively distrust each other. Enabling meaningful collaboration across trust boundaries requires a new operational model in which coordination does not depend on centralised authorities, which can create single points of failure and governance bottlenecks.
Distributed Ledger Technologies (DLTs) underpin this proposed solution, providing a software-based foundation for decentralised coordination between spacecraft. Unlike hardware-first operational approaches, DLTs offer a programmable infrastructure that supports resilience, autonomy, and interoperability. In this system, trust is reframed as a software-defined property, verified through consensus mechanisms, rather than presumed based on ownership or affiliation.
Building on prior analysis of DLTs suitable for orbital use, this research is now focused on developing a novel consensus mechanism that incorporates orbit determination data to enable physical tasking. This mechanism is critical for ensuring reliable collaboration between spacecraft and supports the creation of a Decentralised Autonomous Community in Space, where governance is distributed and coordination is achieved through inter-satellite tasking. As satellite constellations continue to grow in scale and complexity, software won't just support their autonomy and resilience: it will define the future of sustainable space operations.
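As a toy illustration of trust as a software-defined property (a hypothetical sketch, not the consensus mechanism under development; the names and the two-thirds threshold are invented):

```python
# Illustrative only: a toy quorum rule for accepting a proposed
# inter-satellite tasking. Peers approve the task and independently
# check the proposer's claimed orbit against their own orbit data.
from dataclasses import dataclass

@dataclass
class Vote:
    satellite_id: str
    approves: bool          # does this peer endorse the proposed task?
    orbit_consistent: bool  # does the proposer's claimed orbit match
                            # this peer's own orbit-determination data?

def accept_tasking(votes: list[Vote], quorum: float = 2 / 3) -> bool:
    """Accept only if a quorum of peers approve with consistent orbit data."""
    if not votes:
        return False
    approvals = sum(v.approves and v.orbit_consistent for v in votes)
    return approvals / len(votes) >= quorum

votes = [Vote("sat-A", True, True), Vote("sat-B", True, True),
         Vote("sat-C", False, True)]
print(accept_tasking(votes))  # True: 2 of 3 peers approve with orbit agreement
```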
  12:50  Code and Convention: Policy, Power and the Making of International Cybercrime Law (5m)
As cybercrime grows in scale and complexity, shaping international law to combat it requires more than technical expertise: it demands diplomatic finesse, legal imagination and, most importantly, global cooperation. This poster offers a look into the making of the UN Cybercrime Treaty 2024, the first major binding international legal instrument on cybercrime.
As a member of INTERPOL's diplomatic mission, I was part of the team involved in shaping the treaty text. I explore how legal language, political pressure and technical realities collide in the negotiation room, and how global policy on issues like data access, jurisdiction, encryption and dual criminality is formed.
This poster is aimed at making international law accessible to non-lawyers, especially to those in computing, by unpacking how technical decisions influence legal standards, and how the language of policy can empower or constrain technological innovation. The poster invites those in the computing field to see themselves as not only builders of systems but also shapers of the laws that govern them.
The poster aims to inspire dialogue between technical and policy communities and show how diverse voices can help close the gap between the developers of digital tools and the designers of the global rules that govern their use.
  12:55  Missing Names: Women Inventors in Computing (5m)
Women have been at the forefront of computing since the beginning. But where are all the women inventors? Only 13% of inventors worldwide are women, and the numbers are worse in ICT (Information and Communications Technology). This poster explores why women are underrepresented as inventors in software and AI, and how intellectual property (IP) can be a powerful tool for recognition and career growth. It also offers accessible tips for identifying patentable ideas, making IP feel less like legal jargon and more like a career asset. Whether writing code, leading projects, or designing systems, women in computing deserve to be recognized as inventors. Let's change the narrative.
  13:00  FitBenchmarking: a tool for comparing numerical optimization algorithms (5m)
Science often uses mathematics to represent physical processes and behaviour, so it is important to know the values of the parameters used within these models. For example, a parameter may relate to the internal magnetic field of a material. Nonlinear least squares fitting is the most common way of determining these parameters, providing both the values and their errors, and is an important part of the scientific workflow across the National Facilities. FitBenchmarking (https://fitbenchmarking.github.io/) is an open-source Python package which takes data and models from real-world applications and data analysis packages. It fits the model parameters to the data by casting this as a nonlinear least-squares problem, producing several tables and reports as outputs that allow users to compare different optimization algorithms across metrics like runtime, accuracy, and energy usage. This poster will introduce FitBenchmarking and highlight some of our recently added features.
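For readers unfamiliar with the underlying task, here is a minimal sketch of nonlinear least-squares fitting using SciPy directly (this is not FitBenchmarking's API; the exponential model and synthetic data are invented for illustration):

```python
# Minimal nonlinear least-squares fit of the kind FitBenchmarking
# benchmarks. Model and data are invented for this sketch.
import numpy as np
from scipy.optimize import least_squares

def model(params, x):
    a, b = params
    return a * np.exp(-b * x)

rng = np.random.default_rng(0)
x = np.linspace(0, 5, 50)
y = model([2.5, 1.3], x) + 0.05 * rng.normal(size=x.size)

def residuals(params):
    return model(params, x) - y

fit = least_squares(residuals, x0=[1.0, 1.0])
print(fit.x)  # fitted (a, b), close to the true values [2.5, 1.3]
```

FitBenchmarking automates this loop across many solvers and problem sets, then reports the comparison tables described above.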
  13:05  Multi-modal MAML: Revisiting Feature Fusion for Discriminative Generalization and Class Distribution (5m)
Class distribution methods determine how classes are allocated across the meta-training, meta-validation, and meta-test sets. These methods play a critical role in influencing the generalization ability of various meta-learning algorithms, including Prototypical Networks and Model-Agnostic Meta-Learning (MAML), particularly when these models are trained from scratch on small datasets. Focusing on MAML, we hypothesize that the model fails to learn class-discriminative features on small datasets, thereby limiting its generalization performance. To address this limitation, we propose leveraging data fusion to enhance the quality of data by producing more discriminative features.
This paper introduces a novel extension to MAML, referred to as Multi-modal MAML, that incorporates multi-modality techniques by integrating two data modalities: images and text. Although previous research has emphasized the challenges of training MAML on multi-modal data, our findings indicate that performance is significantly influenced by several key factors: tensor size, textual feature extraction techniques, and the type of fusion employed. We systematically investigate these factors by conducting experiments with three different tensor sizes, three textual feature extraction methods, and two fusion strategies (intermediate and late linear fusion) to evaluate how each combination affects MAML's ability to generalize. These experiments also test whether time complexity limits MAML's ability to learn more discriminative features. Finally, we assess whether the proposed Multi-modal MAML can mitigate the impact of class distribution.
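As a rough sketch of one of the fusion strategies mentioned, late linear fusion (an illustrative PyTorch snippet; the feature dimensions and heads are assumptions, not the paper's architecture):

```python
# Illustrative late linear fusion: classify each modality separately,
# then combine the logits. Dimensions are invented for the sketch.
import torch
import torch.nn as nn

class LateLinearFusion(nn.Module):
    def __init__(self, img_dim=512, txt_dim=300, n_classes=5):
        super().__init__()
        self.img_head = nn.Linear(img_dim, n_classes)
        self.txt_head = nn.Linear(txt_dim, n_classes)

    def forward(self, img_feat, txt_feat):
        return self.img_head(img_feat) + self.txt_head(txt_feat)

model = LateLinearFusion()
img = torch.randn(4, 512)  # e.g. image embeddings from a CNN backbone
txt = torch.randn(4, 300)  # e.g. averaged word-vector text embeddings
print(model(img, txt).shape)  # torch.Size([4, 5])
```

Intermediate fusion would instead merge the two feature vectors before a shared classification head.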
  13:10  Mathematical thinking in fusion engineering (5m)
At seventeen, I could not decide between pursuing a maths or physics degree. My parents had not attended university, and my sixth form had little experience supporting STEM applications. So, the school brought in a careers advisor who asked a simple question: which subject is your favourite? I said maths. He recommended physics.
Now twenty-five, I am a graduate computational physicist working in fusion research. Fusion engineering asks, how do we make and sustain an operating fusion power plant? The scientific research that informs this engineering asks, what matters most? What do we need to investigate, understand, model and eventually predict?
But when I joined the graduate scheme two years ago, my theoretical physics background gave me a different perspective. In nature there exist many beautiful patterns, however big or small, and there exists some mathematical theory that describes each one best. Such theory is not aimless, but it need not be practically motivated either.
In this poster, I share my technical journey, from theoretical physics to engineering science, and demonstrate how mathematical elegance fits into applied research. I also reflect on this journey as a woman of colour from a disadvantaged socio-economic background, and lessons I have learned along the way.
  13:15  Robot Navigation in Uncertain Environments (5m)
Autonomy is a key factor that gives robotics the potential to be a world-shaping tool. One of the challenges we tackle as researchers is how to create robots that can make good decisions in the face of uncertainty. Robots often need to navigate environments where paths may be blocked or unknown, like forest trails hidden by dense terrain or warehouses that change layout as stock gets moved around. Real-world problems are big and difficult to plan for, especially when a robot has limited computing power. In this work, we took a robot into a forest and tasked it with navigating a complex, partially unknown environment where obstacles might be unseen. We developed a planning method that creates compact, reusable decision-making policies in advance. This means the robot can change tactics when its path turns out to be blocked, even when it cannot plan on the fly. In this scenario, our approach achieved a 95% success rate, while other methods had much lower success or entirely failed to create a plan under the same time limits. This work is important in paving the way towards reliable robot navigation in complex, uncertain environments with limited computational resources.
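A toy illustration of what a compact, precomputed policy can look like (entirely hypothetical; the waypoints, observations, and actions are invented, not the authors' planner):

```python
# Toy precomputed navigation policy: the robot looks up its next move for
# each (waypoint, observation) pair instead of replanning online.
# The map and actions are invented for this sketch.
policy = {
    ("gate", "path_clear"):    "go_trail",
    ("gate", "path_blocked"):  "go_detour",
    ("trail", "path_clear"):   "go_goal",
    ("trail", "path_blocked"): "backtrack_to_gate",
}

def next_action(waypoint: str, observation: str) -> str:
    # Constant-time lookup: a cheap tactic change when a path is blocked.
    return policy[(waypoint, observation)]

print(next_action("gate", "path_blocked"))  # go_detour
```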
  13:20  GPU offloads for gravity calculations in the SWIFT cosmology code (5m)
To be compatible with modern heterogeneous HPC systems, large astronomy codes need to move towards GPU support. SWIFT (SPH With Inter-dependent Fine-grained Tasking) is a versatile, open-source astronomy code used for a range of research areas including galaxy formation, planetary impacts, and cosmology. A significant portion of SWIFT's runtime is dedicated to gravity calculations. In gravity n-body codes, each particle (representing a celestial object) interacts with every other particle through gravitational forces, making the calculations computationally intensive. However, the repetitive and non-interdependent nature of these n-body interactions makes them ideal candidates for GPU acceleration.
In this work, we build on the existing SWIFT code by replacing specific CPU-based gravity calculation functions with new GPU kernels, minimizing disruption to the rest of the code while preserving the task-based parallelism. This creates a new hybrid C and CUDA version of the code which transfers gravity calculations to the GPU, freeing the CPU to carry out other tasks. Our GPU-accelerated gravity kernels achieve high accuracy, with less than 1% deviation from CPU results below the Nyquist frequency. Furthermore, the utilisation of GPUs allows for a redistribution of the gravity calculations, making the results more accurate.
Although we currently face a memory transfer bottleneck, optimization efforts using CUDA atomics and streams have shown promising improvements. Future work will focus on eliminating this bottleneck, further integrating GPU offloading into SWIFT’s task system, and leveraging additional GPU features to achieve an overall performance boost for the SWIFT code.
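To make the pairwise, non-interdependent structure of the computation concrete, here is a direct-summation sketch in plain NumPy (SWIFT's real kernels are C/CUDA and far more sophisticated; units and softening are simplified):

```python
# Direct-summation gravitational accelerations: every particle interacts
# with every other, which is why the kernel maps so well onto GPUs.
import numpy as np

def accelerations(pos, mass, G=1.0, soft=1e-3):
    # pos: (N, 3) positions; mass: (N,) masses
    diff = pos[None, :, :] - pos[:, None, :]            # (N, N, 3) separations
    dist3 = (np.sum(diff**2, axis=-1) + soft**2) ** 1.5
    np.fill_diagonal(dist3, np.inf)                     # exclude self-interaction
    return G * np.sum(mass[None, :, None] * diff / dist3[:, :, None], axis=1)

rng = np.random.default_rng(0)
pos = rng.normal(size=(100, 3))
mass = rng.uniform(0.5, 1.5, size=100)
print(accelerations(pos, mass).shape)  # (100, 3): one acceleration per particle
```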
  13:25  Development of a magnetohydrodynamic solver for high-fidelity numerical modelling of liquid metals (10m)
One of the key engineering challenges in fusion reactor development lies within the breeding blanket system. In several next-generation designs, including those proposed in STEP, liquid metals such as lithium-lead are employed both as tritium breeding media and as coolants. While these materials offer advantages in thermal efficiency and radiation shielding, they introduce significant complexity due to magnetohydrodynamic (MHD) effects. As these electrically conductive fluids flow through the reactor’s intense magnetic fields, Lorentz forces suppress turbulence and fundamentally alter heat and momentum transfer. Understanding and accurately predicting MHD turbulence under these conditions is critical to the safe, efficient design of fusion components. This poster presents numerical solver development for MHD flow and investigation of MHD turbulence in liquid metal flows through high-fidelity numerical simulation.
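For orientation, the Lorentz-force coupling the poster refers to appears in the standard incompressible MHD momentum equation (a textbook statement included here for context, not an equation from the poster):

```latex
% Incompressible MHD momentum equation; the J x B (Lorentz force) term is
% what suppresses turbulence in liquid-metal flows under strong fields.
\rho\left(\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}\right)
  = -\nabla p + \mu \nabla^{2}\mathbf{u} + \mathbf{J}\times\mathbf{B}
```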
  13:35  Accelerating Design Space Exploration Through High Performance Compute Enabled Simulation (5m)
The Centre for Modelling & Simulation (CFMS) is an independent Research & Technology Organisation, based in Bristol, specialising in accelerating industry through modern digital engineering techniques. One area of focus is developing models and simulations to accelerate the design of novel complex systems, which requires a deep understanding of the impact that design parameters have on the system. This poster details CFMS' approach to design space exploration, which couples efficient sampling methods, High Performance Compute (HPC) scale simulations, and automation tools and techniques for model execution. The application of this approach to two examples will be presented: first, validating candidate designs and optimising the process for large-scale additive manufacturing of components that can cost upwards of £100k and take weeks to build; and second, multiscale modelling of composite assemblies for uncertainty quantification, to understand the effect that manufacturing variation at the microscale can have on part performance at the macroscale. Using HPC-enabled simulation for design space exploration allows engineers to get more information faster, accelerating the design process and finding novel solutions not previously considered.
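As an illustration of the efficient sampling step such workflows begin with (a generic SciPy Latin hypercube sketch; the parameter names and bounds are invented, and this is not CFMS's toolchain):

```python
# Latin hypercube sample of a 3-parameter design space: a space-filling
# plan of the kind used to seed HPC simulation campaigns.
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=0)
unit_samples = sampler.random(n=32)        # 32 points in [0, 1)^3
lower = [0.1, 200.0, 5.0]                  # invented bounds, e.g.
upper = [0.5, 400.0, 50.0]                 # layer height, temperature, speed
designs = qmc.scale(unit_samples, lower, upper)
print(designs[:3])  # first three candidate designs to send to simulation
```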
  13:40  Towards a Unified Lakehouse Platform for PSDI (5m)
PSDI is the UK's nationally funded programme that provides tools and services to help researchers in the physical sciences find, share, and process data, with the explicit aim of accelerating scientific discovery and innovation. In PSDI, we work with diverse data from various sources. One of the key challenges we face is managing big data while maintaining flexibility in handling both raw and complex data in low-cost storage, and addressing issues related to data governance, performance, and consistency. To truly empower the scientific community, this data must be usable for both analytics and cutting-edge AI/ML applications.
To tackle this, we will design and build a ‘data lakehouse’ on low-cost object storage. This architecture combines the flexibility of a data lake with the transactional consistency and query performance of a data warehouse. Raw datasets from different data sources will be ingested into object storage and transformed into a common, open format like Apache Parquet, enabling efficient analytics. These datasets will then be registered as Apache Iceberg tables in a metadata catalog (e.g., Lakekeeper or Apache Polaris) to manage schema and ensure consistency. By providing a unified, governed platform with a powerful query engine (e.g., Trino, DuckDB, or Spark), this lakehouse will make diverse data more Findable, Accessible, Interoperable, and Reusable (FAIR). Ultimately, this work will make it easier for the scientific community to exploit this data for new insights and discoveries.
Speaker: Amali Pawula Hewage (UKRI - STFC)
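A minimal sketch of the pattern described above, writing a dataset to Parquet and querying it in place (file name and data invented; PSDI's actual catalog, table format, and engines may differ):

```python
# Lakehouse building blocks in miniature: an open columnar format
# (Parquet) queried directly by an analytics engine (DuckDB).
import duckdb
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table({"sample_id": [1, 2, 3],
                  "measurement": [0.42, 0.57, 0.61]})
pq.write_table(table, "measurements.parquet")  # raw data in an open format

result = duckdb.query(
    "SELECT avg(measurement) AS mean_value FROM 'measurements.parquet'"
).fetchall()
print(result)  # [(0.533...,)]
```

In the full design, such tables would additionally be registered as Apache Iceberg tables in a metadata catalog for schema management and transactional consistency.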
13:45
The Unity Measure: On Unifying The Fairness Metrics Landscape in Machine Learning. 5m
This research project was inspired by the challenges of integrating the social and legal concept of fairness within the technical domain of Machine Learning. Industries including the criminal justice system, healthcare, and finance have eagerly utilised the power of automated decision-making tools to support high-stakes decision-making tasks. Hence, now more than ever, there is a pressing need to ensure that these tools are assessed under strict fairness criteria. These criteria are expressed as fairness metrics. They quantify the notion of fairness and strive to mitigate unwanted biases within a system, to deter outcomes that prompt or inflate discriminatory treatment towards historically marginalised groups. The research therefore aimed to develop a fairness metric that provides a unified and comprehensive bias assessment. The objectives included using an axiomatic approach to scientifically assess what an ideal metric should measure and do; extracting core components from existing metrics and adapting them into a simple mathematical formulation for the new metric; and designing an experimental setup that tested the metric's behaviour against existing metrics. The result was the Unity Measure: a metric that unifies the framework by incorporating the individual benefit function from the Generalised Entropy Index, combined with group weighting. The experimental results show that this metric is more sensitive than existing metrics in detecting skews in dataset distributions, and that its score is interpretable and insightful. This novel metric is therefore a key step towards unifying the fairness metrics landscape.
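For orientation, the Generalised Entropy Index component mentioned above can be sketched as follows (a generic implementation using the common benefit definition b_i = y_pred - y_true + 1; the data are invented and this is not the Unity Measure itself):

```python
# Generalised Entropy Index over per-individual "benefits"
# (b_i = y_pred - y_true + 1): the component the Unity Measure builds on.
# Generic sketch with invented data; not the Unity Measure itself.
import numpy as np

def generalized_entropy_index(benefits, alpha=2.0):
    b = np.asarray(benefits, dtype=float)
    mu = b.mean()
    return np.mean((b / mu) ** alpha - 1) / (alpha * (alpha - 1))

y_true = np.array([1, 0, 1, 1, 0])
y_pred = np.array([1, 1, 0, 1, 0])
benefits = y_pred - y_true + 1  # 1 = correct, 2 = false positive, 0 = false negative
print(generalized_entropy_index(benefits))  # 0 only when benefits are perfectly equal
```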
  13:50  Does The Ditchley Foundation invite individuals recommended by other contacts? (5m)
The main aim during my internship with the Ditchley Foundation was to work with the graph database Neo4j and the software NeoDash, both of which use the graph query language Cypher. As my host organisation runs invite-only events, the specific question I wanted to investigate was "Does Ditchley invite individuals recommended by other contacts?" I chose this in order to begin developing an attendance prediction model for my final individual project. Neo4j was particularly useful for presenting specific parts of the database as a knowledge graph, allowing me to visualise my queries and their output. My investigation mainly involved the 'Contact' and 'Conversation' nodes and the 'recommends_person' relationship, across which I ran numerous queries to surface general patterns in the data; for example, success and attendance rates of different types of events hosted at Ditchley (virtual, hybrid, conference, dinner, etc.) or which gatherings generated the most ideas from attendees. Such research is crucial for organisations like Ditchley, as it can be used for pattern recognition within their databases, for suggesting people to invite to events, or for growing the network overall. Furthermore, it will enrich future events by ensuring higher participation rates, thus circulating more ideas and diversifying current partakers.
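As a rough illustration of the kind of query involved (the 'Contact' and 'Conversation' labels and the 'recommends_person' relationship come from the project description above; the 'ATTENDED' relationship, property names, and credentials are invented):

```python
# Hypothetical Cypher query run through the official Neo4j Python driver.
# Node labels and 'recommends_person' are from the project description;
# 'ATTENDED', property names, and credentials are invented.
from neo4j import GraphDatabase

query = """
MATCH (a:Contact)-[:recommends_person]->(b:Contact)-[:ATTENDED]->(c:Conversation)
RETURN b.name AS recommended, count(c) AS events_attended
ORDER BY events_attended DESC
LIMIT 10
"""

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))
with driver.session() as session:
    for record in session.run(query):
        print(record["recommended"], record["events_attended"])
driver.close()
```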
  13:55  From Boole to Binary: The Hitchhiker's Guide to Compilers (5m)
As humans, we use language as a tool to reason about the world around us. In 1854, George Boole showed that all linguistic logical operations can be fully conveyed as mathematical expressions, with one catch: only in the domain of 1 and 0. This insight paved the way for electronic computers, and is why programming languages consist of precise logical expressions. This poster presents a handwritten compiler (a computer program that converts higher-level languages into machine code) built as a graduate development project in the C programming language. The compiler translates a small, strictly-typed imperative language into assembly, and is designed to make our computers' everyday translation of logic into numbers accessible to non-specialists.
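Boole's insight can be made concrete in a few lines (the standard arithmetic encodings of the logical operations, shown for illustration rather than taken from the poster's compiler):

```python
# Boole's reduction of logic to arithmetic over {0, 1}.
def NOT(x):    return 1 - x
def AND(x, y): return x * y
def OR(x, y):  return x + y - x * y  # inclusive or

for x in (0, 1):
    for y in (0, 1):
        print(f"x={x} y={y}  AND={AND(x, y)}  OR={OR(x, y)}  NOT x={NOT(x)}")
```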
  14:00  The world of coding Feynman Integrals (5m)
In particle physics, experiments currently running at CERN, Fermilab, PSI and elsewhere look for evidence of new physics. Many of their results show deviations between experiment and theoretical prediction. We therefore need input from the theory side to better understand our current Standard Model – the theory that describes the interactions between the fundamental particles of our Universe.
This relies heavily on the evaluation of Feynman diagrams – diagrammatic representations of the mathematical expressions that describe interactions between particles. To meet the precision demanded by the experiments, Feynman integrals need to be evaluated, and their number increases exponentially as we go for more precise computations.
The evaluation of these Feynman integrals requires a strong algorithmic component and is data intensive. The aim of this poster is to show how mathematical techniques such as diagram generation, amplitude generation, tensor decomposition, partial fractioning, integration by parts and analytical results for integrals – fundamental for our calculations – can be computationally implemented.
We will cover the building blocks of the code developed during my PhD to compute Feynman amplitudes. We will also show examples of code developed in the particle physics community in software packages such as Mathematica, FORM and C++.
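One of the techniques listed, partial fractioning, is easy to illustrate (a generic SymPy example on a made-up rational function, not the PhD code):

```python
# Partial fractioning with SymPy: decompose a rational function into
# simpler terms, as done symbolically in Feynman-integral workflows.
import sympy as sp

x = sp.symbols("x")
expr = 1 / ((x + 1) * (x + 2))
print(sp.apart(expr, x))  # -1/(x + 2) + 1/(x + 1)
```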
  14:05  Global-Local Information Fusion for Vision-Based Species Classification Models (5m)
Training computer vision models for animal biodiversity monitoring tasks demands a vast amount of labelled data for the target ecosystem. While raw, unlabelled data can be readily acquired on a large scale through camera trap systems deployed across the globe, labelling it all with species information requires significant expert effort, leading to a bottleneck in the pipeline. Since the labels need to be specific to the local downstream use-case, simply running a large pre-existing species classifier on the unseen images is often insufficient, due to (i) distinct ecosystem environments being hard to generalise across and (ii) certain classes inevitably not being present in the given model's knowledge base, as a result of its limited training set and the fact that each ecosystem has a unique species distribution (which is also typically long-tailed in nature). We therefore propose a model-agnostic and ecosystem-agnostic approach to tackle such labelling dilemmas by lifting information from a "global" model, before "mapping" the information to the label environment of the local model. The probabilistic pseudo-label outputs can then be used for training without any additional manual labour. Our experiments on a public, expert-labelled regional dataset demonstrate a boost of over 10% in overall local-model accuracy for data regimes with a low proportion of ground-truth labels, with no loss in average per-class accuracy. We conclude that transferring information between models, without label-convention consistency or for out-of-distribution classes, can optimise the development of species classification tools for animals and ecosystems that are under-represented in "global" models.
Speaker: Rachael Laidlaw (University of Bristol)
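A toy version of the "mapping" step described above (invented label sets and a hand-built mapping matrix; the actual approach is model- and ecosystem-agnostic and more sophisticated):

```python
# Lifting a "global" classifier's probabilities into a "local" label
# space via a mapping matrix, yielding a probabilistic pseudo-label.
# Label sets and the mapping are invented for this sketch.
import numpy as np

global_labels = ["red_deer", "roe_deer", "red_fox", "badger"]
local_labels = ["deer", "fox"]

# M[i, j] = 1 where global class j corresponds to local class i.
M = np.array([[1, 1, 0, 0],   # deer <- red_deer, roe_deer
              [0, 0, 1, 0]])  # fox  <- red_fox

p_global = np.array([0.50, 0.20, 0.25, 0.05])  # global model output, one image
p_local = M @ p_global
p_local = p_local / p_local.sum()              # renormalise over local classes
print(dict(zip(local_labels, p_local.round(3))))  # {'deer': 0.737, 'fox': 0.263}
```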
14:45 → 15:15  Afternoon Tea / Coffee (30m), Zeta Room (Conference Centre)
15:15 → 16:30  Afternoon Session, John Adams Lecture Theatre (Conference Centre)
  15:15  Overcoming Barriers to Computing Access in Undertaking Nuclear Safety: A Journey into Advancing Structural Integrity Through High-Performance Computing (15m)
As the UK advances toward a nuclear renaissance to meet clean energy goals, ensuring the structural integrity of reactor components, particularly in fusion energy, has become increasingly critical. Fusion materials must endure extreme environments, making those developed through fission research inadequate. Moreover, existing regulatory and compliance frameworks, largely based on fission standards, can impose overly conservative constraints, potentially hindering innovation in next-generation fusion technologies.
My research addresses this gap by developing high-fidelity finite element models that link microscale material behaviour to component-scale performance, helping to reduce overconservatism in current nuclear design codes. By leveraging the Isambard 3 supercomputing facility, an HPC system with over 55,000 CPU cores and GPU acceleration, these simulations can efficiently process large-scale data and complex deformation mechanisms, enabling faster and more detailed insights into material behaviour under extreme fusion-relevant conditions.
The technical journey of my PhD began with limited knowledge of supercomputing. Coming from a background where computing was neither emphasised in school nor easily accessible, and from an experimentalist training unfamiliar with the potential of high-performance computing, I started my research with minimal exposure to the digital tools that now underpin my work. Navigating a male-dominated field like structural integrity added another layer of challenge. My path reflects not only a steep personal learning curve, but also a broader issue: unequal access to computing opportunities.
This talk shares both the technical insights and personal journey behind HPC-enabled nuclear safety research, highlighting how inclusive access to computing can diversify the voices shaping our energy future.
Speaker: Wan Maisarah (University of Bristol)
  15:30  Join the Fight Towards Inclusivity with Robotics Inclusive (15m)
Robotics Inclusive (RI) is a UK-based, community-led organisation dedicated to advancing Inclusion, Diversity, Equity, and Accessibility (IDEA) within the robotics sector. Founded on the belief that innovation thrives in diverse environments, RI connects individuals across career stages, disciplines, and backgrounds to shape a more representative and equitable future for robotics.
Between 2023 and 2025, RI has served the robotics community through a wide array of events, from intimate roundtables on equitable hiring practices to broad-reaching networking events. Some of our initiatives include the “Headshots & Bots” series, career journey mapping, and early-career workshops at national robotics gatherings that combine professional development with community building. By partnering with institutions like the National Robotarium, UK-RAS, and leading universities, RI ensures IDEA principles are embedded in both research and practice.
By celebrating diverse pathways into robotics and fostering allyship across the community, Robotics Inclusive is helping to close the leaky talent pipeline and ensure robotics innovation benefits all of society. This talk will share our journey, highlight successful initiatives, and invite collaboration from those committed to building an inclusive robotics ecosystem.
Speakers: Alex Schutz (University of Oxford), Defne Eken (King's College London)
  15:45  Panel Discussion (40m)
  16:25  Closing Remarks (5m). Speaker: Helen Brooks (Advanced Engineering Simulation)