# The numbers behind the young researchers – 2022

# BLOG: Heidelberg Laureate Forum

One of the many activities taking place at the HLF is a boat trip, where the laureates and young researchers (and members of the press and blog team) all pile onto a large boat, which sails up the Neckar river for two hours, turns round and then comes back.

As well as a chance to see the wonderful views and sample delicious food, it’s also a chance to meet and talk to people – so we took the opportunity to collar some young researchers (as we’ve done on previous boat trips in 2017 and 2018) and ask them about the numbers they use in their research.

### Sorelle Murielle Toukam Tchoumegne – 100

Sorelle’s research is in optimal control theory, aiming to find *controls* that can be applied to stochastic differential equations to maximise the output. One application of her work could be in selecting whom to test for COVID in order to make the most efficient use of the tests. Sorelle explained that if you have **100** tests and 90 of them come back negative, then in effect you’ve wasted those 90 tests. However, by applying controls determined by Sorelle’s work, you could pick a sample to test that could return 50 or more positive results – a far better use of resources.

### Katharina Kann – 7,000

Natural Language Processing (NLP) sits at the intersection of computer science, artificial intelligence and linguistics, and looks at how computers can understand and generate human languages. Some famous examples of its use are machine translation and speech processing.

There are around **7,000** languages in the world, but currently we only have the capability to use NLP with about 100 of them. Katharina’s work aims to use deep learning to bridge this gap, focussing in particular on the indigenous languages of the Americas.

### Prachi Kashikar – 1024

Initially, Prachi was sceptical about whether she could choose a number to represent her work, which involves computer vision and specifically model compression methods. Tiny devices like remote cameras can collect data easily, and if the camera has an onboard computer, techniques like image classification and activity tracking can all take place on the device itself. This has the added bonus that the device doesn’t need to communicate with the cloud, so the latency is low, and it’s more secure.

The problem here is that a small device might not have a huge amount of storage space – but compressing the vast amounts of data collected by cameras requires clever algorithms. Prachi has proposed a novel floating-point storage format, which means she can reduce the amount of storage needed from gigabytes to megabytes, and even down to kilobytes – each step down representing compression by a factor of **1024**, which we decided was a good number to represent her work.
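The factor of 1024 comes straight from the binary definitions of the storage units – a quick illustration of the arithmetic (this is just the unit conversion, not Prachi’s actual storage format):

```python
# Binary storage units: each step down divides the size by 2**10 = 1024.
KIB = 1024        # bytes in a kibibyte
MIB = 1024 ** 2   # bytes in a mebibyte
GIB = 1024 ** 3   # bytes in a gibibyte

# Going from gigabytes to megabytes is one factor of 1024,
# and megabytes to kilobytes is another:
print(GIB // MIB)  # 1024
print(MIB // KIB)  # 1024
```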

### Diego Millán Berdasco – 4

Diego’s research concerns abstract algebraic structures called Specht modules, which are representations of symmetric groups, and is connected to a result called James’ conjecture. A composition series is a way of breaking up a module into simpler pieces called composition factors. Diego is working on a conjecture stating that, assuming James’ conjecture holds, the maximum number of composition factors a Specht module may have is related to the Catalan numbers (which you may recall were featured in a previous Numbers Behind the Young Researchers post).
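As an aside, the Catalan numbers have a tidy closed form, $C_n = \frac{1}{n+1}\binom{2n}{n}$, giving the sequence 1, 1, 2, 5, 14, 42, … – a quick sketch of computing them:

```python
from math import comb

def catalan(n: int) -> int:
    """The n-th Catalan number, C_n = C(2n, n) / (n + 1)."""
    return comb(2 * n, n) // (n + 1)

print([catalan(n) for n in range(8)])  # [1, 1, 2, 5, 14, 42, 132, 429]
```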

This result has been shown to hold for certain weights of these representations, and is known to fail from a certain point onwards (the smallest case for which it’s known not to hold lies somewhere around 900) – but Diego has proved it up to weight 3, and maybe **4**, which he’s chosen as his number.

### Rakshit Mittal – $\infty$

Rakshit has just finished a Master’s course and is about to go to Belgium to start on a PhD. His work is in modelling – describing large complicated systems with mathematical models which describe what the system is doing, while minimising the amount of computation time needed. One of his models is the brilliantly named Modified-Maximum-Mean-Minimum (MoMaMeMi) filter, which simulates complex frequency-domain filters (used in image processing).

Multi-paradigm modelling involves trying to model as many aspects of a complex system as possible – the electronics, mechanics, flow control and so on, and a model like this can take into account the relationships between the different aspects of the system and the information that’s shared between them. Rakshit’s ultimate goal is to model as many different things as possible – to model everything, which is why he’s picked the ‘number’ infinity.

### Ishita Jain – 5

Ishita is a pre-Master’s student at the University of Delhi where she’s studying the transmission dynamics of the H1N1 influenza virus. She’s working on expanding the SEIR model, which has 4 compartments, to an age-structured SVEIR model – the number **5** representing the five compartments of the model: Susceptible (S), Vaccinated (V), Exposed (E), Infected (I) and Recovered (R).
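As a rough illustration of how such a compartmental model works (the flows and rates below are generic placeholders, not Ishita’s actual age-structured model), a five-compartment SVEIR system can be stepped forward in time like this:

```python
# A minimal, hypothetical SVEIR sketch: five compartments advanced by
# forward-Euler steps. All rates are illustrative placeholders.
def sveir_step(S, V, E, I, R, beta=0.3, nu=0.01, sigma=0.2,
               gamma=0.1, eps=0.1, N=1.0, dt=0.1):
    """Advance the five compartments by one Euler step of size dt."""
    new_inf_S = beta * S * I / N        # infections among susceptibles
    new_inf_V = eps * beta * V * I / N  # reduced infections among vaccinated
    dS = -new_inf_S - nu * S            # vaccination moves S -> V
    dV = nu * S - new_inf_V
    dE = new_inf_S + new_inf_V - sigma * E   # exposed become infectious
    dI = sigma * E - gamma * I               # infectious recover
    dR = gamma * I
    return (S + dt * dS, V + dt * dV, E + dt * dE,
            I + dt * dI, R + dt * dR)

state = (0.99, 0.0, 0.0, 0.01, 0.0)  # start with 1% infected
for _ in range(1000):
    state = sveir_step(*state)
print(round(sum(state), 6))  # total population is conserved: 1.0
```

Because every term that leaves one compartment enters another, the five derivatives sum to zero and the total population is conserved – a useful sanity check for any compartmental model.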

### Maicom Douglas Varella Costa – 39

Maicom works in algebraic geometry, computing the values of invariants – mathematical quantities associated with particular objects which allow us to understand and distinguish them. In particular, he’s working on invariants of singularities of spaces which come from polynomials and are similar to manifolds. Computing these invariants can be difficult, and Maicom is working on formulae to express the invariants more easily, using approaches from combinatorics and objects called Newton polyhedra, which can be thought of as having a monomial of the polynomial at each vertex.

He told us the story behind his number – after 8 pages of hand calculation he worked out a value for one invariant as **39**. But having applied his simplifying formula and run it through a computer, he found the real answer, which was actually -1. That’s what research is like sometimes!

### Celia Rubio Madrigal – 10010110

Celia, an undergraduate student at the Universidad Complutense de Madrid, has written theses in both Mathematics and Computer Science. Her work involves inputting boolean functions into neural networks to determine their complexity and try to chip away at the famous P vs NP problem.

One of the input functions she studies is the parity function, which adds up the binary digits of a given number modulo 2. When given the input (7, 6, 5, 4, 3, 2, 1, 0), this gives the output (1, 0, 0, 1, 0, 1, 1, 0) – hence Celia has chosen the number **10010110**.
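The parity function is easy to sketch in code – the following reproduces Celia’s number:

```python
def parity(n: int) -> int:
    """Sum of the binary digits of n, mod 2."""
    return bin(n).count("1") % 2

# Apply parity to the inputs 7 down to 0 and read off the bits:
bits = [parity(n) for n in (7, 6, 5, 4, 3, 2, 1, 0)]
print("".join(map(str, bits)))  # 10010110
```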

This article was co-written by Katie Steckles and Sophie Maclean. Sophie is a mathematician and maths communicator based in London. She has previously worked as a Quantitative Trader and a Software Engineer, and now gives mathematics talks all over the UK (and Europe!) on a variety of topics. She is also a member of the team behind Chalkdust Magazine and always has a project on the go! You can follow her on Twitter at @sophiethemathmo.