Daniela Kühn was awarded a PhD in Mathematics by the University of Hamburg in 2001. She worked as a postdoc in Hamburg and Berlin. She then started as a lecturer at the University of Birmingham, where she founded a thriving Combinatorics research group together with Deryk Osthus. In 2010 she was appointed to the Mason Chair of Mathematics at Birmingham. She has received several research grants from the EPSRC and from Europe, including an ERC Starting Grant and an ERC Advanced Grant. Jointly with Deryk Osthus, she was awarded the European Prize in Combinatorics in 2003 and the Whitehead Prize of the London Mathematical Society in 2014. Further recognition for her research includes an invited lecture at the International Congress of Mathematicians in 2014 as well as a Royal Society Research Merit Award in 2015. Her research interests lie in Extremal and Probabilistic Combinatorics.
My main area of research is Combinatorics, with a particular emphasis on probabilistic and extremal questions. Combinatorics focuses on the study of (finite) structures which take on discrete values. It underpins the world of information, such as computers, communication networks and the internet. Combinatorics is a rapidly expanding field of Mathematics with connections to many areas, e.g. Probability, Analysis, Number Theory, Theoretical Computer Science and Statistical Physics. It also plays a key role in the study of complex networks arising in nature.
In recent years Combinatorics has been transformed by the introduction and development of probabilistic methods. The basic idea behind the probabilistic method is deceptively simple: to prove that a desired configuration exists, one sets up an appropriate probability space and shows that a random object has the desired property with positive probability. This approach has by now yielded a sophisticated array of tools involving differential equations, martingales, large deviation bounds and quasirandom partitions.
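The first-moment version of this idea can be made concrete in a few lines. The following Python sketch is an illustrative textbook example (Erdős's argument, not taken from the work described here): if the expected number of monochromatic copies of K_4 in a uniformly random 2-colouring of the edges of K_6 is below 1, then some colouring has none, and random sampling finds one quickly.

```python
import itertools
import random
from math import comb

def mono_cliques(n, k, colour):
    """Count monochromatic k-cliques of K_n under an edge 2-colouring."""
    count = 0
    for clique in itertools.combinations(range(n), k):
        cols = {colour[e] for e in itertools.combinations(clique, 2)}
        if len(cols) == 1:
            count += 1
    return count

def first_moment_bound(n, k):
    """E[# monochromatic K_k] = C(n, k) * 2^(1 - C(k, 2))."""
    return comb(n, k) * 2 ** (1 - comb(k, 2))

def find_good_colouring(n, k, seed=0):
    """If the expectation is below 1, a colouring with no monochromatic
    K_k exists with probability bounded away from 0, so random sampling
    terminates quickly."""
    rng = random.Random(seed)
    edges = list(itertools.combinations(range(n), 2))
    while True:
        colour = {e: rng.randint(0, 1) for e in edges}
        if mono_cliques(n, k, colour) == 0:
            return colour

n, k = 6, 4
assert first_moment_bound(n, k) < 1  # 15/32, so a good colouring exists
colouring = find_good_colouring(n, k)
print(mono_cliques(n, k, colouring))  # 0
```

The point of the sketch is the logic of the expectation bound, not the sampling: once E[X] < 1 for the count X of bad substructures, the event X = 0 has positive probability.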
One direction I am currently working on concerns decomposition problems and design theory. Roughly speaking, the aim here is to partition a large structure into many small and simple ones. This has applications in many fields, e.g. coding theory, statistics, scheduling and network architecture. Many problems in design theory can be formulated using graphs and uniform hypergraphs. Here a graph consists of a set of vertices, some pairs of which are connected by edges; more generally, each edge of an r-uniform hypergraph spans r vertices. A classical result of Kirkman guarantees the existence of so-called Steiner triple systems under appropriate divisibility conditions: in graph-theoretic terms, this guarantees a decomposition of the edge set of a complete graph into edge-disjoint triangles. A recent breakthrough solves the existence problem for (complete) hypergraph designs going back to the 19th century: it guarantees a decomposition of any large complete hypergraph into complete hypergraphs of given (small) size, again assuming appropriate divisibility conditions.
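The smallest nontrivial Steiner triple system, the Fano plane on 7 points, makes Kirkman's theorem concrete. The Python sketch below (the triples are standard; the verification function and its name are my own illustration) checks that its seven triples decompose the edge set of K_7 into edge-disjoint triangles.

```python
import itertools

# The Fano plane: a Steiner triple system on 7 points, i.e. a
# decomposition of the edge set of K_7 into 7 edge-disjoint triangles.
FANO = [(0, 1, 2), (0, 3, 4), (0, 5, 6), (1, 3, 5),
        (1, 4, 6), (2, 3, 6), (2, 4, 5)]

def is_triangle_decomposition(n, triples):
    """Check that every edge of K_n lies in exactly one triple."""
    covered = [frozenset(p) for t in triples
               for p in itertools.combinations(t, 2)]
    all_edges = {frozenset(p) for p in itertools.combinations(range(n), 2)}
    return len(covered) == len(set(covered)) and set(covered) == all_edges

print(is_triangle_decomposition(7, FANO))  # True
```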
We (in joint work with Glock, Lo and Osthus) developed a novel probabilistic approach to prove the existence of designs in large hypergraphs whose clique distribution satisfies appropriate uniformity constraints. As a special case, this gives a new proof of the existence of (complete) hypergraph designs. Building on this, we resolved the existence problem for F-designs for arbitrary r-uniform hypergraphs F: this shows that given any r-uniform hypergraph F, the trivially necessary divisibility conditions are sufficient to guarantee a decomposition of any sufficiently large complete r-uniform hypergraph into edge-disjoint copies of F, which generalizes a classical theorem of Wilson from graphs to hypergraphs. It subsequently also allowed us to prove a conjecture of Chung, Graham and Wilson on Euler tours in hypergraphs (joint work with Glock, Joos and Osthus).
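In the graph case (r = 2, i.e. Wilson's theorem) the necessary divisibility conditions take a simple concrete form: e(F) must divide e(K_n), and the greatest common divisor of the degrees of F must divide n - 1. The following Python sketch (function name hypothetical) checks these conditions and, for F a triangle, recovers Kirkman's condition n = 1, 3 (mod 6).

```python
from math import comb, gcd
from functools import reduce

def divisibility_ok(n, edges):
    """Necessary divisibility conditions for decomposing K_n into copies
    of the graph F given by `edges`: e(F) divides e(K_n) = C(n, 2), and
    the gcd of the degrees of F divides n - 1."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    d = reduce(gcd, deg.values())
    return comb(n, 2) % len(edges) == 0 and (n - 1) % d == 0

# For F = K_3 this recovers Kirkman's condition n = 1, 3 (mod 6):
triangle = [(0, 1), (0, 2), (1, 2)]
admissible = [n for n in range(3, 22) if divisibility_ok(n, triangle)]
print(admissible)  # [3, 7, 9, 13, 15, 19, 21]
```

The content of the theorems above is that for sufficiently large complete (hyper)graphs these trivially necessary conditions are also sufficient.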
Our methods for the above results involve a novel "iterated absorption" approach, which has had many further applications: for instance, we used it to prove the notorious tree packing conjecture of Gyárfás and Lehel for all bounded-degree trees (joint work with Joos, Kim and Osthus). This conjecture asserts that given a sequence of n trees, where the i-th tree has i vertices, one can pack these trees edge-disjointly into the complete graph on n vertices.
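For small n the conjecture can be checked directly by exhaustive search. The Python sketch below is a brute-force illustration only (the actual proof relies on iterated absorption, not search): it packs a choice of trees T_2, ..., T_5, where T_i has i vertices, edge-disjointly into K_5. Note that the edge counts match exactly: 1 + 2 + 3 + 4 = 10 = C(5, 2), so a successful packing is a decomposition.

```python
import itertools

def pack_trees(n, trees):
    """Try to embed the given trees edge-disjointly into K_n by
    backtracking over vertex labelings; returns the list of embedded
    edge sets, or None. (Brute force; only feasible for small n.)"""
    used = set()
    placed = []

    def place(i):
        if i == len(trees):
            return True
        tree = trees[i]
        verts = sorted({v for e in tree for v in e})
        for image in itertools.permutations(range(n), len(verts)):
            phi = dict(zip(verts, image))
            edges = {frozenset((phi[u], phi[v])) for u, v in tree}
            if edges & used:
                continue
            used.update(edges)
            placed.append(edges)
            if place(i + 1):
                return True
            placed.pop()
            used.difference_update(edges)
        return False

    return placed if place(0) else None

# Trees T_2, ..., T_5 (T_1 is a single vertex and has no edges);
# largest trees first, for faster backtracking.
trees = [
    [(0, 1), (1, 2), (2, 3), (3, 4)],  # T_5: path on 5 vertices
    [(0, 1), (0, 2), (0, 3)],          # T_4: star on 4 vertices
    [(0, 1), (1, 2)],                  # T_3: path on 3 vertices
    [(0, 1)],                          # T_2: a single edge
]
packing = pack_trees(5, trees)
print(packing is not None and sum(len(e) for e in packing) == 10)  # True
```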
Another topic I have focussed on is that of Hamilton cycles: a Hamilton cycle in a graph G is a cycle that contains all the vertices of G. Together with the satisfiability problem SAT and graph colouring, deciding whether a graph contains a Hamilton cycle is probably one of the most well-studied NP-complete problems. The main approach to the Hamilton cycle problem has been to prove natural sufficient conditions which are best possible in some sense. This is exemplified by Dirac's classical theorem: every graph G on n vertices whose minimum degree is at least n/2 contains a Hamilton cycle. Nash-Williams conjectured in the 1970s that if G is regular (i.e. all vertices have the same degree), this degree condition even ensures the existence of a decomposition of G into edge-disjoint Hamilton cycles. We developed a unified approach to prove this conjecture as well as the 1-factorization conjecture (dating back to the 1950s) and a further conjecture of Nash-Williams on packing Hamilton cycles from the 1970s (joint work with Csaba, Lo, Osthus and Treglown).
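Dirac's theorem can be illustrated on a small example. The sketch below (an illustrative backtracking search, unrelated to the techniques in the work above; the choice of K_{3,3} is my own) verifies the minimum-degree condition for the complete bipartite graph K_{3,3}, where every vertex has degree 3 = n/2, and finds a Hamilton cycle.

```python
def hamilton_cycle(adj):
    """Find a Hamilton cycle in a graph given as an adjacency dict,
    by backtracking (exponential in general; fine for small graphs).
    Returns the cycle as a vertex order, or None."""
    n = len(adj)
    path = [0]

    def extend():
        if len(path) == n:
            return path[-1] in adj[0]  # close the cycle
        for v in adj[path[-1]]:
            if v not in path:
                path.append(v)
                if extend():
                    return True
                path.pop()
        return False

    return path if extend() else None

# K_{3,3}: n = 6 vertices, minimum degree 3 = n/2, so Dirac's theorem
# guarantees a Hamilton cycle.
adj = {0: {3, 4, 5}, 1: {3, 4, 5}, 2: {3, 4, 5},
       3: {0, 1, 2}, 4: {0, 1, 2}, 5: {0, 1, 2}}
assert min(len(nb) for nb in adj.values()) >= len(adj) / 2
print(hamilton_cycle(adj))  # a cycle such as [0, 3, 1, 4, 2, 5]
```

The degree bound n/2 is best possible: K_{3,3} with one side enlarged to 4 vertices has minimum degree 3 < 7/2 and no Hamilton cycle.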
Further interests include random graphs and hypergraphs, graph minors and subdivisions, matching problems, Latin squares and associated rainbow structures, as well as algorithmic questions such as property testing.