For a complete list of my publications, follow this link.
For data sets generated in the course of my research program, follow this link.
The work of my group lies at the interface between chemistry, biology and mathematics. We use a combination of mathematical reasoning and computer simulation to try to understand a variety of phenomena, with a particular emphasis on biological and chemical systems. Along the way, we have the opportunity to study mathematical problems such as the relationship between delay and ordinary differential equations. To give a flavor of what we do, here is a commented list of some of our recent(ish) publications:
My colleague Nehal Thakor introduced me to this problem in translation control some years ago. Mammalian cells always contain the ATF4 mRNA, but normally there is very little translation of this message. When a cell is under stress, the general rate of translation decreases but, paradoxically, the rate of translation of ATF4 increases. In this paper, my student Liv Marasco developed and studied a model for this phenomenon, which turns out to depend on upstream open reading frames (uORFs, short sequences with a start and a stop codon that can be translated) in the ATF4 mRNA. The model that Liv developed is surprisingly simple and uses no mathematics beyond undergraduate statistics.
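To give an idea of the kind of calculation involved (this is a toy calculation in the same spirit, not Liv's model, and every number in it is made up), suppose that after translating the first uORF a ribosome resumes scanning and reacquires its initiation machinery as a Poisson process. The probability of skipping the inhibitory uORF yet being ready to initiate again at the ATF4 start codon is then a product of two exponential factors, and slowing the reacquisition rate, as happens under stress, can increase that probability:

```python
import numpy as np

def p_atf4(reload_rate, d1=90.0, d2=180.0, scan_rate=6.0):
    """Toy probability that a ribosome which has just translated uORF1
    reinitiates at the ATF4 start codon rather than at the inhibitory uORF2.
    Reacquisition of the initiation machinery is modeled as a Poisson process
    with rate reload_rate (1/s) while the ribosome scans at scan_rate (nt/s)
    over the distances d1 (uORF1 stop -> uORF2 start) and d2 (uORF2 start ->
    ATF4 start), in nucleotides.  All numbers are purely illustrative."""
    t1, t2 = d1 / scan_rate, d2 / scan_rate
    p_skip_uorf2 = np.exp(-reload_rate * t1)           # not yet ready at uORF2
    p_ready_at_atf4 = 1.0 - np.exp(-reload_rate * t2)  # ready before ATF4
    return p_skip_uorf2 * p_ready_at_atf4

for rate, label in [(0.5, "unstressed (fast reloading)"),
                    (0.05, "stressed (slow reloading)")]:
    print(f"{label:30s} P(translate ATF4) = {p_atf4(rate):.3f}")
```

With these invented numbers, the stressed case gives a much larger probability of translating ATF4 than the unstressed case, which is the qualitative paradox described above.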
My group members and I have also been studying transcription for a long time. One reason to do this is that we usually don't want to model transcription explicitly in a gene expression model, so we need compact ways of representing it. An average human gene is about 28 000 nucleotides long. Imagine if we had to model the addition of every single nucleotide! Of course, nobody does that. (Well, almost nobody.) A lot of models simply ignore the time required to transcribe a gene, but that can lead to incorrect dynamics. Traditionally, people have instead put a fixed delay into their models, which may sometimes be a good approximation, but sometimes not.
The purpose of this paper was to work out delay distributions for some simple transcription models that could then be used in modeling (notably in delay-stochastic simulations, but possibly also in deterministic integro-differential equation models). We not only computed these distributions, but also thoroughly discussed their potential applications.
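To show how a computed delay distribution might be plugged into a simulation, here is a minimal delay-stochastic sketch in Python (not code from the paper, and with made-up rate constants): transcription initiation fires at a constant propensity, but the finished mRNA only appears after an elongation delay drawn from an Erlang distribution, the distribution one would expect if elongation consisted of many fast exponential steps.

```python
import heapq
import numpy as np

rng = np.random.default_rng(0)

def delayed_ssa(k_init=0.5, k_deg=0.1, shape=30, mean_delay=60.0, t_end=2000.0):
    """Delay-stochastic simulation of mRNA production: initiation events occur
    with propensity k_init, each finished transcript appears only after an
    Erlang-distributed elongation delay, and mRNAs are degraded with propensity
    k_deg*M.  Returns event times and mRNA copy numbers."""
    t, M = 0.0, 0
    pending = []                          # min-heap of scheduled completion times
    times, copies = [0.0], [0]
    while t < t_end:
        a_init, a_deg = k_init, k_deg * M
        a_tot = a_init + a_deg            # always > 0 because k_init > 0
        dt = rng.exponential(1.0 / a_tot)
        if pending and pending[0] < t + dt:
            # A transcript initiated earlier finishes elongating first.
            # Discarding dt is valid because exponential waits are memoryless.
            t = heapq.heappop(pending)
            M += 1
        else:
            t += dt
            if rng.random() * a_tot < a_init:
                # Initiation: schedule appearance of the finished mRNA.
                delay = rng.gamma(shape, mean_delay / shape)
                heapq.heappush(pending, t + delay)
            else:
                M -= 1                    # degradation of one mRNA
        times.append(t)
        copies.append(M)
    return np.array(times), np.array(copies)

t, M = delayed_ssa()
print("mean mRNA copy number over the second half:", M[t > 1000.0].mean())
```

Swapping in a different delay distribution, such as one derived from a specific transcription model, only requires changing the line that draws the elongation delay.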
I have a long-standing interest in model reduction methods based on the construction of attracting invariant manifolds. The idea is that if the dynamics are quickly confined to a low-dimensional manifold of dimension d, then we can build a d-dimensional model by restricting the dynamics to this manifold. In the case of delay-differential equations (DDEs), the concept of invariance is less straightforward. For technical reasons, we talk about inertial manifolds rather than invariant manifolds, but the idea is the same. I first developed a technique to deal with these problems many years ago, in this paper:
Marc R. Roussel (1998) Approximating state-space manifolds which attract solutions of systems of delay-differential equations. J. Chem. Phys. 109, 8154–8160.
The trick used in both papers is an expansion of the delayed term in powers of the delay, assumed small. This converts the DDEs into ODEs, and then methods developed for ODEs can be applied. In the original paper, I only expanded the delayed term to first order in the delay, because of a misunderstanding of an old result of El'sgol'ts, which says that integrating the ODEs obtained by expanding the delayed term beyond first order results in incorrect, runaway solutions. That result only constrains integration: it turns out that the vector field implied by the expansion is correct, as shown by Carmen Chicone. In this new paper, I pursue higher-order expansions, working with gene expression models that include both transcriptional and translational delays.
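For readers who want to see the expansion written out (this is just the Taylor series of the delayed term, not an equation taken from either paper):

```latex
% Expansion of the delayed term in powers of the delay tau:
x(t-\tau) \approx x(t) - \tau\,\dot{x}(t) + \frac{\tau^{2}}{2}\,\ddot{x}(t) - \cdots
% Substituting the first-order truncation into a scalar DDE
%   \dot{x}(t) = f\bigl(x(t),\,x(t-\tau)\bigr)
% gives the implicit ODE
%   \dot{x}(t) = f\bigl(x(t),\,x(t)-\tau\,\dot{x}(t)\bigr),
% while truncations at higher order bring in higher derivatives of x and
% therefore lead to larger systems of ODEs.
```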
Model reduction for delay systems is a cool problem area, but the ordinary differential equation problem still holds some mysteries. To my surprise, there has been relatively little work done on open systems. In this paper, some friends and I brought our respective tools to bear on a simple problem: when can you use the steady-state approximation for the Michaelis-Menten mechanism in an open system with inflow of substrate? The answer turned out to be more complicated than I would have thought. It depends on what you expect from the approximation. Do you want to justify the approximation from singular perturbation theory? If so, you need to find (Tikhonov-Fenichel) parameter values that give you a critical manifold (a differentiable manifold of equilibrium points). Do you just want the steady-state approximation to be "close" to the slow manifold? That turns out to lead to less restrictive conditions, but the machinery you need to use is quite different.
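Here is a minimal sketch of the kind of comparison at stake, with made-up parameter values (nothing here is taken from the paper): the full two-variable open Michaelis-Menten model, with constant substrate inflow, is integrated alongside its standard steady-state reduction, and the two substrate time courses can then be compared.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Purely illustrative parameters: substrate inflow rate, binding and release
# constants, catalytic constant, and total enzyme concentration.
k_in, k1, km1, k2, E_T = 0.5, 10.0, 1.0, 1.0, 1.0
K_M = (km1 + k2) / k1

def full(t, y):
    """Open Michaelis-Menten mechanism: S is free substrate, C the
    enzyme-substrate complex, with constant inflow of S."""
    S, C = y
    v_bind = k1 * (E_T - C) * S
    return [k_in - v_bind + km1 * C,
            v_bind - (km1 + k2) * C]

def reduced(t, y):
    """Steady-state (sQSSA) reduction: C is slaved to S via E_T*S/(K_M+S)."""
    S = y[0]
    return [k_in - k2 * E_T * S / (K_M + S)]

t_eval = np.linspace(0.0, 40.0, 9)
sol_full = solve_ivp(full, (0.0, 40.0), [0.0, 0.0], t_eval=t_eval, rtol=1e-8)
sol_red = solve_ivp(reduced, (0.0, 40.0), [0.0], t_eval=t_eval, rtol=1e-8)

for t, Sf, Sr in zip(t_eval, sol_full.y[0], sol_red.y[0]):
    print(f"t = {t:5.1f}   S (full) = {Sf:.4f}   S (sQSSA) = {Sr:.4f}")
```

Whether this sort of closeness holds, and in what parameter regimes, is precisely the question the paper makes precise.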
When an RNA polymerase binds to a gene's promoter, it prevents another polymerase from binding until the first polymerase has cleared the promoter region. This introduces a delay between the binding of one polymerase and the earliest time at which another can bind. The same is true of ribosomes binding to a ribosome-binding site on an mRNA. Some of these clearance delays turn out to have nontrivial dynamical consequences. As an example, we treat a model of a gene expression system that displays bistability over a range of parameter values. Leaving out the promoter clearance delays significantly increases the propensity of this model for bistability. This is yet another example where we need to have a little humility in our modeling: what sometimes seem like small differences between models can turn out to have large effects.
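A back-of-the-envelope calculation (mine, not the paper's) shows one way such a delay makes itself felt. If polymerases bind a free promoter with propensity k, but each successful initiation occludes the promoter for a clearance time t_c, then the mean interval between initiations is the mean waiting time for a binding event plus the clearance time, so the average initiation rate is

```latex
\bar{r} \;=\; \frac{1}{1/k + t_{c}} \;=\; \frac{k}{1 + k\,t_{c}},
```

which saturates at 1/t_c no matter how large k becomes, whereas a model without the clearance delay would predict a rate of k.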
On the outside, most vertebrates are roughly symmetrical, but this is not the case for the arrangement of the internal organs: in humans, for example, your heart leans to the left, your left lung is smaller than your right, your liver is mostly on the right, and so on. Within a species, these internal asymmetries are almost invariant; in humans, only about one person in 10 000 has their organs mirrored relative to the normal arrangement. Errors in organ placement, however, cause significant birth defects in the heart and in other organs. So how does the developing embryo “know” which side is the left? Part of the answer is that the protein Nodal specifies the left side. Nodal activates its own synthesis, as well as the synthesis of an antagonist named Lefty. The interaction of Nodal and Lefty may be responsible for the formation of a Turing pattern, or Lefty may be responsible for stopping a wave of Nodal from invading the right-hand side of the embryo. Our first paper in this area studies the dynamics of a model of Nodal and Lefty in a single cell. We find that, over very wide ranges of parameters, the system is bistable, with coexistence of a high-Nodal steady state (corresponding to the left side) and a low-Nodal steady state (corresponding to the right side). Bistability in a single-cell model would allow both wave propagation and Turing patterns in the multi-cell model we are currently developing.
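To make the idea of bistability concrete, here is a deliberately stripped-down caricature in Python, keeping only cooperative self-activation of Nodal and first-order decay (Lefty and everything else in our actual model is omitted, and the parameter values are invented). Depending on the initial condition, the trajectory settles onto either a low-Nodal or a high-Nodal steady state.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Invented parameters: a (maximal autoactivation rate), K (activation
# threshold), d (first-order decay constant).  Bistable when a > 2*d*K.
a, K, d = 2.0, 1.0, 0.5

def nodal_caricature(t, y):
    """dN/dt = a*N^2/(K^2 + N^2) - d*N: cooperative self-activation of
    Nodal plus linear decay."""
    N = y[0]
    return [a * N**2 / (K**2 + N**2) - d * N]

for N0 in (0.1, 1.0):
    sol = solve_ivp(nodal_caricature, (0.0, 100.0), [N0], rtol=1e-8)
    print(f"N(0) = {N0:.1f}  ->  N(100) = {sol.y[0, -1]:.3f}")
```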
In many bacteria, Hmp is the major enzyme responsible for detoxifying nitric oxide (NO), catalyzing the oxidation of NO to nitrate. In Escherichia coli, the transcription of the hmp gene is controlled in part by FNR, which represses the transcription of hmp by binding to its promoter. FNR is an iron-sulfur protein. Reaction with NO degrades FNR's iron-sulfur cluster, rendering FNR unable to bind the hmp promoter, and thus allowing the expression of Hmp, which can then remove NO from the cell. This book chapter describes a new model of this control system. The model turns out to display bistability because of a peculiarity in the kinetics of Hmp. In order to make nitrate from NO, Hmp must also bind oxygen, but the order of binding is important: If NO binds first, an inactive complex is formed. At high NO concentrations, this inactive complex will be formed preferentially, allowing for a steady state in which this complex dominates and NO accumulates in the cell to toxic levels.
Maya Mincheva and Marc R. Roussel (2007) Graph-theoretic methods for the analysis of chemical and biochemical networks. I. Multistability and oscillations in ordinary differential equation models. J. Math. Biol. 55, 61–86.
Maya Mincheva and Marc R. Roussel (2007) Graph-theoretic methods for the analysis of chemical and biochemical networks. II. Oscillations in networks with delays. J. Math. Biol. 55, 87–104.
Maya Mincheva and Marc R. Roussel (2006) A graph-theoretic method for detecting potential Turing bifurcations. J. Chem. Phys. 125, 204102:1–8.
Maya Mincheva and Marc R. Roussel (2012) Turing-Hopf instability in biochemical reaction networks arising from pairs of subnetworks. Math. Biosci. 240, 1–11.
Lecture on Graph-theoretical analysis of biochemical networks given at the BIRS workshop on Advances in Theoretical and Experimental Methods for Analyzing Complex Regulatory Networks
In the emerging science of systems biology, people are trying to understand how living cells work by elucidating all the interactions (reactions, binding, etc.) which make up a cell's biochemistry. The trouble is that we often don't know the kinetic constants associated with these interactions, so mathematical treatments which require these constants (e.g. bifurcation analysis) must use educated guesses. The alternative is to carry out qualitative analyses which don't depend on the values of the kinetic constants. In these papers, we show how graph-theoretical analyses of biochemical pathways can determine whether or not various types of behavior can occur. The data required for these analyses are precisely of the sort supplied by current systems biology databases, namely the connectivity of the reaction network, but not the values of the kinetic constants.

Paper I cleans up some loose ends left by Ivanova in her development of a method for ordinary differential equations. Paper II extends these methods to systems with delays. Biochemical systems can't respond instantly to signals or to changes in conditions since it takes time to transcribe and translate genes. Accordingly, any model which has a genetic regulatory component should in general also contain delayed terms. The extension of Ivanova's approach to systems with delays turns out to be remarkably simple. Our third paper deals with the potential for Turing bifurcations in reaction-diffusion systems. Turing bifurcations are associated with pattern formation in spatially extended systems. They are thought to be responsible for at least some developmental events, e.g. the formation of animal coat patterns (spots, stripes, etc.). Our latest paper deals with Turing-Hopf instabilities, which lead to interesting spatio-temporal behaviors (waves, spatio-temporal chaos, etc.), some of which are relevant to our understanding of cardiac function, among other things.
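As a toy illustration of the kind of structural input these methods work from (this uses the networkx library and does not implement the criteria developed in the papers above), one can build a bipartite species-reaction graph for a small mass-action mechanism, here the classic Lotka scheme, and list its directed cycles, the feedback loops on which qualitative, parameter-free analyses operate:

```python
import networkx as nx

# The classic Lotka mechanism, specified only by which species each reaction
# consumes and produces -- exactly the kind of information a pathway database
# provides, with no rate constants.
reactions = {
    "R1": (["A", "X"], ["X", "X"]),   # A + X -> 2 X
    "R2": (["X", "Y"], ["Y", "Y"]),   # X + Y -> 2 Y
    "R3": (["Y"],      ["B"]),        # Y -> B
}

# Bipartite species-reaction graph: edges run from each consumed species to
# the reaction, and from the reaction to each species it produces.
G = nx.DiGraph()
for rxn, (reactants, products) in reactions.items():
    for sp in reactants:
        G.add_edge(sp, rxn)
    for sp in products:
        G.add_edge(rxn, sp)

# Directed cycles correspond to feedback loops in the mechanism.
for cycle in nx.simple_cycles(G):
    print(" -> ".join(cycle))
```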
Our 2007 paper on rapid photosynthetic oscillations contained a hypothesis about the mechanism of the oscillations, namely a coupling between photosynthesis and photorespiration via the carbon dioxide generated by the latter process. In order to test this hypothesis, we built a mathematical model of the coupled processes. We were able to find oscillations, but not in a realistic parameter range. The analysis involved the use of our graph-theoretical methods for delayed mass-action systems, so it was a nice illustration of the power of these methods, both for detecting the potential for oscillations and for locating the parameter range in which oscillations may be found.
Sensitivity analysis is an important technique in the analysis of models. It tells us which parameters have the greatest effect on specified aspects of a solution. Together with my colleagues Brian and Maya, I have developed a sensitivity analysis for oscillating solutions of delay-differential equations, which allows us to recover the sensitivities of the period and amplitude of the solution to the parameters, including the delays.
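As a crude illustration of what a period sensitivity is (brute force, emphatically not the method we developed), here is a Python sketch that integrates the Hutchinson delayed logistic equation dx/dt = r x(t)(1 - x(t - tau)), which has a stable limit cycle once r*tau exceeds pi/2, estimates its period from crossings of the steady state, and then approximates dT/d(tau) by a central finite difference. All parameter values are illustrative.

```python
import numpy as np

def simulate_hutchinson(r=1.0, tau=2.0, dt=0.001, t_end=200.0, x0=0.5):
    """Euler integration of the Hutchinson equation
    dx/dt = r*x(t)*(1 - x(t - tau)), with constant history x = x0 for t <= 0."""
    n_lag = int(round(tau / dt))           # delay expressed in time steps
    n = int(round(t_end / dt))
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        x_lag = x0 if i < n_lag else x[i - n_lag]
        x[i + 1] = x[i] + dt * r * x[i] * (1.0 - x_lag)
    return x

def estimate_period(x, dt, level=1.0, discard_frac=0.5):
    """Estimate the oscillation period from upward crossings of 'level',
    discarding the transient part of the time series."""
    xs = x[int(len(x) * discard_frac):]
    up = np.where((xs[:-1] < level) & (xs[1:] >= level))[0]
    if len(up) < 2:
        raise RuntimeError("not enough crossings to estimate a period")
    return np.mean(np.diff(up)) * dt

dt, tau, dtau = 0.001, 2.0, 0.05
T0 = estimate_period(simulate_hutchinson(tau=tau, dt=dt), dt)
T_plus = estimate_period(simulate_hutchinson(tau=tau + dtau, dt=dt), dt)
T_minus = estimate_period(simulate_hutchinson(tau=tau - dtau, dt=dt), dt)
print(f"period at tau = {tau}: {T0:.3f}")
print(f"dT/dtau (central difference): {(T_plus - T_minus) / (2 * dtau):.3f}")
```

The advantage of a proper sensitivity analysis over this kind of brute-force approach is that it yields all of the sensitivities at once, without repeated simulations, and without the noise inherent in finite differencing.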