Paper manuscript

Intrinsic Noise of Covalent Modification Cycles

Claus Metzner, Max Sajitz-Hermstein, Michael Schmidberger, and Ben Fabry

Biophysics Group, Department of Physics, University of Erlangen

In covalent modification cycles, the normal form X0 of a substrate protein is activated by an enzyme A ($X_0+A\rightarrow X + A$) and deactivated by another enzyme D ($X+D\rightarrow X_0+D$). As Goldbeter and Koshland have shown in a deterministic mass action model, such cycles may serve as switches in biochemical signal pathways, since the concentration ratio [X]/[X0] can depend sensitively on the relative enzyme levels [A]/[D].

However, due to the discrete stochastic nature of reaction events, the "output signals" [X] and [X0] fluctuate, even under constant input conditions [A]=[D]=const. In this paper, we investigate theoretically the population statistics of molecular species in a covalent modification cycle (CMC). Numerically exact Monte Carlo simulations of the stochastic reaction processes are compared with analytical models, based on stochastic differential equations and stationary solutions of the corresponding Fokker-Planck equation. For the probability density function (PDF) of the fluctuating molecule number [X](t) we find, depending on the parameter regime, Gaussian, exponential, power-law, or flat distributions, while the temporal autocorrelation function is a simple exponential.


Introduction


In most living cells, chemical signals from the environment are transduced, via trans-membrane receptors, to the interior of the cell. The activated receptors then trigger chains of chemical reactions, called signal pathways, eventually leading to the expression of selected genes in response to the external stimulus. As has been shown in recent systems-biology studies, covalent modification cycles (CMCs) are a very frequent type of module in such biochemical pathways. A quantitative model of the signal transmission properties of CMCs is therefore essential for an integrative understanding of signal processing in living cells.

The typical structure of a CMC is shown in Fig.1 below.

[Fig. 1 (image: CMC1): Schematic structure of a covalent modification cycle. The two enzymatic transformation processes are marked by shaded boxes.]

In such systems, a substrate protein is found in two different chemical states, the normal form X0 and an activated form X (often a phosphorylated version of X0). The conversion between the two forms is catalyzed by an activating enzyme A (often a kinase) and a deactivating enzyme D (often a phosphatase). In the activation process, the catalyst A first binds its substrate X0. The resulting enzyme-substrate complex AX0 will sometimes decay back into the original components. In the case of a successful transformation, however, a product molecule X is released and the enzyme A is recovered for further use. The deactivation process is analogous.

Obviously, the CMC reaction network can be functionally decomposed into the two enzymatic transformation processes, as marked by the shaded boxes in the figure. For such processes, the average generation rate v(S,E) of the respective product as a function of the substrate (S) and enzyme (E) concentrations is usually described by Michaelis-Menten kinetics. This theory shows that v(S) grows in proportion to S (linear regime) as long as $S \ll k_m$, with $k_m$ called the Michaelis constant. For very large $S \gg k_m$, the production rate v(S) saturates at a maximum value $v_m$.

As demonstrated in a classic paper by Goldbeter and Koshland, the combination of the two Michaelis-Menten transformations can lead to interesting behaviour if both operate within their saturated regimes. In this case, the equilibrium ratio y=[X]/[X0] of the two substrate forms, as a function of the ratio of enzyme levels x=[A]/[D], develops a sigmoidal shape with a sharp transition point. For this reason, CMCs are usually understood as switches in the context of biochemical signal networks.

The Goldbeter-Koshland theory is based on mass action rate equations and thus disregards fluctuations entirely. This approach is valid in all cases where the total number of molecules of each relevant chemical species is large and where noise effects are negligible. However, many actual biochemical reactions in living cells work in the limit of low molecule copy numbers, where the signal-to-noise ratio is naturally small. In addition, chemical reactions inevitably generate intrinsic noise, due to the discrete and stochastic nature of elementary molecular reactions. This intrinsic noise can be amplified by the nonlinearities of chemical reaction dynamics and can thus produce dramatic temporal fluctuations of molecule numbers even if the average copy number is not very small. We address these noise effects in the present paper and present first results of a statistical theory of CMCs.

The outline of the paper is as follows. In the Models and Methods section we define the reaction system and its parameters, describe the stochastic simulation method, and derive a coarse-grained stochastic differential equation and the corresponding stationary PDF for the CMC. In the Results section these predictions are compared with direct Monte Carlo simulations. We close with a discussion and outlook. Technical details of the derivations are deferred to the Appendices.


Models and Methods


A. Model parameters and assumptions

Let the reactions take place in a container of volume V, so that a concentration [S] of some substance corresponds to a molecule copy number s=[S]V. In the following, abundances of chemicals will always be described by these dimensionless molecule numbers. We also assume that the reactor is "well stirred", i.e. diffusion of the chemical species is assumed to be infinitely fast, so that spatial effects can be disregarded.

In the CMC, we have to consider six temporally variable molecule numbers $x_0, x, a, d, ax_0, dx$, dynamically coupled by six chemical reactions:

[Reaction scheme (image: reactions)]

\begin{align} X_0 + A \;\xrightarrow{\;b_1\;}\; AX_0, \qquad AX_0 \;\xrightarrow{\;u_1\;}\; X_0 + A, \qquad AX_0 \;\xrightarrow{\;c_1\;}\; X + A, \end{align}
\begin{align} X + D \;\xrightarrow{\;b_2\;}\; DX, \qquad DX \;\xrightarrow{\;u_2\;}\; X + D, \qquad DX \;\xrightarrow{\;c_2\;}\; X_0 + D. \end{align}

Within each enzymatic transformation unit, the three reaction coefficients are denoted b (binding), u (unbinding), and c (conversion). Index 1 is used for the activation process and index 2 for the deactivation process. Additional parameters are the total amount of substrate in its various forms, $x_g = x_0 + ax_0 + x + dx$, as well as the total amounts of the enzymes, $a_g=a+ax_0$ and $d_g=d+dx$.


B. Numerical simulation method

In order to test our analytical theory presented below, we shall compare its results with a numerically exact Monte Carlo simulation of the reaction dynamics, implementing the Gillespie algorithm. In this method, the molecule numbers of each species are integers which change abruptly due to elementary reaction events. Statistically, these elementary reactions are Poisson processes with average event rates depending on the momentary molecule numbers, according to the chemical rate equations. Therefore, the intrinsic stochastic fluctuations of the reactions are automatically included in a realistic way.

This is in contrast to many analytical models based on Langevin equations, where the artificial fluctuation term is typically assumed to be Gaussian white noise, without considering realistic (self-consistent) values of the noise power. Below, we shall use the numerical simulation of the CMC to confirm our assumptions about the statistical noise properties.
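To make the simulation procedure concrete, a minimal Python sketch of the Gillespie direct method for the six CMC reactions is given below. The function name, the recording of the full trajectory, and the example parameter values are illustrative choices, not a prescription of our actual implementation.

```python
import numpy as np

def gillespie_cmc(x0, x, a, d, ax0, dx, b1, u1, c1, b2, u2, c2, t_max, rng=None):
    """Direct-method Gillespie simulation of the covalent modification cycle.

    State: copy numbers (x0, x, a, d, ax0, dx).
    Reactions: binding/unbinding/conversion for activation (index 1)
    and deactivation (index 2)."""
    rng = np.random.default_rng() if rng is None else rng
    t, times, traj = 0.0, [0.0], [(x0, x, a, d, ax0, dx)]
    while t < t_max:
        # propensities of the six elementary reactions
        props = np.array([
            b1 * x0 * a,   # X0 + A -> AX0
            u1 * ax0,      # AX0   -> X0 + A
            c1 * ax0,      # AX0   -> X  + A
            b2 * x * d,    # X  + D -> DX
            u2 * dx,       # DX    -> X  + D
            c2 * dx,       # DX    -> X0 + D
        ])
        total = props.sum()
        if total == 0.0:
            break
        t += rng.exponential(1.0 / total)        # exponential waiting time
        r = rng.choice(6, p=props / total)       # which reaction fires
        if   r == 0: x0 -= 1; a -= 1; ax0 += 1
        elif r == 1: x0 += 1; a += 1; ax0 -= 1
        elif r == 2: x  += 1; a += 1; ax0 -= 1
        elif r == 3: x  -= 1; d -= 1; dx  += 1
        elif r == 4: x  += 1; d += 1; dx  -= 1
        else:        x0 += 1; d += 1; dx  -= 1
        times.append(t)
        traj.append((x0, x, a, d, ax0, dx))
    return np.array(times), np.array(traj)

# example: symmetric CMC with the parameters used later in the Results section
times, traj = gillespie_cmc(x0=1000, x=1000, a=10, d=10, ax0=0, dx=0,
                            b1=0.01, u1=50, c1=50, b2=0.01, u2=50, c2=50,
                            t_max=100.0)
```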


C. Coarse graining the enzymatic transformation

We first focus on a single enzymatic transformation reaction, for example the activation process. Our goal is to describe it in a coarse grained approximation as a single unit with effective statistical properties. Two of these effective units will later be used to derive a stochastic differential equation for the fluctuating number $x(t)$ of X-molecules.

For this purpose, we assume for a moment that the number $x_0$ of substrate molecules X0 is constant (ideal reservoir). We are then interested in the average production rate $\overline{R}_{act}(x_0)$ of the activated protein X and in the temporal fluctuations $\Delta R_{act}(x_0,t)$ of this rate. This, in turn, will enable us to write a stochastic rate equation of the production process in the form $\dot{x}=\overline{R}_{act}(x_0)+\Delta R_{act}(x_0,t)$.

As for the average rates, we solve the mass action rate equations in the stationary state. This is standard Michaelis-Menten theory, but for completeness we include the derivation in Appendix A. The result is

(1)
\begin{align} \overline{R}_{act}(x_0) = v_m \frac{x_0}{x_0+k_m} \end{align}

with the maximum transformation "velocity"

(2)
\begin{equation} v_m=c a_g \end{equation}

and the Michaelis constant

(3)
\begin{align} k_m=\frac{c+u}{b}. \end{align}

There are different regimes, depending on the relative size of the reaction coefficients u and c. The case $u \gg c$ corresponds to the "pre-equilibrium regime": The enzyme-substrate complex is unstable and frequently decays back into its constituents, before a successful conversion to the product molecule occurs. The opposite case, $c \gg u$, we call the "sequential regime": When the unbinding of the enzyme-substrate complex has a low probability, the substrate X0 is usually converted into the product X in a sequence of only two steps.

Next we have to model the fluctuations $\Delta R_{act}(x_0,t)$ of the production rate around this average value $\overline{R}_{act}(x_0)$. The statistical properties of these fluctuations are not trivial, even if the substrate copy number $x_0$ is artificially held constant. As motivated in Appendix B, we shall approximate the production process, in a coarse grained view, as a Poisson process with average event rate $\overline{R}_{act}(x_0)$. Numerical simulations, shown below, will confirm that the probability distribution of the waiting time between successive X-production events is indeed exponentially distributed with the expected characteristic time constant.

Yet, our intention is to derive a well-justified fluctuation term, suitable for our stochastic differential equation. For this purpose, we further approximate the above Poisson process by white Gaussian noise with a proper prefactor. The derivation is presented in Appendix C.

As a result of the above coarse-graining procedure, we obtain

(4)
\begin{align} \dot{x} = \overline{R}_{act}(x_0) + \sqrt{\overline{R}_{act}(x_0)}\cdot \zeta(t), \end{align}

where $\zeta(t)$ is normalized white Gaussian noise with $<\zeta(t)\zeta(t^{\prime})> = \delta(t-t^{\prime})$. The exact form of $\overline{R}_{act}(x_0)$ depends on the parameter regime.


D. Stochastic differential equation of a CMC

Including now both the activation and deactivation processes, we obtain

(5)
\begin{align} \dot{x} = \left[ \; \overline{R}_{act}(x_0) - \overline{R}_{dea}(x) \right] + \left[ \sqrt{\overline{R}_{act}(x_0)}\cdot \zeta_a(t) + \sqrt{\overline{R}_{dea}(x)}\cdot \zeta_d(t) \right]. \end{align}

Note that the deactivation rate depends on x, not x0. To make further progress, we neglect the amount of substrate bound in enzyme-substrate complexes, so that $x_0=x_g-x$. Assuming that the noise terms of the activation and deactivation processes fluctuate statistically independently of each other, we combine both terms by adding up their variances:

(6)
\begin{align} \dot{x} = \left[ \; \overline{R}_{act}(x_g-x) - \overline{R}_{dea}(x) \right] + \left[ \sqrt{\overline{R}_{act}(x_g-x) + \overline{R}_{dea}(x)} \right]\cdot \zeta(t). \end{align}

This has the general form of a stochastic differential equation with a multiplicative noise term:

(7)
\begin{align} \dot{x} = f(x) + g(x)\cdot \zeta(t). \end{align}

Here,

(8)
\begin{align} f(x)=v_a \frac{(x_g-x)}{(x_g-x)+k_a} - v_d \frac{x}{x+k_d} \end{align}

and

(9)
\begin{align} g(x) = \sqrt{\; v_a \frac{(x_g-x)}{(x_g-x)+k_a} + v_d \frac{x}{x+k_d} }, \end{align}

with obvious definitions of $v_a, v_d, k_a, k_d$.
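For illustration, the stochastic differential equation (7), with the drift (8) and noise amplitude (9), can be integrated numerically with a simple Euler-Maruyama scheme, as sketched below; the step size, the clipping of x(t) to the interval [0, x_g], and the example parameter values are ad-hoc illustrative choices.

```python
import numpy as np

def simulate_cmc_sde(x_init, x_g, v_a, v_d, k_a, k_d, dt, n_steps, rng=None):
    """Euler-Maruyama integration of dx/dt = f(x) + g(x)*zeta(t),
    with f and g as in Eqs. (8) and (9)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.empty(n_steps + 1)
    x[0] = x_init
    for i in range(n_steps):
        xi = x[i]
        f  = v_a * (x_g - xi) / ((x_g - xi) + k_a) - v_d * xi / (xi + k_d)
        g2 = v_a * (x_g - xi) / ((x_g - xi) + k_a) + v_d * xi / (xi + k_d)
        xi_new = xi + f * dt + np.sqrt(g2 * dt) * rng.standard_normal()
        x[i + 1] = min(max(xi_new, 0.0), x_g)   # keep x inside [0, x_g]
    return x

# example: symmetric CMC in the linear regime (k >> x_g)
x_t = simulate_cmc_sde(x_init=1000, x_g=2000, v_a=500, v_d=500,
                       k_a=1e4, k_d=1e4, dt=1e-3, n_steps=200000)
```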

From the stochastic differential equation one can derive an approximate Fokker-Planck equation and then compute the stationary PDF of x(t):

(10)
\begin{align} P(x)=\frac{N}{B(x)} \exp\left[ \int_{x_{min}}^x \!\!\frac{A(s)}{B(s)}ds \right], \end{align}

where $A(x)$ and $B(x)$ depend on $f(x)$ and $g^2(x)$, as shown in Appendix D.
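In practice, the integral in Eq. (10) can be evaluated numerically for arbitrary f(x) and g(x). The Python sketch below does this on a grid; the grid resolution, the integration bounds, and the example parameters ($v = c\,a_g = 500$, $k = 10^4$, $x_g = 2000$, corresponding to the simulation in the Results section) are illustrative choices.

```python
import numpy as np

def stationary_pdf(f, g2, x_min, x_max, n=2000):
    """Evaluate Eq. (10) on a grid: P(x) = N/B(x) * exp(int A/B ds),
    with A = f + (1/4) d(g^2)/dx and B = g^2 / 2."""
    x = np.linspace(x_min, x_max, n)
    g2_x = g2(x)
    A = f(x) + 0.25 * np.gradient(g2_x, x)       # drift term
    B = 0.5 * g2_x                               # diffusion term
    ratio = A / B
    integral = np.concatenate(([0.0],
        np.cumsum(0.5 * (ratio[1:] + ratio[:-1]) * np.diff(x))))  # trapezoid rule
    P = np.exp(integral - integral.max()) / B    # unnormalized, overflow-safe
    P /= np.trapz(P, x)                          # normalize
    return x, P

# example: symmetric CMC in the linear regime
v, k, x_g = 500.0, 1e4, 2000.0
f  = lambda x: v * ((x_g - x) / ((x_g - x) + k) - x / (x + k))
g2 = lambda x: v * ((x_g - x) / ((x_g - x) + k) + x / (x + k))
xs, P = stationary_pdf(f, g2, x_min=1.0, x_max=x_g - 1.0)
```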

E. The symmetric CMC

With $v_a, v_d, k_a, k_d$ and $x_g$, there is obviously a large parameter space to explore. In this paper, we shall restrict ourselves to just a few analytical cases of interest.

Let us therefore consider symmetric CMCs, where the activation and deactivation processes have the same parameters, i.e. $v_a=v_d=v$ and $k_a=k_d=k$. We then have

(11)
\begin{align} f(x)=v\left[\; \frac{(x_g-x)}{(x_g-x)+k} - \frac{x}{x+k} \right] \end{align}

and

(12)
\begin{align} g^2(x) = v \left[ \frac{(x_g-x)}{(x_g-x)+k} + \frac{x}{x+k} \right]. \end{align}

Since the drift term A(s) and the diffusion term B(s) are both proportional to v (compare Appendix D), it is clear that the maximum production rate v will not affect the shape of the stationary PDF. So $k$ and $x_g$ are the only important parameters left.

First, let us further specialize to the limit of a large Michaelis constant, i.e. $k>>x_g$, corresponding to the linear regime of the two enzymatic transformation reactions. This leaves us with

(13)
\begin{equation} f(x)=(v x_g / k)- (2v/k)x \end{equation}

and

(14)
\begin{equation} g^2(x) = (v x_g / k). \end{equation}

A straightforward calculation of the PDF yields a Gaussian, centered at $\overline{x}=\frac{x_g}{2}$ and with variance $\sigma_x^2=\frac{x_g}{4}$:

(15)
\begin{align} P(x) \propto e^{-\frac{2\,(x-x_g/2)^2}{x_g}}. \end{align}
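Explicitly, since $g^2(x)$ is constant in this regime, we have $A(x)=f(x)$ and $B(x)=\frac{v x_g}{2k}$, so that

\begin{align} \frac{A(s)}{B(s)} = \frac{(v x_g/k) - (2v/k)\,s}{v x_g/(2k)} = 2-\frac{4s}{x_g}, \qquad \int_{x_{min}}^{x}\frac{A(s)}{B(s)}\,ds = -\frac{2}{x_g}\left(x-\frac{x_g}{2}\right)^2 + \mbox{const}, \end{align}

and the constant prefactor $1/B(x)$ only affects the normalization.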

Next, we consider the opposite case of a small Michaelis constant, i.e. $k<<x_g$, corresponding to the saturation regime. We then have

(16)
\begin{align} f(x)=v\left[ 1-\frac{x}{x+k} \right]\;\rightarrow\; \frac{vk}{x}\;\;\mbox{for}\;\;x>>k \end{align}

and

(17)
\begin{align} g^2(x) = v\left[ 1+\frac{x}{x+k} \right]\;\rightarrow\; 2v\;\;\mbox{for}\;\;x>>k. \end{align}

So the asymptotic drift and diffusion terms are $A(x)= \frac{vk}{x}$, $B(x)=v$, and $A(s)/B(s)=\frac{k}{s}$. Therefore,

(18)
\begin{align} \int_{x_{min}}^x \; \frac{A(s)}{B(s)}ds = k \cdot \log(x/x_{min}). \end{align}

This in turn means that

(19)
\begin{align} P(x) \propto e^{k \cdot \log(x/x_{min})} \propto (x/x_{min})^k. \end{align}

Hence, we expect an increasing power-law tail for the asymptotic PDF in the saturation regime of the symmetric CMC. The exponent of the power-law can be fractional and is determined by the dimensionless Michaelis constant.

Of course, the above analytical approximations will break down close to the borders $0$ and $x_g$ of the valid range of x(t).

Besides the point statistics (PDF) of x(t), another quantity of interest is the temporal auto-correlation function $C_{xx}(\tau)$ of this fluctuating concentration. It is not trivial to calculate $C_{xx}(\tau)$ analytically, even for the symmetric CMC. In this paper, we consider only the simplest possible case, namely the symmetric CMC in the linear regime. As shown in Appendix E, one obtains an exponentially decaying autocorrelation function with the characteristic time constant $\tau_c=\frac{k}{2v}$.
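For comparison with simulations, $C_{xx}(\tau)$ can be estimated from a trace x(t) after resampling it on an even time grid. The sketch below uses an FFT-based (Wiener-Khinchin) estimator; the estimator and its normalization are implementation choices, not part of the analytical model.

```python
import numpy as np

def autocorrelation(x, dt):
    """Estimate C_xx(tau) = <dx(t+tau) dx(t)> from an evenly sampled trace x,
    using the FFT of the zero-padded periodogram (Wiener-Khinchin relation)."""
    dx = x - x.mean()
    n = len(dx)
    spec = np.abs(np.fft.rfft(dx, 2 * n)) ** 2           # zero-padded periodogram
    c = np.fft.irfft(spec)[:n] / np.arange(n, 0, -1)     # unbiased estimate per lag
    tau = dt * np.arange(n)
    return tau, c

# usage: fit the initial decay of c to c[0]*exp(-tau/tau_c) to extract tau_c
```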


Results


A. Enzymatic transformation process

We first investigate the statistics of the enzymatic activation process, with the number $x_0$ of substrate molecules held artificially fixed. For this purpose, we perform direct Monte Carlo simulations in different parameter regimes. All rates and times are given as dimensionless numbers.

[Fig. idealMM1]

Fig.[idealMM1]: Simulated molecule numbers of the enzyme-substrate complex (green curve) and of the activated product (red curve) for the case of a single enzyme molecule. Parameters: b=u=c=1.0, $e_g=1$. The vertical arrows mark conversion (c), binding (b), and unbinding (u) events.

Fig.[idealMM1] shows a typical example of the stochastic time evolution of the various molecule numbers. It can be seen that the single enzyme molecule sometimes undergoes binding (b) and unbinding (u) without conversion (c) to a product molecule. The time interval $\Delta t$ between two successive conversion events fluctuates. If the production process, in total, could be viewed as a Poisson process, one would expect an exponential PDF for these time intervals.

[Fig. dt1]

Fig.[dt1]: Simulated PDF of the time intervals between successive conversion events. Parameters: b=u=c=1.0, $x_0=1=const.$ Case (a): only one enzyme molecule (green curve). Case (b): 10 independent enzyme molecules (red curve). The inset shows the same data in a semi-logarithmic plot.

Fig.[dt1] shows the simulation results for the same parameters as in Fig.[idealMM1]. For a single active enzyme molecule (green curve), the PDF is clearly not exponential. However, already for 10 enzyme molecules working independently (red curve), the PDF closely resembles that of a Poisson process, i.e. an exponential distribution. The characteristic time constant of the exponential agrees with the inverse of the average production rate. We have found the same behaviour in all parameter regimes investigated in this paper.
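For reference, a minimal sketch of the kind of simulation behind Fig.[dt1] is given below: a Gillespie run of a single enzymatic unit with $x_0$ held constant (ideal reservoir), collecting the waiting times between successive conversion events. The function name and the number of recorded events are illustrative choices; the rate parameters mirror the figure caption.

```python
import numpy as np

def conversion_waiting_times(n_enzymes, x0, b, u, c, n_events, rng=None):
    """Gillespie simulation of one enzymatic unit with x0 held constant
    (ideal reservoir). Returns the waiting times between conversion events."""
    rng = np.random.default_rng() if rng is None else rng
    free, bound = n_enzymes, 0
    t, last_conv, waits = 0.0, 0.0, []
    while len(waits) < n_events:
        props = np.array([b * x0 * free,   # binding:    E + X0 -> EX0
                          u * bound,       # unbinding:  EX0 -> E + X0
                          c * bound])      # conversion: EX0 -> E + X
        total = props.sum()
        t += rng.exponential(1.0 / total)
        r = rng.choice(3, p=props / total)
        if r == 0:
            free, bound = free - 1, bound + 1
        elif r == 1:
            free, bound = free + 1, bound - 1
        else:                              # conversion event: record waiting time
            free, bound = free + 1, bound - 1
            waits.append(t - last_conv)
            last_conv = t
    return np.array(waits)

# case (a): one enzyme; case (b): ten independent enzymes, as in Fig.[dt1]
waits_1  = conversion_waiting_times(1,  x0=1, b=1.0, u=1.0, c=1.0, n_events=10000)
waits_10 = conversion_waiting_times(10, x0=1, b=1.0, u=1.0, c=1.0, n_events=10000)
```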


B. Distribution of molecule numbers


We simulated the symmetric CMC with the following parameters: substrate $x_0 = x = 1000$ (so that $x_g = 2000$), total enzyme numbers $a_g = d_g = 10$, and, for both activation and deactivation, b = 0.01 and u = c = 50; the total simulated time was T = 10000. This corresponds to $k_m = \frac{c + u}{b} = 10^4 \gg x_g$, i.e. the linear regime of the symmetric CMC, for which the theory above predicts a Gaussian PDF of x(t).

[Figure: Symmetric_CMC_S=1000_E=10_b=0,01_u=c=50_T=e4.jpg]

The autocorrelation function obtained from this simulation also shows the expected behaviour $C_{xx}(\tau) \propto \exp(-\frac{\tau}{\tau_c})$, with $\tau_c = \frac{k_m}{2 v} = 10$ for our parameters.

[Figure: Autokorrelation_x=x0=1000_E=10_b=0,01_u=c=50_T=9e4.jpg]

Discussion and Outlook


  • Biological relevance of noise
  • Long-time correlations from power-law distributions of concentrations
  • Unexplored parameter regimes

Appendices


A. Average production rates

We consider an enzymatic transformation reaction of the general form:

[Reaction scheme (image: enzTra)]

\begin{align} X + E \;\xrightarrow{\;b\;}\; Y, \qquad Y \;\xrightarrow{\;u\;}\; X + E, \qquad Y \;\xrightarrow{\;c\;}\; Z + E, \end{align}

where X denotes the substrate, E the free enzyme, Y the enzyme-substrate complex, and Z the product.

Using mass action rate theory, we obtain for the temporal change of the concentration y(t) of the enzyme-substrate complex:

(20)
\begin{align} \dot{y}= b\; x\; e - u\; y - c\; y \end{align}

We make the simplifying approximation that x(t) is held constant. After a certain relaxation time, the system will reach a steady state, in which y(t)=const as well. The condition $\dot{y}=0$ then leads to

(21)
\begin{align} y = x \;e \frac{b}{u+c}. \end{align}

The expression $\frac{b}{u+c} = \frac{1}{k_m}$ is defined as the inverse Michaelis constant, so that $y=\frac{x\; e}{k_m}$. Since the enzyme can either be free or bound in the complex, $e_g=e+y$, one obtains $y=\frac{x (e_g-y)}{k_m}$. Solving for y yields

(22)
\begin{align} y = e_g \frac{x}{x+k_m}. \end{align}

For the quantity of interest, the steady state generation rate $\dot{z} = c\; y$ of the product, we finally obtain

(23)
\begin{align} \dot{z} = \left( c\; e_g \right) \frac{x}{x+k_m} = v_m \frac{x}{x+k_m}. \end{align}
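The steady-state algebra above can be checked symbolically, for instance with the short sympy snippet below (an illustrative aid, not part of the derivation):

```python
import sympy as sp

x, e_g, y, b, u, c = sp.symbols('x e_g y b u c', positive=True)
# steady-state condition of Eq. (20) with e = e_g - y, solved for y
y_ss = sp.solve(sp.Eq(b * x * (e_g - y) - (u + c) * y, 0), y)[0]
k_m = (u + c) / b
# difference between c*y_ss and the right-hand side of Eq. (23); simplifies to 0
print(sp.simplify(c * y_ss - c * e_g * x / (x + k_m)))
```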

B. Enzymatic transformation as an effective Poisson process

In general, an individual A-enzyme molecule can undergo a series of binding/unbinding events with the (non-exhaustible) substrate X0, before the substrate is finally converted into a new X-molecule. Therefore, even though each elementary reaction step, i.e. binding, unbinding and conversion, is a Poisson process, the same is not true for the multi-step production process.

Footnote: For a simple example, consider a sequence of one binding and one conversion step. The PDF of each elementary Poisson step is exponential. The PDF of the sequence is a convolution of two exponential functions, i.e. a Gamma distribution with shape parameter $k=2$.

However, many individual A-enzyme molecules, dispersed throughout the volume of the container, are simultaneously active, with independent temporal statistics. Our numerical simulations show that the superposition of many independent non-Poisson processes can resemble an effective Poisson process very closely. As expected, the characteristic time constant of this effective Poisson process is given by the inverse of the average total production rate $\overline{R}_{act}(x_0)$.

In our CMC system, the substrate copy number $x_0$, and therefore $\overline{R}_{act}(x_0)$, are not constant. So we have here not a stationary Poisson process, but one with time-varying rate.

We conclude that in systems with many independent enzyme molecules, the overall transformation process can be approximated by an inhomogeneous Poisson process.
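This argument can be illustrated numerically. The sketch below superposes the event trains of several independent renewal processes with Gamma-distributed waiting times (shape 2, as in the two-step example of the footnote) and returns the merged inter-event times, which approach an exponential distribution as the number of sources grows. The shape and rate values are illustrative assumptions.

```python
import numpy as np

def merged_interevent_times(n_sources, n_events, shape=2.0, rate=1.0, rng=None):
    """Superpose the event trains of n_sources independent renewal processes
    with Gamma(shape, scale=1/rate) waiting times and return the merged
    inter-event times (expected to approach an exponential distribution)."""
    rng = np.random.default_rng() if rng is None else rng
    events = []
    for _ in range(n_sources):
        waits = rng.gamma(shape, 1.0 / rate, size=n_events)
        events.append(np.cumsum(waits))          # event times of one "enzyme"
    merged = np.sort(np.concatenate(events))
    return np.diff(merged)

# compare a single source (Gamma-distributed) with 10 superposed sources
dt_1  = merged_interevent_times(1, 10000)
dt_10 = merged_interevent_times(10, 10000)
```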


C. Poisson process as white Gaussian noise

Assume now a Poisson process with constant average event rate $\overline{k}=\overline{R}_{act}(x_0)$. We express the temporal change of the number x(t) of product molecules in the form

(24)
\begin{align} \dot{x}=\overline{k}+\Delta k(t). \end{align}

For later convenience, we want to approximate the fluctuation term by Gaussian white noise,

(25)
\begin{align} \left< \Delta k(t) \Delta k(t^{\prime})\right>=\Gamma \delta(t-t^{\prime}). \end{align}

What is the proper choice for the prefactor $\Gamma$, so that the major statistical properties of a Poisson process are consistently reproduced?

To answer this question, we consider the number n(T) of X-molecules which are produced during an interval of length T:

(26)
\begin{align} n(T)=\int_0^T \dot{x}(t) dt = \overline{k}T +\int_0^T \Delta k(t) dt = \overline{n}+\Delta n. \end{align}

In the ensemble average, a Poisson process must fulfill

(27)
\begin{align} \left< (\Delta n)^2 \right> = \overline{n}, \end{align}

or

(28)
\begin{align} \left< \left( \int_0^T \Delta k(t) dt\right)^2 \right> = \overline{k}T. \end{align}

The left side of the above equation can be reduced to $\Gamma T$, so that we obtain $\Gamma = \overline{k}$, and therefore

(29)
\begin{align} \left< \Delta k(t) \Delta k(t^{\prime})\right>=\overline{k} \delta(t-t^{\prime}). \end{align}

Dividing this equation by $\overline{k}$ leads to

(30)
\begin{align} \left< \frac{\Delta k(t)}{\sqrt{\overline{k}}} \frac{\Delta k(t^{\prime})}{\sqrt{\overline{k}}}\right> = \delta(t-t^{\prime}). \end{align}

We now define a new random function by

(31)
\begin{align} \zeta(t)=\frac{\Delta k(t)}{\sqrt{\overline{k}}}. \end{align}

The autocorrelation of this function shows the desired properties of normalized, white Gaussian noise:

(32)
\begin{align} \left< \zeta(t) \zeta(t^{\prime}) \right> = \delta(t-t^{\prime}). \end{align}

We conclude that a proper description of a Poisson process by a stochastic differential equation should have the form

(33)
\begin{align} \dot{x}=\overline{k} + \sqrt{\overline{k}} \; \zeta(t). \end{align}
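A short numerical check of this result is sketched below: integrating Eq. (33) with a constant rate $\overline{k}$ over an interval of length T should reproduce the Poissonian relation $\left< (\Delta n)^2 \right> = \overline{n} = \overline{k}T$. The parameter values are arbitrary illustrative choices.

```python
import numpy as np

def check_poisson_variance(k_bar=5.0, T=50.0, dt=0.01, n_runs=1000, rng=None):
    """Integrate dx/dt = k_bar + sqrt(k_bar)*zeta(t) over [0, T] for many
    realizations and compare the variance of n(T) with the Poisson value k_bar*T."""
    rng = np.random.default_rng() if rng is None else rng
    n_steps = int(T / dt)
    # each Euler step contributes a Gaussian increment sqrt(k_bar*dt) * N(0,1)
    fluct = np.sqrt(k_bar * dt) * rng.standard_normal((n_runs, n_steps)).sum(axis=1)
    n_T = k_bar * T + fluct
    return n_T.var(), k_bar * T    # the two numbers should be close

simulated_var, poisson_var = check_poisson_variance()
print(simulated_var, poisson_var)
```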

D. Stationary PDF of stochastic process

Starting from the stochastic differential equation,

(34)
\begin{align} \dot{x}=f(x)+g(x)\zeta(t)\;\;\mbox{with}\;\;<\!\!\zeta(t)\zeta(t+\tau)\!\!>\;=\;\delta(\tau)\;\;\mbox{(white Gaussian noise)}, \end{align}

we define a drift term (interpreting the multiplicative noise in the Stratonovich sense),

(35)
\begin{align} A(x)=f(x)+\frac{1}{4}\frac{d}{dx}g^2(x) \end{align}

and a diffusion term

(36)
\begin{align} B(x)= \frac{1}{2}g^2(x). \end{align}

The time-dependent PDF $P(x,t)$ of the fluctuating variable x(t) approximately satisfies the
Fokker-Planck equation

(37)
\begin{align} \frac{\partial}{\partial t}P(x,t)=-\frac{\partial}{\partial x}\left[ A(x)P(x,t) \right] + \frac{\partial^2}{\partial x^2}\left[ B(x)P(x,t) \right]. \end{align}

The stationary solution $P(x)$ of this equation reads

(38)
\begin{align} P(x)=\frac{N}{B(x)} \exp\left[ \int_{x_{min}}^x \!\!\frac{A(s)}{B(s)}ds \right]. \end{align}

Here, N is a normalization constant.


E. Auto-correlation function

For a symmetric CMC in the linear regime, we have $f(x)=(v x_g / k)- (2v/k)x$ and $g(x) = \sqrt{v x_g / k}$, with an average molecule number of $\overline{x}=x_g/2$. Using a new variable $\Delta x(t) = x(t) - \overline{x}$, one obtains $f(\overline{x}+\Delta x) = - (2v/k)\Delta x(t)$, so that the stochastic differential equation can be written

(39)
\begin{align} \left[ \frac{d}{dt} + (2v/k) \right] \Delta x(t) = \sqrt{v x_g / k} \;\;\zeta(t). \end{align}

For the above differential operator we define a Green's function

(40)
\begin{align} \left[ \frac{d}{dt} + (2v/k) \right] G(t) = \delta(t). \end{align}

It has the form

(41)
\begin{align} G(t) = \theta(t) \; e^{-(2v/k)t}. \end{align}

so that the response to an arbitrary noise function is given by the convolution

(42)
\begin{align} \Delta x(t) = \int_{-\infty}^t G(t-t^\prime)\;\sqrt{v x_g / k} \;\;\zeta(t^\prime)\;\; dt^\prime. \end{align}

Using the normalized white-noise properties of $\zeta(t)$, it is now straightforward to calculate the ensemble-averaged autocorrelation function of our stationary random process x(t):

(43)
\begin{align} C_{xx}(\tau)=<\Delta x(\tau) \Delta x(0) > = \left( \frac{x_g}{4} \right)\; e^{-(2v/k) \tau}. \end{align}

As a result, we obtain exponential correlations with the characteristic time constant

(44)
\begin{align} \tau_c=\frac{k}{2v}. \end{align}

Graphs of the Goldbeter cycle with symmetric production and decay rates

Symbol definitions:

[Reaction scheme (image: reactions); symbols as defined in Sec. Models and Methods A.]


As shown before, a Michaelis-Menten process behaves like a simple Poisson process if the following conditions hold:

$$ k_m = \frac{u + c}{b} \gg x_g(t=0) $$
$$ b, u \gg c $$

A linear growth of x, i.e. a constant production rate $\dot{x} = $ const., is approximately obtained if

$$ k_m = \frac{u + c}{b} \ll x_g(t=0) $$
$$ b, u \gg c $$

In this simulation, the following parameters were kept constant:

$$ x = x_0 = 1000; \quad \mathrm{enzymes:}~E_{act} = E_{deact} = 1000; \quad b = 0.1; \quad c = 0.01 $$

The unbinding rate constant u (decay of the enzyme-substrate complex) was swept between 0.1 and $10^4$.

[Figure: S=E=1000_b=0,1_u_sweep_c=0,01]

In a second set of simulations, c was swept between 0.1 and 1000, with the constant parameters:

$$ x = x_0 = 1000; \quad E_{act} = E_{deact} = 1000; \quad b = 1; \quad u = 0.1 $$

[Figures: S=E=e3_b=1_u=0,1_c=0,1_T.JPG and S=E=e3_b=1_u=0,1_c=1000_T=1e1.JPG]

The expected power-law behaviour is not visible on the double-logarithmic scale:

[Figure: S=E=e3_b=1_u=0,1_c=0,1_T_log]

Another simulation was run with the parameters:

$$ x = x_0 = 100; \quad \mathrm{enzymes:}~E_{act} = E_{deact} = 10 $$
$$ b_{act} = b_{deact} = 1; \quad u_{act} = u_{deact} = 0.1; \quad c_{act} = c_{deact}~\mathrm{(swept)} $$

[Figure: Goldbeter_Claus_X0_X=100_E=10_b=1_u=0,1_c=sweep.jpg]

A third simulation used the following parameters:

$$ x = x_0 = 1000; \quad \mathrm{enzymes:}~E_{act} = E_{deact} = 10 $$
$$ b_{act} = b_{deact} = 0.01; \quad u_{act} = u_{deact} \geq 10~\mathrm{(swept)}; \quad c_{act} = c_{deact} = 1 $$

[Figure: X0=X=e3_E=e1_b=0,01_u=sweep_c=1.jpg]

The u = 1000 graph in a semi-logarithmic plot:

[Figure: X0=X=e3_E=e1_b=0,01_u=1000_c=1_Nr297_semilog.jpg]

The Origin file, including all simulation data and the (normalized) plots, can be found at http://lpmt-theory.wikidot.com/local--files/the-paper/Paper_Goldbetercyclus_2009-3-20-10.opj


