Volume 2011, Issue 1, Article ID 721352
Research Article
Open Access

Study of Thermodynamically Inspired Quantities for Both Thermal and External Colored Non-Gaussian Noises Driven Dynamical System

Monoj Kumar Sen, Alendu Baura, and Bidhan Chandra Bag (Corresponding Author)

Department of Chemistry, Visva-Bharati, Santiniketan 731 235, India

First published: 10 July 2011
Academic Editor: Manuel O. Cáceres

Abstract

We have studied the dynamics of a dynamical system driven by both internal and external noises in terms of the information entropy at both nonstationary and stationary states. Here a unified description of entropy flux and entropy production is considered. Based on the Fokker-Planck description of stochastic processes and the entropy balance equation, we have calculated the time dependence of the information entropy production and entropy flux in the presence and absence of a nonequilibrium constraint (NEC). In the presence of the NEC we have observed extremum behavior in the variation of the entropy production as a function of the damping strength, the noise correlation time, and the non-Gaussian parameter (which determines the deviation of the external noise behavior from the Gaussian characteristic), respectively. Thus the properties of the noise process are important for the entropy production.

1. Introduction

In recent years the stochastic dynamics community [1–5] has become increasingly interested in the role of noise in dissipative dynamical systems, because of its potential applications to various noise-induced phenomena, such as noise-induced phase transitions [6], noise-sustained structures in convective instability [7], stochastic spatiotemporal intermittency [8], noise-modified bifurcation [9], noise-induced traveling waves [10], noise-induced ordering transitions [11], noise-induced front propagation [12], stochastic resonance [13–15], coherence resonance [16–19], synchronization [20, 21], clustering [22], and noise-induced pattern formation [23, 24]. In traditional classical thermodynamics the specific nature of the stochastic process is irrelevant, but it may play an important role in the way a given nonequilibrium state of a noise-driven dynamical system equilibrates. The relaxation behavior of stochastic processes can be understood using the information entropy (S), which has become a focal theme in the field of stochastic processes [25–28]. In [27] the authors studied the transition from slow-wave sleep to rapid-eye-movement sleep in terms of the information entropy. Crochik and Tomé [28] calculated the entropy production in the majority-vote model and showed that it exhibits a singularity at the critical point. The time evolution of S mainly carries the signature of the rate of phase-space expansion and contraction in random force-driven Brownian motion. This implies that the specific nature of the random process strongly affects S. In view of the importance of the characteristics of the frictional and random forces, the specific nature of the random process also strongly affects the information entropy flux and entropy production in the presence and absence of a nonequilibrium constraint. The random force may be of both internal and external origin. We assume that the internal thermal noise is Gaussian in character, but the external noise may have non-Gaussian properties; we return to this aspect later. The frictional force of the thermal environment may be proportional to the momentum of the tagged particle, or it may be associated with a finite memory kernel. Stochastic processes with a frictional memory kernel (i.e., non-Markovian stochastic processes) are important in many situations such as chemical reactions, isomerization [5, 29–31], and Josephson junctions [32]. The extension of Kramers' rate theory to non-Markovian stochastic processes has been a subject of the recent past [5]. We now want to discuss the efficacy of choosing non-Gaussian rather than Gaussian noise for external environmental random perturbations. Experimental data indicate that the noise in biological processes may have a non-Gaussian character. Examples include, among others, the flow of current through voltage-sensitive ion channels in a cell membrane and experiments on the sensory system of rat skin [33, 34]. Recent detailed studies of the sources of fluctuations in different biological systems [35, 36] have clearly established that, in such contexts, noise sources are in general non-Gaussian. Recently, Fuentes et al.
[37] have shown that stochastic resonance can be enhanced when the subsystem departs from Gaussian behavior and that the system shows a marked "robustness" against noise tuning; that is, the signal-to-noise ratio curve can flatten when departing from Gaussian behavior, implying that the system does not require fine tuning of the noise intensity in order to maximize its response to a weak external signal. This theoretical finding was verified experimentally by Castro et al. [38]. Very recently the role of colored non-Gaussian noise with a continuous distribution has been investigated in the context of synchronization of coupled phase oscillators [20, 21], kinetics of self-induced aggregation of Brownian particles [22], escape through an unstable limit cycle [39, 40], escape from a metastable state [41–47], coherence resonance in noise-driven neurons [48], and the ratchet problem [49]. Furthermore, non-Gaussian noise of third order has been shown to be useful in some autocatalytic reactions [50]. The objective of the present paper is to study the time dependence of the information entropy production and entropy flux, in a unified description, for an internal and external noise-driven system in the presence and the absence of a nonequilibrium constraint.

The outline of the paper is as follows. In Section 2 we calculate entropy flux and entropy production in the nonequilibrium and stationary states. The paper is concluded in Section 3.

2. Calculation of the Information Entropy Flux and Production

2.1. Relaxation of the Noise-Driven Dynamical System to a Stationary State

We consider a stochastic process in the presence of both internal thermal noise and external noise. The Langevin equation of motion for this process can be written as
()
()
where x and p correspond to the position and momentum of a harmonic oscillator with frequency ω. Here γ(t − t′) is the dissipative memory kernel and f(t) represents the internal thermal noise, which satisfies the fluctuation-dissipation relation
()
kB is the Boltzmann constant and T is the temperature of the thermal bath. The parameter ϵ is used to identify the noise strength. However, the frictional kernel plays a decisive role in the behavior of non-Markovian dynamics [51]. The variance of a stochastic observable may not always admit a long-time limit. Therefore, in general, one has to work with analytically tractable models of non-Markovian systems [51]. To capture the essential features of the non-Markovian dynamics, we consider an exponentially decaying frictional memory kernel [52–54]. Therefore, γ(t − t′) in the present model can be represented as
()
where τi is the memory time of the internal colored noise and γ0 is the frictional coefficient in the Markovian limit τi → 0. For the frictional memory kernel (2.4), the integro-differential equation (2.2) can be recast as
()
with
()
Here ζ(t) is a white Gaussian noise of zero mean with 〈ζ(t)ζ(t′)〉 = 2δ(t − t′). By the remaining noise term η(t) in (2.1) we consider the collective effect of a nonthermal environment (NTE). We assume that its two-time correlation is independent of the damping of the thermal bath. By the NTE we mean the degrees of freedom in the complex system which are strongly interacting with the tagged particle. Note that the nonthermal environment is also embedded in the same thermal environment. The NTE is very much relevant in the context of biological systems. Due to nonlinear dynamics, the nonthermal noise may be non-Gaussian and colored in character, for the following reason. In the system-reservoir model [55, 56] one considers the modes of the thermal bath to be harmonic oscillators. The equilibrium distribution function in terms of the dynamical variables (coordinate and momentum) of a bath mode is of Gaussian type; therefore, the noise in the Langevin equation of motion for the tagged particle is Gaussian in nature. However, there are a number of situations where the noise source is not a thermal bath. For example, the time evolution equation of the voltage in the dynamics of up-down states of neurons is coupled to a very large number of dynamical equations related to ion flow. Their collective effect is the noise of biological origin in the time evolution equation for the voltage [57, 58]. These dynamical equations are in general nonlinear, since they exhibit self-sustained oscillations in the presence of dissipation. Therefore it is obvious that the distribution function of the dynamical variable should be of non-Gaussian type. In the introduction we have already mentioned the experimental evidence for the non-Gaussian behavior of noise of biological origin [33–36]. However, we start the present problem by considering that the nonthermal noise η(t) is an Ornstein-Uhlenbeck noise process [5]. The time evolution equation of the noise is
()
The two time correlation function 〈η(t)η(0)〉 decays exponentially:
()
Thus τe is the correlation time of the Ornstein-Uhlenbeck noise and De is the strength of the external noise process.
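Since the explicit forms of the two equations above are not reproduced here, we note for orientation that an Ornstein-Uhlenbeck noise with the quoted exponentially decaying correlation is conventionally written as (the precise prefactor convention is our assumption)
\[
\dot{\eta}(t) = -\frac{\eta(t)}{\tau_e} + \frac{\sqrt{D_e}}{\tau_e}\,\xi_e(t),
\qquad
\langle \eta(t)\eta(t')\rangle = \frac{D_e}{\tau_e}\,e^{-|t-t'|/\tau_e},
\]
with \(\xi_e(t)\) a zero-mean Gaussian white noise, \(\langle \xi_e(t)\xi_e(t')\rangle = 2\delta(t-t')\).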
Now treating f and η as phase space variables on the same footing as x and p and using (2.1), (2.5), (2.6), and (2.7), it is simple to write the Fokker-Planck equation with arbitrary values of dissipation parameter, noise correlation time, and noise strength as follows:
()
where ρ(x, p, f, η, t) is the extended phase space probability distribution function.
It is now important to note that one can use the following linear transformation:
()
in the Fokker-Planck equation (2.9), since all the above differential equations ((2.1), (2.5), (2.6), and (2.7)) are linear in the phase-space variables (x, p, f, and η). The parameters a, b, and c in (2.10) are to be determined. U, being a linear combination of the extended phase-space variables, takes care of their stochastic behavior entirely. The transformation (2.10) is generally used [5, 59–61] with the purpose of reducing the dimension of the Fokker-Planck equation for a linear stochastic process. Equation (2.10) has been used in general for the study of Kramers' problem, but there one does not replace the full non-Markovian dynamics with this transformed process; it is used only for the stationary nonequilibrium dynamics [5]. It reduces an infinite-dimensional Markovian process (a general non-Markovian Gaussian process) to a one-dimensional description for the flux solution. In the present problem, however, we use the transformation (2.10) for the full non-Markovian dynamics. We discuss this transformation further in the appendix.
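Although the explicit form of (2.10) is not reproduced above, the construction described here (one undetermined coefficient for each of p, f, and η) suggests, as a working assumption, the linear combination
\[
U = x + a\,p + b\,f + c\,\eta ,
\]
which is the form we have in mind in the sketches below.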
By virtue of the above transformation the Fokker-Planck equation (2.9) becomes
()
where
()
()
()
λ is another constant to be determined. We must stress that the reduction of the four-dimensional description (2.9) to the one-dimensional one (2.11) is not ad hoc: the probability density corresponding to the variables x, p, f, and η and the probability density corresponding to the transformed variable U are the same. Essentially, (2.11) arises from a special projection of the dynamics (2.9) onto U. Through the terms λU and Deff the one-dimensional description takes into account the effective contributions of the drift and the diffusion of the original dynamics. We now come back to the issue of how to determine the constants a, b, c, and λ. Using (2.10) in (2.14) and comparing the coefficients of x, p, f, and η, we have
()
From the above algebraic equations we obtain a cubic equation for λ as
()
We now substitute z = λ − 1/(3τi) into the above equation to obtain a cubic equation of the standard form
()
where
()
For the present problem, we consider the real positive root of the above algebraic equation which is as follows:
()
since the distribution function must vanish at the boundary. In order to have a real and positive λ, we choose n < 0 and (n²/4 + m³/27) > 0. Using the value of λ in (2.15) and (2.16), one obtains a, b, and c. In particular, the dependence of c on the oscillator frequency (ω), the noise correlation time, and γ is useful in (2.13) for further calculation. It is given by
()
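As a small illustration of the root selection described above, the sketch below extracts the real positive root of the depressed cubic z³ + m z + n = 0 by Cardano's formula and shifts back to λ = z + 1/(3τi). The expressions for m and n in terms of ω, γ0, and τi are not reproduced in the text, so they are treated here as inputs; this is a sketch, not the authors' code.

import numpy as np

# Sketch: real positive root of z**3 + m*z + n = 0 (Cardano), then
# lambda = z + 1/(3*tau_i).  Assumes n < 0 and n**2/4 + m**3/27 > 0,
# the conditions quoted in the text for a real, positive lambda.
def effective_damping(m, n, tau_i):
    disc = n**2 / 4.0 + m**3 / 27.0
    if not (n < 0.0 and disc > 0.0):
        raise ValueError("need n < 0 and n**2/4 + m**3/27 > 0")
    z = np.cbrt(-n / 2.0 + np.sqrt(disc)) + np.cbrt(-n / 2.0 - np.sqrt(disc))
    return z + 1.0 / (3.0 * tau_i)

# Example with purely illustrative (hypothetical) coefficients:
# effective_damping(m=0.5, n=-1.0, tau_i=1.0)  ->  positive lambda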
We now note that, for the special projection U and with the condition λ > 0, (2.1), (2.5), (2.6), and (2.7) indeed become again a truly (real-valued) Markovian process. Thus U describes the original non-Markovian dynamics in terms of a truly (real-valued) Markovian process, with the restriction that λ > 0. Another point worth mentioning is that the natural demand of positiveness of λ, and of its finite value governed by the system frequency and the other parameters, puts restrictions through (2.15) and (2.16) on the values of the coefficients a, b, and c in the linear transformation (2.10). Thus the linear transformation (2.10) does not work for arbitrary values of the coefficients. However, the above Fokker-Planck equation (2.11) can be generalized for colored non-Gaussian noise-driven systems in a specific limit. The colored non-Gaussian noise can be generated from the solution of the following stochastic differential equation [62]:
()
ξ(t) is a standard Gaussian noise of zero mean with 〈ξ(t)ξ(t′)〉 = 2δ(t − t′). The form of the noise η as employed in (2.21) allows us to control the departure from Gaussian behavior easily by changing a single parameter r. De and τe are noise parameters related to the noise intensity and the correlation time of η. The parameter α in (2.21) is defined as
()
Now we consider two different situations. For r = 1, (2.21) reduces to (2.7). For r > 1, the stationary properties of the noise η, including the two-time correlation function, have been studied in [63]; here we summarize the main results. The stationary probability distribution is given by
()
where Zr is the normalization factor given by
()
Γ indicates the Gamma function. This distribution can be normalized only for r < 3. Since the above distribution function is an even function of η, the first moment, 〈η〉, is always equal to zero and the second moment is given by
()
which is finite only for r < 5/3. It is apparent from the above facts that the distribution function has a long tail, which causes the second moment to diverge for r ≥ 5/3, although the distribution can still be normalized up to r < 3. Furthermore, for r < 1 the distribution has a cut-off and is only defined for
()
Finally, the correlation time of the non-Gaussian noise in the stationary regime of the process η(t) diverges near r = 5/3, and it can be approximated over the whole range of values of r as
()
Clearly, when r → 1, we recover the limit of η being a Gaussian colored noise, that is, the Ornstein-Uhlenbeck process. In this limit, in fact, the term in the square bracket of (2.23) can be written as
()
and therefore (2.23) becomes
()
with
()
Here we would like to note that (2.25) shows that, for a given external noise strength De and noise correlation time τe, the variance of the non-Gaussian noise is higher than that of the Gaussian one for r > 1, that is,
()
Similarly, (2.27) implies that the effective correlation time of the non-Gaussian noise exceeds τe for r > 1. Before leaving this part we would like to mention that in the present study we have considered a continuous distribution of the non-Gaussian noise, which is more relevant for natural systems than the two-state or discrete distributions [64] mostly used in the literature to study noise-driven dynamical systems.
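Because (2.21)–(2.27) are not reproduced above, it may help to recall the standard forms of this noise model given in [62, 63], with which all of the quoted properties (normalizability for r < 3, finite variance only for r < 5/3, divergence of the correlation time near r = 5/3, recovery of the Ornstein-Uhlenbeck process at r = 1) are consistent; the specific prefactor conventions are our assumption here:
\[
\dot{\eta} = -\frac{1}{\tau_e}\,\frac{d V_r(\eta)}{d\eta} + \frac{\sqrt{D_e}}{\tau_e}\,\xi(t),
\qquad
V_r(\eta) = \frac{D_e}{\tau_e (r-1)}\,
\ln\!\Bigl[1 + \alpha (r-1)\,\frac{\eta^2}{2}\Bigr],
\qquad
\alpha = \frac{\tau_e}{D_e},
\]
\[
P^{\rm st}(\eta) = \frac{1}{Z_r}\Bigl[1 + \alpha (r-1)\,\frac{\eta^2}{2}\Bigr]^{-1/(r-1)},
\qquad
\langle \eta^2\rangle = \frac{2 D_e}{\tau_e (5-3r)},
\qquad
\tau_e^{(r)} \simeq \frac{2\tau_e}{5-3r}.
\]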
It is now important to note that, because of the nonlinearity in η in (2.21), analytical calculation of the information entropy production and flux is very difficult. However, to get a flavor of the effect of the non-Gaussian noise on these quantities, we consider a small deviation from the Gaussian character of the noise η(t). To do so, we first replace η² in (2.21) by its average value at the stationary state. This should be a good approximation in the limit r → 1. Thus, in the limit r → 1, (2.21) can be written as
()
where
()
τng and Dng are the effective noise correlation time and noise strength of the non-Gaussian noise when the underlying Gaussian noise process has correlation time τe and noise strength De. Equation (2.33) shows that τng > τe and Dng > De for r > 1. Now the two-time correlation function of η according to (2.32) is given by
()
In the limit τ → 0 the above equation becomes
()
It describes the variance of the white non-Gaussian noise in the limit r → 1. However, to check the validity of the above approximation, we have plotted the numerically calculated autocorrelation function 〈η(t)η(0)〉 (ACF) versus t in Figure 1. It shows that at small r (r = 1.25) the ACF is fitted well by a first-order exponentially decaying curve (the solid curve corresponds to the numerical result and the dotted curve to the fitted function; the same convention is followed for the curves for r = 1). The approximate autocorrelation function (2.34) for r = 1.25 is presented in the same figure by the dashed curve, and it is close to the numerical one. Figure 1 also shows that the preexponential factor for the non-Gaussian noise (r = 1.25) is much greater than that for the Gaussian noise, even at very small τ (τ = 0.01). This implies that the noise strength of white non-Gaussian noise is larger than that of Gaussian white noise. However, (2.1), (2.5), (2.6), and (2.32) lead to a generalization of the Fokker-Planck equation (2.11) for colored non-Gaussian noise, provided that the noise behavior does not deviate strongly from the Gaussian characteristics. The generalized Fokker-Planck equation reads
()
where
()
Figure 1: Plot of the autocorrelation function 〈η(t)η(0)〉 versus t for the parameter set De = 0.5 and τe = 0.01.
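A minimal numerical sketch of the check behind Figure 1 is given below, assuming the noise generator takes the standard form quoted above (this is our assumption; the actual prefactors of (2.21) may differ). The trajectory is integrated by the Euler-Maruyama scheme and the autocorrelation function is estimated from a long stationary run.

import numpy as np

# Sketch (not the authors' code): simulate the (assumed) colored non-Gaussian
# noise and estimate its autocorrelation function, as in Figure 1.
def simulate_noise(r, De=0.5, tau_e=0.01, dt=1e-4, n_steps=500_000, seed=0):
    rng = np.random.default_rng(seed)
    alpha = tau_e / De
    eta = np.empty(n_steps)
    eta[0] = 0.0
    amp = np.sqrt(2.0 * De * dt) / tau_e          # from <xi(t) xi(t')> = 2 delta(t - t')
    for i in range(1, n_steps):
        x = eta[i - 1]
        drift = -x / (tau_e * (1.0 + 0.5 * alpha * (r - 1.0) * x * x))
        eta[i] = x + drift * dt + amp * rng.standard_normal()
    return eta

def autocorrelation(eta, max_lag):
    eta = eta - eta.mean()
    return np.array([np.mean(eta[:len(eta) - k] * eta[k:]) for k in range(max_lag)])

if __name__ == "__main__":
    for r in (1.0, 1.25):
        acf = autocorrelation(simulate_noise(r), max_lag=500)
        # acf[0] is the variance; for r > 1 it should exceed the Gaussian value D_e/tau_e.
        print(f"r = {r}: variance ~ {acf[0]:.1f} (Gaussian value D_e/tau_e = 50)")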

Thus our present study with the above Fokker-Planck equation yields an exact result when both the internal and the external noises are Ornstein-Uhlenbeck noises. The result is approximate if the external colored noise is a non-Gaussian one; it approaches the exact result as the non-Gaussian noise parameter approaches unity. Before leaving this part we would like to mention that the random force of internal origin and the damping are related through the fluctuation-dissipation relation, whereas the external noise is independent of the damping. Therefore the stationary state in the present problem is a (nonequilibrium) steady state. In the following we will discuss the relaxation behavior, in terms of entropy production and entropy flux, of an external force-driven steady state.

Keeping in mind all the above facts, we now introduce the Shannon information measure [65, 66]
()
which typically is not a conserved quantity. S in the above equation is called information entropy. If one considers the Boltzmann constant as the information unit and identifies the Shannon measure with the thermodynamic entropy, then the whole of statistical mechanics can be elegantly reformulated by extremization of S, subject to the constraints imposed by the a priori information one may possess concerning the system of interest.
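In the reduced description, the Shannon measure in the elided equation above is presumably the usual functional of the probability density of U,
\[
S(t) = -\int \rho(U,t)\,\ln \rho(U,t)\,dU ,
\]
which is the form assumed in the sketches below.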
In the next step we define the information entropy flux and entropy production using the above Fokker-Planck equation and the definition of S. The time evolution of S can then be written as
()
Performing a partial integration on the right-hand side of (2.39) and then imposing the usual boundary conditions (we consider a system with a finite phase-space volume, as usually happens in reality; hence there should be a well-defined boundary on and beyond which the distribution function must be zero, and we assume that the derivatives of the distribution function also vanish at the boundary), one obtains the following form of the entropy balance equation:
()
The first term in (2.40) has no definite sign, while the second term is positive definite because of the positive definiteness of Deff. One can then identify the first and the second terms as the entropy flux (SF) and the entropy production (SP), respectively:
()
()
Thus the entropy flux defined here is the average of the divergence of the deterministic force involved in the system; that is, it measures the time evolution of the average phase-space expansion or contraction rate due to the deterministic force. The entropy production, on the other hand, measures the rate of phase-space expansion due to the random force. It is important to note that (2.42) shows that the information entropy production is proportional to the Fisher information, with proportionality constant Deff. We then examine the connection between the information entropy production and the phase-space collapse of the system at the steady state. In this state we have (for details we refer to [67])
()
in the limit ϵ ≪ 1.

Here λi is the Lyapunov exponent of the ith component of the phase space. Thus the information entropy production as defined by (2.42) is equal to the negative of the sum of the Lyapunov exponents, or equivalently to the rate of phase-space volume contraction, plus a correction term vanishing as the noise strength goes to zero [67] at the stationary state. This is a link between thermodynamically inspired quantities and the quantities involved in the underlying dynamics in phase space. At the same time it explains how a finite phase-space volume is possible at long times in the presence of a dissipative force. Furthermore, following [67], the connection between the entropy production of irreversible thermodynamics and the underlying dynamics in phase space can be established for the Langevin description.
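For orientation, if the reduced Fokker-Planck equation (2.11) has the standard drift-diffusion form ∂ρ/∂t = −∂(Fρ)/∂U + Deff ∂²ρ/∂U² with F = −λU (our working assumption), then the balance equation and the two identifications discussed above read
\[
\frac{dS}{dt}
= \underbrace{\int \rho\,\frac{\partial F}{\partial U}\,dU}_{\dot S_F}
+ \underbrace{D_{\rm eff}\int \frac{1}{\rho}
\Bigl(\frac{\partial \rho}{\partial U}\Bigr)^{2} dU}_{\dot S_P},
\]
the production term being Deff times the Fisher information, consistent with the remark above; for F = −λU the flux term is simply −λ.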

Using the identity
()
in (2.36) we have
()
where ρst is the stationary solution of the Fokker-Planck equation (2.36). Note that the first, second, and third integrals in (2.45) are of zeroth, first, and second order, respectively, in the deviation from equilibrium. Performing a partial integration on the above equation, one obtains
()
In this new decomposition of the time evolution of the information entropy, the first term has no definite sign and contains, in principle, contributions of all orders in the deviation from equilibrium. The third term, on the other hand, is both positive and of second order in the deviation from equilibrium, thereby fulfilling the principal condition required for the entropy production of an irreversible process. Thus it is analogous to the entropy production of irreversible thermodynamics, and we represent it as
()
We call it the information entropy production due to irreversibility in the relaxation process. In the stationary state these two terms are related as follows:
()
Using (2.43) in the above equation we have
()
This is the required connection between entropy production of irreversible process and phase space dynamics.
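Our reading of the elided relations above is the following: at the stationary state dS/dt = 0, so the production term due to irreversibility is compensated by the remaining (flux-like) terms, and, using (2.43), it can be tied to the phase-space contraction rate, schematically
\[
\dot S_P^{\rm irr}\big|_{\rm st} \simeq -\sum_i \lambda_i + O(\epsilon),
\]
with λi the Lyapunov exponents appearing in (2.43); this is offered only as an interpretive sketch of the elided equations.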
To find the explicit time dependence of the above quantities, we then look for the Green's function, or conditional probability solution [4, 68–70], for the system to be at U at time t given its value at t = 0. This initial condition may be represented by the δ-function
()
where the prefactor is the normalization constant. We now look for a solution of (2.36) of the form
()
where G(t) = −(U − α(t))²/σ(t) + ln ν(t).
We will see that by suitable choice of α(t), σ(t), and ν(t) one can solve (2.36) subject to the initial condition
()
Comparing (2.51) with (2.52) at t = 0 fixes the initial values σ(0), ν(0), and α(0), the latter being the initial value of U.
If we put (2.51) in (2.36) and equate the coefficients of equal powers of U, we obtain after some algebra
()
The relevant solutions of σ(t) and α(t) for the present problem which satisfy the initial conditions above are given by
()
()
Now, making use of (2.51) in (2.41), (2.42), and (2.47), we finally obtain the explicit time dependence of the entropy flux (SF), the entropy production (SP) containing contributions of all orders in the deviation from equilibrium, and the entropy production due to irreversibility in the process, as
()
()
()
Equation (2.56) describes how the phase-space contraction rate is affected by ω, τi, and the damping strength γ0. Now we consider (2.57). Since the width σ of the distribution function increases with time, the information entropy production and the Fisher information decrease towards their stationary values (λ in the case of the entropy production). At small width of the distribution function, the random force has a strong role in expanding the phase space against the deterministic force, and therefore the entropy production (the phase-space expansion rate) is highest at the start of the motion of the Brownian particle. We have demonstrated this in Figure 2. At short times the entropy production is comparatively higher for external Gaussian noise than for non-Gaussian noise. This is because of the greater effective diffusion coefficient of the former, since c in (2.20) is smaller for non-Gaussian noise due to the larger effective noise correlation time (τng). Note that the entropy production due to irreversibility also decreases monotonically to its limiting value. However, the relaxation time depends solely on the effective damping constant λ, which is determined by the characteristics of the dynamical system and of the internal thermal noise and is independent of the properties of the external noise. We now note that (2.56) and (2.57) satisfy the stationary condition as follows:
()
since at long time
()
Figure 2: Plot of SP versus time using (2.57) for the parameter set γ0 = 1.0, σ(0) = 0.25, α(0) = 1.0, ω = 0.5, kBT = 0.25, De = 0.5, and τi = τe = 1.0.
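The monotonic decay of SP described here can be reproduced by a small numerical sketch, assuming that the reduced dynamics is the linear process dU = −λU dt + (2Deff)^(1/2) dW, so that a Gaussian initial distribution stays Gaussian with width σ(t) relaxing exponentially to 2Deff/λ and SP(t) = 2Deff/σ(t); the explicit expressions for λ and Deff are not reproduced in the text, so both are treated as inputs.

import numpy as np

# Sketch (assumed Gaussian relaxation of the reduced 1D process): the entropy
# production S_P(t) = D_eff * Fisher information = 2*D_eff/sigma(t) decays
# monotonically to the stationary value lambda, as stated in the text.
def entropy_production(t, lam, D_eff, sigma0):
    sigma_inf = 2.0 * D_eff / lam                         # stationary width
    sigma_t = sigma_inf + (sigma0 - sigma_inf) * np.exp(-2.0 * lam * t)
    return 2.0 * D_eff / sigma_t

if __name__ == "__main__":
    lam, D_eff, sigma0 = 1.0, 0.5, 0.25                   # illustrative values only
    for t in (0.0, 0.5, 1.0, 2.0, 5.0):
        print(f"t = {t:4.1f}   S_P ~ {entropy_production(t, lam, D_eff, sigma0):.4f}")
    # S_P(0) = 2*D_eff/sigma(0) = 4.0 here, and S_P -> lambda = 1.0 at long times.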

2.2. Relaxation of Small External Force-Driven Equilibrium State to a Steady State

It is now interesting to examine the time dependence of the entropy flux and production during the relaxation of a small external force-driven steady state. To this end we consider a constant drift fe in (2.5), due to the external force, so that the total drift in (2.12) now becomes
()
where F0 = −λU, F1 = bfe, and h is the smallness parameter. When h = 0, ρ = ρst, where ρst is the stationary solution in the absence of F1. The deviation of ρ from ρst in the presence of a nonzero small h can be taken into account explicitly once we make use of the identity (2.44) for the diffusion term. Then, for the above definition of the deterministic force, the Fokker-Planck equation (2.36) becomes
()
Using the above equation, one can write the rate of change of the information entropy for the thermostated system [67] as
()
where hδρ = ρ − ρst. Comparing (2.63) with (2.47), one can easily identify the third term as the entropy production of the irreversible process, while the remaining terms correspond to the entropy-flux-like quantity
()
Here the first term represents the rate of phase-space volume contraction to second order, whereas the second one can be read as the average of the work per unit time done by the external forcing acting (tangentially) along the motion. In the steady state we have from (2.63)
()
Thus the above equation establishes a connection between thermodynamically inspired quantities of an irreversible process and phase space dynamics.
In the next step we use the following time-dependent solution of (2.62), as before, to find the explicit time dependence of the entropy production and flux:
()
where N1 is the normalization constant and σ(t) is obtained from (2.54). The expression for αh(t) is given by
()
Now for the distribution function (2.66) we have
()
()
Thus the above equations describe the time dependence of the information entropy production and flux due to irreversibility in the process, in the presence of the nonequilibrium constraint, in a unified scheme for both internal and external noise-driven systems. We now explore the explicit dependence of the above thermodynamically inspired quantities on time and on the properties of the noise. First, in Figure 3 the variation of the entropy production with time is plotted. It shows that the entropy production first decreases with time, then passes through a minimum, and finally reaches the following steady-state value:
()
for external Gaussian noise (solid curve). The dashed and dotted curves in this figure imply that the minimum gradually disappears, and the new steady state driven by the nonequilibrium constraint becomes very close to the original one, as the noise behavior deviates more from the Gaussian characteristics.
Figure 3: Plot of the entropy production versus time using (2.68) for the parameter set γ0 = 1.0, σ(0) = 0.25, α(0) = 1.0, ω = 0.5, kBT = 0.25, De = 0.5, τi = τe = 1.0, and fe = 0.5.
These observations can be explained by simplifying (2.68) in the limit σ(0) → 0 and αh(0) → 0 as
()
The first term in the numerator of (2.71), which vanishes as t → 0, implies that the external force increases the entropy production, while the second term corresponds to the decrease of the entropy production with time due to the dissipative action. Because of these two opposite effects, a system thrown away from a steady state by a small external force relaxes to a new steady state, passing through a minimum in the entropy production with time.
We now consider the long-time behavior of (2.68) and (2.69). At long times, (2.68) and (2.69) reduce to the following equations:
()
()
Equation (2.72) describes why the entropy production is vanishingly small for external non-Gaussian noise (NGN) at long times for the given parameter set in Figure 3. In the effective nonequilibrium constraint term (F1 = bfe), b is smaller for NGN than for Gaussian noise because τng is greater for the former than for the latter. However, (2.72) implies that a system with a higher effective diffusion constant is more robust against the given nonequilibrium constraint, and the entropy production decreases monotonically with increasing temperature of the thermal bath. This is demonstrated in Figure 4, which shows that the rate of decrease is higher for external Gaussian noise than for non-Gaussian noise. The change of temperature is less effective in the latter case, since the effective noise strength for NGN is higher than for Gaussian noise. Now we check whether the above results reduce to the standard result. In the absence of external noise (c = 1 and b = (1 − λτi)/τi), (2.72) becomes
()
This implies that the external force is not able to drive the equilibrium state to a new steady state if λτi = 1. However, in the Markovian limit (τi → 0), (2.74) reduces to
()
which is the standard result for the entropy production of irreversible processes for a Brownian particle in a thermodynamically closed system.
Figure 4: Plot of the entropy production versus kBT using (2.68) for the parameter set γ0 = 0.5, σ(0) = 0.25, α(0) = 1.0, ω = 0.5, t = 10, De = 0.5, τi = τe = 1.0, and fe = 1.0.
Using (2.72) and (2.73), one can make another important check on the above results through
()
We now demonstrate the variation of the entropy production as a function of the damping strength γ0 in Figure 5, in the presence of an internal non-Markovian thermal bath and external noise. Solid and dotted curves correspond to external colored Gaussian and non-Gaussian noises, respectively; this convention is followed for the rest of the figures. Both curves in Figure 5 show extremum behavior as a result of the interplay of the effective damping (λ), the diffusion constant (Deff), and the noise correlation time. The entropy production becomes close to zero at the minimum, since b, and hence the effective nonequilibrium constraint, is vanishingly small there. For external colored non-Gaussian noise, b becomes very small at a larger damping strength than for Gaussian noise.
Figure 5: Plot of the entropy production versus damping strength γ0 using (2.68) for the parameter set t = 10.0, σ(0) = 0.25, α(0) = 1.0, ω = 0.5, kBT = 0.5, De = 0.5, τi = τe = 0.5, and fe = 1.0.

In Figure 6 we have presented how the entropy production depends on the correlation time of the internal colored noise in the presence of external colored noise. There are both a maximum and a minimum for external Gaussian noise, but the minimum disappears for non-Gaussian noise. At a certain critical value of τi the product λτng may become equal to unity; then the effective nonequilibrium constraint (F1 = bfe) becomes very small, the external force is not able to drive the stationary state to a new steady state, and the minimum appears. The maximum appears when λτng ≪ 1.

Figure 6: Plot of the entropy production versus internal noise correlation time τi using (2.68) for the parameter set γ = 0.5, σ(0) = 0.25, α(0) = 1.0, ω = 0.5, kBT = 0.5, De = 0.5, t = 10.0, τe = 0.5, and fe = 1.0.
In the next step we have demonstrated the variation of the entropy production as a function of τe (the correlation time of the external noise) in Figure 7. It shows that the entropy production first decreases with τe, then passes through a minimum, and finally reaches the following limiting value for external Gaussian noise:
()
since at large τe, b and Deff can be approximated as b = −λ and Deff = λ²γ0kBT/(λτi − 1)². However, the minimum appears as a result of an interplay similar to that mentioned for Figure 6. The minimum disappears for external non-Gaussian noise, and the entropy production increases monotonically to the above limiting value.
Figure 7: Plot of the entropy production versus τe using (2.68) for the parameter set γ = 0.25, σ(0) = 0.25, α(0) = 1.0, ω = 0.5, kBT = 0.5, De = 0.5, τi = 0.5, t = 10.0, and fe = 1.0.

Finally, in Figure 8 we have plotted the entropy production versus r (which accounts for the deviation of the noise behavior from the Gaussian characteristic). It shows that the entropy production passes through a minimum at some critical value of r, as a result of the interplay of the effective damping and τng, since τng depends on r. Thus the effectiveness of the nonequilibrium constraint depends on the deviation of the noise properties from the Gaussian characteristics.

Figure 8: Plot of the entropy production versus r using (2.68) for the parameter set γ = 0.25, σ(0) = 0.25, α(0) = 1.0, ω = 0.5, kBT = 0.5, De = 0.5, τi = 0.5, t = 10.0, and fe = 1.0.

3. Conclusion

In conclusion, we have considered the relaxation behavior of a given nonequilibrium state of the harmonic oscillator driven by both thermal and external colored noises, in the presence and the absence of a nonequilibrium constraint. We have studied the time dependence of the information entropy production and entropy flux based on the Fokker-Planck description of the noise processes and the entropy balance equation. The main conclusions are the following.
  • (1)

    The entropy production monotonically decreases with time to a stationary value in the absence of the nonequilibrium constraint (NEC). In the presence of the NEC it first decreases with time, then increases after passing through a minimum, and finally reaches a limiting value for external Gaussian noise for a given parameter set. The minimum gradually disappears as the noise behavior deviates more from the Gaussian characteristics.

  • (2)

    It becomes more difficult for the nonequilibrium constraint to drive the equilibrium state to a steady state as the temperature of the thermal bath increases, and the rate of decrease of the entropy production with temperature is faster for external colored Gaussian noise than for non-Gaussian noise.

  • (3)

    In the presence of the NEC we have observed extremum behavior in the variation of the entropy production as a function of the damping strength, the noise correlation time, and the non-Gaussian parameter (which determines the deviation of the external noise behavior from the Gaussian characteristic), respectively. Thus the properties of the noise process are important for the entropy production.

To be mentioned here is that our present calculations are, of course, restricted to the harmonic oscillator (HO). However, insights into this important system usually have a wide impact, as the HO constitutes much more than a mere example. In general, Kramers' problem of barrier-crossing dynamics is studied analytically by linearization of the nonlinear potential energy function around the fixed points [5]. Qualitatively, one can say that a greater entropy production of a system implies a larger barrier-crossing rate, since the former increases with increasing phase-space expansion rate. Thus we hope that our present study will be useful for the understanding of various phenomena in colored noise-driven, thermodynamically closed systems. Another point is that one can generalize the present study to more complex cases, such as a thermal environment with a nonexponentially decaying memory kernel. One can also generalize it to obtain an exact treatment for external colored non-Gaussian noise.

Acknowledgment

Thanks are due to the Council of Scientific and Industrial Research for partial financial support.

    Appendix

    More about the Linear Transformation

    Here we show that the linear transformation (2.10) used in Section 2 can also be applied directly to the Langevin dynamics described by (2.1), (2.5), (2.6), and (2.7) to derive the Fokker-Planck equation (2.11). Multiplying both sides of (2.1), (2.5), and (2.6) by a, b, and c, respectively, and then adding all the equations (2.1), (2.5), (2.6), and (2.7), we have
    ()
    This is the Langevin equation of motion corresponding to the Fokker-Planck equation (2.11). The above equation implies that U is a special projection of the dynamics (2.1), (2.5), (2.6), and (2.7). In the weak-noise limit it becomes
    ()
    The solution of this equation is
    ()
    The effective damping constant λ in the above equation is finite for finite values of ω, τi, and γ0, and it does not correspond to a particular eigenvalue of the matrix formed by the deterministic parts on the right-hand side of (2.1), (2.5), (2.6), and (2.7). Equations (2.1), (2.5), (2.6), and (2.7) can be written in matrix notation as follows:
    ()
    where
    ()
    The above discussion implies that λ does not correspond to a particular eigenvalue of the matrix H. We now come back to (A.3). It implies that U(t) is finite at any finite time t, and that it is not a slow variable of the original dynamics, since it satisfies an initial condition taking contributions from all the variables of the phase space and λ is not the smallest eigenvalue of the matrix H. Thus U in (A.3) takes contributions from all the variables at arbitrary time. Hence the linear transformation (2.10), used to reduce the Fokker-Planck equation (2.9) to (2.11), works at any time.
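    For completeness, the elided equations (A.1)-(A.3) presumably have the structure dU/dt = −λU + (noise terms), so that in the weak-noise limit
    \[
    \frac{dU}{dt} = -\lambda U , \qquad U(t) = U(0)\,e^{-\lambda t},
    \]
    which is finite at any finite time for finite λ, as used in the argument above; this is our reading, since the explicit equations are not reproduced here.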
