Gamma process

Also known as the (Moran-)gamma process,[1] the gamma process is a random process studied in mathematics, statistics, probability theory, and stochastics. It is a stochastic process X(t) = Γ(t; γ, λ) with independent, stationary, gamma-distributed increments: the increase over any time interval of length s follows a gamma distribution with shape parameter γs and rate parameter λ, often written Γ(γs, λ).[1] Both γ and λ must be greater than 0. The process is a pure-jump, increasing Lévy process with intensity measure ν(x) = γx^{−1}exp(−λx) for all positive x. Thus jumps whose size lies in the interval [x, x + dx) occur as a Poisson process with intensity ν(x) dx. The parameter γ controls the rate of jump arrivals, and the scaling parameter λ inversely controls the jump size. The process is assumed to start from the value 0 at time t = 0, i.e. X(0) = 0.

The gamma process is sometimes also parameterised in terms of the mean (μ) and variance (v) of the increase per unit time, which corresponds to γ = μ²/v and λ = μ/v.
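This reparameterisation can be sketched in a few lines of Python; the values of μ and v below are illustrative assumptions, not taken from the article:

```python
# Sketch: converting the (mean, variance)-per-unit-time parameterisation
# to (shape, rate).  mu and v are illustrative example values.
mu, v = 2.0, 0.5          # mean and variance of the increase per unit time
gamma_ = mu**2 / v        # shape parameter: gamma = mu^2 / v
lam = mu / v              # rate parameter:  lambda = mu / v

# Sanity check: the increment over one unit of time has mean
# gamma/lambda = mu and variance gamma/lambda^2 = v.
assert abs(gamma_ / lam - mu) < 1e-12
assert abs(gamma_ / lam**2 - v) < 1e-12
print(gamma_, lam)  # 8.0 4.0
```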

Plain English definition

The gamma process is a process that accumulates independent gamma-distributed increments over a span of time. The image below displays two different gamma processes from time 0 until time 4. The red process has more jump occurrences in this timeframe than the blue process because its shape parameter is larger than that of the blue process.
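Sample paths like the ones described above can be simulated by summing independent gamma-distributed increments; a minimal sketch follows, in which the step size and parameter values are assumptions chosen for illustration:

```python
import numpy as np

# Simulate gamma-process sample paths on [0, 4] by accumulating
# independent Gamma-distributed increments over small time steps.
rng = np.random.default_rng(0)

def gamma_process_path(gamma, lam, t_max=4.0, n_steps=400, rng=rng):
    """Return times and one sample path of Gamma(t; gamma, lam)."""
    dt = t_max / n_steps
    # Increment over dt is Gamma(shape=gamma*dt, scale=1/lam).
    increments = rng.gamma(shape=gamma * dt, scale=1.0 / lam, size=n_steps)
    path = np.concatenate(([0.0], np.cumsum(increments)))  # X(0) = 0
    times = np.linspace(0.0, t_max, n_steps + 1)
    return times, path

t, red = gamma_process_path(gamma=5.0, lam=1.0)   # larger shape: more jumps
_, blue = gamma_process_path(gamma=1.0, lam=1.0)
```

The paths start at 0 and are non-decreasing, as expected for an increasing pure-jump process.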

[Figure: Gamma-Process sample paths]

Properties

We use the Gamma function in these properties, so the reader should distinguish between Γ(·) (the Gamma function) and Γ(t; γ, λ) (the gamma process). We will sometimes abbreviate the process as X_t ≡ Γ(t; γ, λ).

Some basic properties of the gamma process are:

Marginal distribution

The marginal distribution of a gamma process at time t is a gamma distribution with mean γt/λ and variance γt/λ².

That is, the probability distribution of the random variable X_t is given by the density

f(x; t, γ, λ) = (λ^{γt} / Γ(γt)) x^{γt−1} e^{−λx}.
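This marginal can be checked empirically by sampling from the corresponding gamma distribution and comparing moments; the parameter values in the sketch below are assumptions for illustration:

```python
import numpy as np

# Check the marginal X_t ~ Gamma(shape=gamma*t, rate=lam) by comparing
# sample mean/variance against gamma*t/lam and gamma*t/lam**2.
rng = np.random.default_rng(1)
gamma_, lam, t = 3.0, 2.0, 4.0

samples = rng.gamma(shape=gamma_ * t, scale=1.0 / lam, size=200_000)
print(samples.mean())  # close to gamma*t/lam   = 6.0
print(samples.var())   # close to gamma*t/lam^2 = 3.0
```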

Scaling

Multiplication of a gamma process by a scalar constant α is again a gamma process, with a different mean increase rate:

αΓ(t; γ, λ) ≃ Γ(t; γ, λ/α)
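The scaling property can be verified in distribution by comparing sample moments of the scaled process with those of the reparameterised one; parameter values below are illustrative assumptions:

```python
import numpy as np

# Compare alpha * Gamma(t; gamma, lam) with Gamma(t; gamma, lam/alpha)
# via the sample mean of the marginal at time t.
rng = np.random.default_rng(2)
gamma_, lam, t, alpha = 2.0, 3.0, 1.5, 4.0

scaled = alpha * rng.gamma(shape=gamma_ * t, scale=1.0 / lam, size=200_000)
direct = rng.gamma(shape=gamma_ * t, scale=alpha / lam, size=200_000)

print(scaled.mean(), direct.mean())  # both near gamma*t*alpha/lam = 4.0
```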

Adding independent processes

The sum of two independent gamma processes with the same rate parameter is again a gamma process:

Γ(t; γ₁, λ) + Γ(t; γ₂, λ) ≃ Γ(t; γ₁ + γ₂, λ)
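The additivity of the shape parameters can likewise be checked on the marginals at a fixed time; the sketch below uses illustrative parameter values:

```python
import numpy as np

# The sum of independent Gamma(t; g1, lam) and Gamma(t; g2, lam) marginals
# should match Gamma(t; g1+g2, lam); checked via sample moments.
rng = np.random.default_rng(3)
g1, g2, lam, t = 1.0, 2.5, 2.0, 2.0

x = rng.gamma(shape=g1 * t, scale=1.0 / lam, size=200_000)
y = rng.gamma(shape=g2 * t, scale=1.0 / lam, size=200_000)
z = rng.gamma(shape=(g1 + g2) * t, scale=1.0 / lam, size=200_000)

print((x + y).mean(), z.mean())  # both near (g1+g2)*t/lam   = 3.5
print((x + y).var(), z.var())    # both near (g1+g2)*t/lam^2 = 1.75
```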

Moments

The moments of the process at time t can be used to obtain its expected value, variance, skewness, and kurtosis:

E(X_t^n) = λ^{−n} Γ(γt + n)/Γ(γt),  n ≥ 0,

where Γ(z) is the Gamma function.
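The moment formula translates directly into code using the standard library's `math.gamma`; the parameter values below are assumptions chosen so that the first two moments are easy to verify by hand:

```python
import math

# n-th moment of the gamma process at time t:
#   E[X_t^n] = lam**(-n) * Gamma(gamma*t + n) / Gamma(gamma*t)
def gamma_process_moment(n, t, gamma_, lam):
    return lam ** (-n) * math.gamma(gamma_ * t + n) / math.gamma(gamma_ * t)

gamma_, lam, t = 2.0, 1.0, 1.0
print(gamma_process_moment(1, t, gamma_, lam))  # mean: gamma*t/lam = 2.0
print(gamma_process_moment(2, t, gamma_, lam))  # gamma*t*(gamma*t+1)/lam^2 = 6.0
```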

Moment generating function

The moment generating function of X_t is the expected value of exp(θX_t):

E(exp(θX_t)) = (1 − θ/λ)^{−γt},  θ < λ
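The closed form can be compared against a Monte Carlo estimate of E[exp(θX_t)]; parameter values in the sketch are illustrative assumptions:

```python
import numpy as np

# Compare the closed-form MGF (1 - theta/lam)**(-gamma*t), theta < lam,
# with a Monte Carlo estimate of E[exp(theta * X_t)].
rng = np.random.default_rng(4)
gamma_, lam, t, theta = 2.0, 3.0, 1.0, 0.5

closed_form = (1.0 - theta / lam) ** (-gamma_ * t)
samples = rng.gamma(shape=gamma_ * t, scale=1.0 / lam, size=500_000)
estimate = np.exp(theta * samples).mean()
print(closed_form, estimate)  # the two values should be close
```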

Correlation

Correlation describes the statistical relationship between the values of a single gamma process at two different times:

Corr(X_s, X_t) = √(s/t),  s < t,

for any gamma process X(t).
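This can be checked empirically by building X_t from X_s plus an independent increment, which is exactly the independent-increments structure of the process; parameter values are illustrative assumptions:

```python
import numpy as np

# Empirical check of Corr(X_s, X_t) = sqrt(s/t) for s < t, using the
# independent-increments construction X_t = X_s + (X_t - X_s).
rng = np.random.default_rng(5)
gamma_, lam, s, t = 2.0, 1.0, 1.0, 4.0

x_s = rng.gamma(shape=gamma_ * s, scale=1.0 / lam, size=300_000)
incr = rng.gamma(shape=gamma_ * (t - s), scale=1.0 / lam, size=300_000)
x_t = x_s + incr

print(np.corrcoef(x_s, x_t)[0, 1])  # close to sqrt(s/t) = 0.5
```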

The gamma process is used as the distribution for random time change in the variance gamma process.

Literature

  • Lévy Processes and Stochastic Calculus by David Applebaum, CUP 2004, ISBN 0-521-83263-2.

References

  1. ^ a b Klenke, Achim, ed. (2008), "The Poisson Point Process", Probability Theory: A Comprehensive Course, London: Springer, pp. 525–542, doi:10.1007/978-1-84800-048-3_24, ISBN 978-1-84800-048-3, retrieved 2023-04-04