I'm implementing a very simple Susceptible-Infected-Recovered model with a steady population for an idle side project - normally a pretty trivial task. But I'm running into solver errors using either PysCeS or SciPy, both of which use lsoda as their underlying solver. This only happens for particular values of a parameter, and I'm stumped as to why. The code I'm using is as follows:
import numpy as np
from pylab import *
import scipy.integrate as spi
#Parameter Values
S0 = 99.
I0 = 1.
R0 = 0.
PopIn= (S0, I0, R0)
beta= 0.50
gamma=1/10.
mu = 1/25550.
t_end = 15000.
t_start = 1.
t_step = 1.
t_interval = np.arange(t_start, t_end, t_step)
#Solving the differential equation. Solves over t for initial conditions PopIn
def eq_system(PopIn,t):
    '''Defining SIR System of Equations'''
    #Creating an array of equations
    Eqs = np.zeros((3))
    Eqs[0] = -beta * (PopIn[0]*PopIn[1]/(PopIn[0]+PopIn[1]+PopIn[2])) - mu*PopIn[0] + mu*(PopIn[0]+PopIn[1]+PopIn[2])
    Eqs[1] = beta * (PopIn[0]*PopIn[1]/(PopIn[0]+PopIn[1]+PopIn[2])) - gamma*PopIn[1] - mu*PopIn[1]
    Eqs[2] = gamma*PopIn[1] - mu*PopIn[2]
    return Eqs
SIR = spi.odeint(eq_system, PopIn, t_interval)
This produces the following error:
lsoda-- at current t (=r1), mxstep (=i1) steps
taken on this call before reaching tout
In above message, I1 = 500
In above message, R1 = 0.7818108252072E+04
Excess work done on this call (perhaps wrong Dfun type).
Run with full_output = 1 to get quantitative information.
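Following the suggestion in the message itself, I can rerun with full_output = 1 to see how far the integration actually gets. This is only a diagnostic sketch, reusing eq_system, PopIn and t_interval from the code above; the mxstep=5000 is an arbitrary larger limit for experimentation, not a proposed fix:
# Diagnostic rerun, reusing eq_system, PopIn and t_interval from above.
# mxstep=5000 is an arbitrary larger limit, just to see if lsoda gets further.
SIR_dbg, info = spi.odeint(eq_system, PopIn, t_interval,
                           full_output=1, mxstep=5000)
print(info['message'])    # lsoda's status message for this run
print(info['tcur'][-1])   # last internal time value lsoda reached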
Normally when I encounter a problem like this, there's something terminally wrong with the equation system I've set up, but I can't see anything wrong with it here. Weirdly, it also works if you change mu to something like 1/15550.
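Here is a quick sketch of what I mean; eq_system_mu is just a hypothetical variant of eq_system above that takes mu as an explicit argument (passed via odeint's args), not something in my original script:
import numpy as np
import scipy.integrate as spi

def eq_system_mu(PopIn, t, mu, beta=0.50, gamma=1/10.):
    # Same SIR system with births/deaths as eq_system above,
    # but with mu as an explicit parameter.
    S, I, R = PopIn
    N = S + I + R
    dS = -beta*S*I/N - mu*S + mu*N
    dI = beta*S*I/N - gamma*I - mu*I
    dR = gamma*I - mu*R
    return [dS, dI, dR]

t_interval = np.arange(1., 15000., 1.)
for mu in (1/15550., 1/25550.):
    _, info = spi.odeint(eq_system_mu, [99., 1., 0.], t_interval,
                         args=(mu,), full_output=1)
    print(mu, info['message'])  # compare lsoda's status for each mu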
In case it was something wrong with the system itself, I also implemented the model in R as follows:
require(deSolve)
sir.model <- function (t, x, params) {
  S <- x[1]
  I <- x[2]
  R <- x[3]
  with(
    as.list(params),
    {
      dS <- -beta*S*I/(S+I+R) - mu*S + mu*(S+I+R)
      dI <- beta*S*I/(S+I+R) - gamma*I - mu*I
      dR <- gamma*I - mu*R
      res <- c(dS, dI, dR)
      list(res)
    }
  )
}
times <- seq(0, 15000, by = 1)
params <- c(
  beta = 0.50,
  gamma = 1/10,
  mu = 1/25550
)
xstart <- c(S = 99, I = 1, R = 0)
out <- as.data.frame(lsoda(xstart, times, sir.model, params))
This also uses lsoda, but seems to be going off without a hitch. Can anyone see what's going wrong in the Python code?