This paper studies an infinite-server queue in a Markov environment, that is, an infinite-server queue whose arrival rate equals $\lambda_i$ when an external Markov process is in state $i$. The service times have a general distribution that depends on the state of the background process at the arrival epoch. We start by deriving explicit formulas for the mean and variance of the number of customers in the system at time $t\ge 0$, given that the system starts empty. The special case of exponential service times is studied in detail, resulting in a recursive scheme to compute the moments of the number of customers at an exponentially distributed time, as well as the steady-state moments. Then we consider an asymptotic regime in which the arrival rates are sped up by a factor $N$, and the transition rates of the background process by a factor $N^{1+\varepsilon}$ (for some $\varepsilon>0$). Under this scaling it turns out that the number of customers at time $t\ge 0$ is asymptotically normally distributed; in addition, convergence of the finite-dimensional distributions is proven.
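The model described above is straightforward to simulate, which is useful for sanity-checking the formulas. The sketch below (not from the paper; all names and the two-state example are illustrative) runs one sample path of an infinite-server queue modulated by a background continuous-time Markov chain with generator $Q$: during a sojourn in state $i$, arrivals occur at rate $\lambda_i$, and each arriving customer draws an exponential service time with the rate $\mu_i$ tied to the state at its arrival epoch. It returns the number of customers present at time $T$, starting empty.

```python
import math
import random

def poisson_sample(mean, rng):
    # Knuth's inversion method; adequate for the modest means used here.
    if mean <= 0.0:
        return 0
    L, k, p = math.exp(-mean), 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def simulate_modulated_infinite_server(T, Q, lam, mu, rng):
    """One run of an infinite-server queue in a Markov environment.

    Q   : generator matrix of the background chain (list of lists)
    lam : lam[i] is the arrival rate while the background is in state i
    mu  : mu[i] is the exponential service rate for customers arriving
          while the background is in state i
    Returns the number in system at time T, starting empty in state 0.
    """
    state, t, count = 0, 0.0, 0
    n = len(Q)
    while t < T:
        rate_out = -Q[state][state]
        # Sojourn in the current background state (infinite if absorbing).
        dwell = rng.expovariate(rate_out) if rate_out > 0 else float("inf")
        end = min(t + dwell, T)
        # Arrivals on [t, end) form a Poisson process with rate lam[state];
        # given their number, arrival epochs are i.i.d. uniform on [t, end).
        for _ in range(poisson_sample(lam[state] * (end - t), rng)):
            a = t + rng.random() * (end - t)
            s = rng.expovariate(mu[state])  # service rate fixed at arrival
            if a + s > T:                   # still in system at time T
                count += 1
        if end >= T:
            break
        # Jump to the next background state with probability Q[i][j]/rate_out.
        r, acc = rng.random() * rate_out, 0.0
        for j in range(n):
            if j == state:
                continue
            acc += Q[state][j]
            if r < acc:
                state = j
                break
        t = end
    return count
```

As a consistency check: if both background states have identical parameters, the model collapses to an ordinary M/M/$\infty$ queue, whose mean number in system at a large time $t$ (starting empty) approaches $\lambda/\mu$, so Monte Carlo averages of this function should be close to that value.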