Cell nucleus segmentation is a fundamental task in biomedical image analysis. Generating realistic cell nucleus data with ground-truth masks can help address difficulties such as insufficient training data for deep learning models and the need to handle 'hard' cases (e.g., tightly clumped nuclei). Known nucleus generation methods produce individual nucleus masks from parametric models or by directly transforming real masks; such methods struggle to capture and simulate the distributions of real nuclei and the interactions among hard nuclei. In this paper, we propose a new three-stage coarse-to-fine nucleus generation method for 2D and 3D nucleus segmentation. The first stage simulates the positions and sizes of nuclei; the second stage simulates the shapes of nuclei and the interactions among clumped nuclei; the third stage simulates the textures of nuclei. We evaluate our method on 2D and 3D cell nucleus image datasets. Experimental results show that our nucleus generation method considerably improves cell nucleus segmentation performance and outperforms known nucleus generation methods when only a small amount of training data is available.
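The three-stage coarse-to-fine pipeline can be sketched as follows. This is a minimal illustrative skeleton only: the sampling distributions, the disk-based shape model, and the noise-based texture model are hypothetical placeholders standing in for the paper's actual stages, which are not specified here.

```python
# Hypothetical sketch of a three-stage coarse-to-fine nucleus generator.
# All distributions and models below are illustrative placeholders, not
# the method described in the paper.
import numpy as np

def simulate_layout(n_nuclei, img_size, rng):
    """Stage 1: sample nucleus centers and radii (positions and sizes)."""
    centers = rng.uniform(0, img_size, size=(n_nuclei, 2))
    radii = rng.uniform(4, 10, size=n_nuclei)
    return centers, radii

def simulate_shapes(centers, radii, img_size):
    """Stage 2: rasterize each nucleus as a disk; overlapping disks
    crudely stand in for interactions among clumped nuclei."""
    yy, xx = np.mgrid[0:img_size, 0:img_size]
    masks = np.zeros((img_size, img_size), dtype=np.int32)
    for label, (c, r) in enumerate(zip(centers, radii), start=1):
        disk = (yy - c[0]) ** 2 + (xx - c[1]) ** 2 <= r ** 2
        masks[disk] = label  # later nuclei overwrite earlier ones where clumped
    return masks

def simulate_textures(masks, rng):
    """Stage 3: fill each nucleus with a simple noisy intensity pattern."""
    image = rng.normal(0.1, 0.02, size=masks.shape)  # dim background
    for label in np.unique(masks[masks > 0]):
        region = masks == label
        image[region] = rng.normal(0.7, 0.1, size=region.sum())
    return np.clip(image, 0.0, 1.0)

rng = np.random.default_rng(0)
centers, radii = simulate_layout(12, 128, rng)   # coarse: where and how big
masks = simulate_shapes(centers, radii, 128)     # finer: per-nucleus masks
image = simulate_textures(masks, rng)            # finest: synthetic intensities
```

Each stage refines the output of the previous one, so a real implementation could replace any single stage (e.g., swapping the texture model for a learned generator) without changing the others.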