This study investigates how the number of layers and the variety of templates influence the phenomena exhibited by multi-layer cellular neural networks (MCNNs), with particular attention to learning problems. We show that the more templates an MCNN adopts, the richer the phenomena it exhibits; equivalently, such networks are more efficient from the learning perspective. Likewise, MCNNs with more layers exhibit richer phenomena than those with fewer layers. A novel phenomenon emerges when the effect of the number of layers is examined with the templates held fixed.
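The templates referred to above are the space-invariant coupling weights of a cellular neural network layer. As a minimal illustrative sketch (not the construction used in this paper), the following assumes the standard Chua–Yang single-layer dynamics, dx/dt = -x + A*y + B*u + z, with the piecewise-linear output y = f(x); the template values A, B, z below are hypothetical choices for demonstration only:

```python
import numpy as np

def f(x):
    # Standard CNN piecewise-linear output: f(x) = (|x+1| - |x-1|) / 2
    return 0.5 * (np.abs(x + 1) - np.abs(x - 1))

def cnn_step(x, u, A, B, z, dt=0.05):
    """One Euler step of a 1-D CNN layer: feedback template A,
    feed-forward template B, bias z (3-entry templates, zero-padded edges)."""
    y = f(x)
    Ay = np.convolve(y, A, mode="same")   # template A applied to outputs
    Bu = np.convolve(u, B, mode="same")   # template B applied to inputs
    return x + dt * (-x + Ay + Bu + z)

# Hypothetical templates chosen only for illustration (self-feedback > 1
# makes each cell bistable, so outputs saturate to +1 or -1).
A = np.array([0.0, 2.0, 0.0])
B = np.array([0.0, 1.0, 0.0])
z = 0.0

rng = np.random.default_rng(0)
u = rng.uniform(-1, 1, size=16)   # input pattern
x = np.zeros(16)                  # initial states

for _ in range(400):              # iterate toward a steady output pattern
    x = cnn_step(x, u, A, B, z)

print(np.sign(f(x)))              # saturated output pattern (entries of +/-1)
```

Varying the entries of A and B changes which steady output patterns the network can realize, which is one concrete sense in which a larger set of admissible templates yields richer phenomena; a multi-layer variant would feed the output of one such layer as the input u of the next.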