Some embodiments provide a neural network inference circuit (NNIC) for executing a neural network that includes multiple computation nodes at multiple layers. The NNIC includes a set of clusters of core computation circuits and a channel that connects the core computation circuits and includes a separate segment corresponding to each of the clusters. The NNIC also includes a fabric controller circuit, a cluster controller circuit for each of the clusters, and a core controller circuit for each of the core computation circuits. The fabric controller circuit receives high-level neural network instructions from a microprocessor and parses these instructions.
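
To make the controller hierarchy concrete, the following is a minimal software sketch of how a fabric controller might fan a parsed high-level instruction out to per-cluster and per-core controllers. All names (FabricController, ClusterController, CoreController, HighLevelInstr, fabric_execute) and the cluster and core counts are illustrative assumptions, not taken from the embodiments; the actual circuit implements these controllers in hardware rather than software.

/* Sketch of the controller hierarchy; all names and counts are assumptions. */
#include <stdio.h>

#define NUM_CLUSTERS      4   /* assumed number of clusters             */
#define CORES_PER_CLUSTER 4   /* assumed core computation circuits each */

/* One high-level instruction as received from the microprocessor. */
typedef struct {
    int layer_id;   /* which layer of the network to execute        */
    int op_code;    /* kind of operation (assumed encoding)         */
} HighLevelInstr;

/* Core controller: drives a single core computation circuit. */
typedef struct {
    int core_id;
} CoreController;

/* Cluster controller: drives the cores in one cluster and the
 * channel segment that corresponds to that cluster. */
typedef struct {
    int cluster_id;
    int channel_segment_id;                    /* one segment per cluster */
    CoreController cores[CORES_PER_CLUSTER];
} ClusterController;

/* Fabric controller: parses high-level instructions and hands
 * lower-level configuration down to each cluster controller. */
typedef struct {
    ClusterController clusters[NUM_CLUSTERS];
} FabricController;

/* Fan one parsed instruction out to every cluster and core.
 * The real circuit would emit configuration data; here we just print. */
static void fabric_execute(FabricController *fc, HighLevelInstr instr)
{
    for (int c = 0; c < NUM_CLUSTERS; c++) {
        ClusterController *cc = &fc->clusters[c];
        printf("cluster %d (channel segment %d): layer %d, op %d\n",
               cc->cluster_id, cc->channel_segment_id,
               instr.layer_id, instr.op_code);
        for (int k = 0; k < CORES_PER_CLUSTER; k++)
            printf("  core %d configured\n", cc->cores[k].core_id);
    }
}

int main(void)
{
    FabricController fc;
    for (int c = 0; c < NUM_CLUSTERS; c++) {
        fc.clusters[c].cluster_id = c;
        fc.clusters[c].channel_segment_id = c;  /* segment c serves cluster c */
        for (int k = 0; k < CORES_PER_CLUSTER; k++)
            fc.clusters[c].cores[k].core_id = c * CORES_PER_CLUSTER + k;
    }
    HighLevelInstr instr = { 0, 1 };  /* assumed example: layer 0, op code 1 */
    fabric_execute(&fc, instr);
    return 0;
}

In this sketch the one-segment-per-cluster relationship is modeled simply by giving each cluster controller the index of its channel segment; the hierarchy (fabric controller over cluster controllers over core controllers) mirrors the three controller levels described above.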