We propose a novel learning scheme called network morphism, which morphs a parent network into a child network, enabling fast knowledge transfer. The child network achieves the performance of the parent network immediately, and its performance continues to improve as training proceeds. The proposed scheme supports network morphism in an expanding mode for arbitrary non-linear neurons, covering depth, width, kernel-size, and subnet morphing operations.
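To make the core idea concrete, here is a minimal sketch (not the paper's exact construction) of a function-preserving depth morph on a toy ReLU network: a new layer is inserted with identity-initialized weights, so the child network computes exactly the same function as the parent at initialization. All names and shapes below are illustrative assumptions.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

rng = np.random.default_rng(0)

# Parent network: two linear layers with a ReLU in between.
W1 = rng.standard_normal((4, 8))
W2 = rng.standard_normal((8, 3))

def parent(x):
    return relu(x @ W1) @ W2

# Depth morphing (sketch): insert a new layer initialized to the
# identity. Since ReLU outputs are non-negative, relu(h @ I) == h,
# so the child matches the parent's outputs at initialization.
W_new = np.eye(8)

def child(x):
    h = relu(x @ W1)
    h = relu(h @ W_new)  # newly inserted, identity-initialized layer
    return h @ W2

x = rng.standard_normal((5, 4))
assert np.allclose(parent(x), child(x))
```

The extra layer's weights (`W_new`) are then free to deviate from the identity during further training, so the child can only improve on the parent's starting performance.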