Summary of Research Achievements: Deep neural networks are currently receiving great attention from both industry and academia; however, their internal mechanisms remain unclear. In our research, we start by analyzing the activation functions of neural networks and go on to study their internal mechanism from a space-mapping perspective. Specifically: 1) We discover that, when a piecewise linear activation function is used, the parameters inside a neural network play two roles at once: a task-driven hidden-signal producer and a gate-signal generator. 2) We experimentally find that these two roles are learned at different speeds. 3) We propose a method to decouple the two roles for a more stable learning process, and we formulate a two-stage learning procedure. 4) We prove that the second stage can be optimized via a kernel trick, which accelerates learning on medium-sized datasets.
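The two-role view in point 1) can be sketched for a ReLU layer: the same weights produce both the pre-activation (the hidden signal) and, through its sign, a binary gate, and the ReLU output is exactly their product. The names below (`signal`, `gate`) are illustrative labels of ours, not identifiers from a released implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))   # layer weights
b = rng.normal(size=4)        # layer bias
x = rng.normal(size=3)        # input vector

signal = W @ x + b                      # role 1: task-driven hidden signal
gate = (signal > 0).astype(signal.dtype)  # role 2: gate signal from the same parameters

# ReLU(Wx + b) factors into gate * signal
relu_out = np.maximum(signal, 0.0)
assert np.allclose(relu_out, gate * signal)
```

Decoupling the roles, as in point 3), would then amount to parameterizing `gate` and `signal` with separate weights instead of sharing one matrix.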