Rectified Linear Unit (ReLU): an additional non-linear operation applied right after every convolution. It introduces non-linearity into the CNN, which helps the network learn complex patterns in the data. The output function of ReLU is:

f(x) = max(0, x)

In other words, ReLU passes positive values through unchanged and replaces every negative value with zero.
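As a minimal sketch (using NumPy; the function name `relu` and the sample feature map are illustrative, not from the original text), applying ReLU element-wise to a convolution output looks like:

```python
import numpy as np

def relu(x):
    """Element-wise ReLU: max(0, x) for every entry of the array."""
    return np.maximum(0, x)

# A hypothetical 2x2 feature map produced by a convolution;
# it may contain negative activations.
feature_map = np.array([[-1.5, 2.0],
                        [ 0.0, -3.2]])

activated = relu(feature_map)
print(activated)  # negatives become 0.0, non-negatives pass through
```

Running this zeroes out the negative entries, leaving `[[0.0, 2.0], [0.0, 0.0]]`, so only positive activations propagate to the next layer.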