Group convolution is widely used in lightweight convolutional neural networks (CNNs) because it effectively reduces the number of parameters and the computational cost. However, feature maps in different groups cannot communicate with one another, which restricts the representation capability of the network. To address this issue, in this work we propose a novel convolution operation named Hierarchical Group Convolution (HGC) for building computationally efficient neural networks. Unlike standard group convolution, which blocks inter-group information exchange and thereby induces severe performance degradation, HGC hierarchically fuses the feature maps of each group and exploits inter-group information effectively. Building on the proposed operation, we introduce an efficient compact network named HGCNet. Extensive experiments on image classification demonstrate that HGCNet achieves a significant reduction in computational cost and parameter count while attaining accuracy comparable to prior CNN architectures designed for mobile devices.
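To make the idea concrete, the following is a minimal PyTorch sketch of one possible reading of HGC: the input channels are split into groups, and each group's convolution also receives the previous group's output, so information propagates hierarchically across groups. The class name `HGC`, the addition-based fusion, and all hyperparameters are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn


class HGC(nn.Module):
    """Hypothetical sketch of Hierarchical Group Convolution.

    Channels are split into `groups` equal chunks; group i is fused
    (here, by addition -- an assumed fusion choice) with the output of
    group i-1 before its own convolution, so inter-group information
    flows hierarchically instead of being blocked as in standard
    group convolution.
    """

    def __init__(self, channels: int, groups: int = 4):
        super().__init__()
        assert channels % groups == 0, "channels must be divisible by groups"
        self.groups = groups
        width = channels // groups
        # One small conv per group, each operating on width channels.
        self.convs = nn.ModuleList(
            nn.Conv2d(width, width, kernel_size=3, padding=1, bias=False)
            for _ in range(groups)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        splits = torch.chunk(x, self.groups, dim=1)
        outs = []
        prev = None
        for split, conv in zip(splits, self.convs):
            # Fuse with the previous group's output (skip for the first group).
            fused = split if prev is None else split + prev
            prev = conv(fused)
            outs.append(prev)
        return torch.cat(outs, dim=1)


x = torch.randn(2, 32, 8, 8)
y = HGC(32, groups=4)(x)
print(tuple(y.shape))  # same shape as the input: (2, 32, 8, 8)
```

Note that the per-group convolutions keep the parameter count of group convolution, while the additive fusion is what restores cross-group communication.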