Image style transfer, which re-renders the content of one image in the style of another, is a current research focus in artificial intelligence and computer vision. The proliferation of image datasets and the development of deep learning models have led to numerous models and algorithms for image style transfer. Despite the notable successes of deep-learning-based style transfer in many areas, it still faces significant challenges, notably high computational cost and limited generalization. In this paper, we present a simple yet effective method to address these challenges. The essence of our approach lies in integrating wavelet transforms into the whitening and coloring processes within an image reconstruction network (WTN). The WTN directly aligns the feature covariance of the content image with that of the style image. We demonstrate the effectiveness of our algorithm through examples, generating high-quality stylized images, and compare it with several recent methods.
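The covariance alignment mentioned above is the classic whitening and coloring transform (WCT). The WTN architecture itself is not reproduced here; the following is a minimal NumPy sketch of plain WCT under the usual formulation, with illustrative variable names, showing how content features are first whitened and then colored so that their covariance matches that of the style features:

```python
import numpy as np

def whiten_color(content_feat, style_feat, eps=1e-5):
    """Whitening and coloring transform (WCT) sketch.

    Both inputs are (C, N) matrices: C feature channels, N spatial
    positions. Returns content features whose covariance matches the
    style features' covariance. `eps` regularizes the eigendecomposition.
    """
    # Whitening: center the content features and remove their correlations
    # via an eigendecomposition of the (regularized) covariance matrix.
    c_mean = content_feat.mean(axis=1, keepdims=True)
    fc = content_feat - c_mean
    cov_c = fc @ fc.T / (fc.shape[1] - 1) + eps * np.eye(fc.shape[0])
    wc, Ec = np.linalg.eigh(cov_c)
    whitened = Ec @ np.diag(wc ** -0.5) @ Ec.T @ fc  # covariance ~ identity

    # Coloring: impose the style covariance and re-add the style mean.
    s_mean = style_feat.mean(axis=1, keepdims=True)
    fs = style_feat - s_mean
    cov_s = fs @ fs.T / (fs.shape[1] - 1) + eps * np.eye(fs.shape[0])
    ws, Es = np.linalg.eigh(cov_s)
    colored = Es @ np.diag(ws ** 0.5) @ Es.T @ whitened
    return colored + s_mean
```

In the full method these features would come from an encoder, and the transformed features would be fed to the image reconstruction network (decoder) to produce the stylized image.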