Abstract: Change detection in remote sensing images plays an important role in observing the Earth's surface. Over the past few years, deep learning has been widely used in image analysis due to its powerful feature extraction capability, and it has shown great potential for the change detection task. However, current methods still have difficulty identifying complex changes because temporal information is insufficiently explored. In addition, the complex contextual information in high-resolution images further limits accuracy. To clarify the temporal information of complex changes and capture the relational contexts of high-resolution images, a dual-perspective change contextual network (DPCC-Net) is proposed for change detection in high-resolution remote sensing images. The proposed method emphasizes the extraction and refinement of change features through bi-temporal feature fusion and contextual modeling. First, a Siamese network is used to extract bi-temporal features. Then, a novel dual-perspective fusion (DPF) module is proposed, which takes each temporal feature as a reference in turn and obtains two sets of change features, one from each temporal perspective, thereby increasing sensitivity to change-related information and enabling better identification of changes in complex scenes. Next, a change context module (CCM) is proposed to incorporate rich contextual information into the change features. The CCM models the relation and similarity between each pixel and its contextual pixels, thereby improving the completeness of detected change objects. Quantitative and qualitative results on three change detection datasets indicate that DPCC-Net achieves state-of-the-art performance. The code of DPCC-Net will be released at: https://github.com/SQD1/DPCC-Net.
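To make the dual-perspective idea concrete, the following is a minimal illustrative sketch of how bi-temporal features could be fused from each temporal perspective. It is not the authors' implementation: the class name `DualPerspectiveFusion`, the channel sizes, and the use of a simple convolution over the reference feature and the signed feature difference are all assumptions for illustration only.

```python
import torch
import torch.nn as nn


class DualPerspectiveFusion(nn.Module):
    """Sketch of a dual-perspective fusion: produce two sets of change
    features, each taking one temporal feature map as the reference.
    Layer choices here are assumptions, not the paper's DPF design."""

    def __init__(self, channels: int):
        super().__init__()
        # One fusion branch per temporal perspective; each branch sees the
        # reference feature concatenated with the (reference - other) difference.
        self.fuse_t1 = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )
        self.fuse_t2 = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, f1: torch.Tensor, f2: torch.Tensor):
        # Change features from the perspective of time 1 (f1 as reference).
        c1 = self.fuse_t1(torch.cat([f1, f1 - f2], dim=1))
        # Change features from the perspective of time 2 (f2 as reference).
        c2 = self.fuse_t2(torch.cat([f2, f2 - f1], dim=1))
        return c1, c2


if __name__ == "__main__":
    # Toy bi-temporal feature maps, as if produced by a shared (Siamese) backbone.
    f1 = torch.randn(2, 64, 32, 32)
    f2 = torch.randn(2, 64, 32, 32)
    dpf = DualPerspectiveFusion(channels=64)
    c1, c2 = dpf(f1, f2)
    print(c1.shape, c2.shape)  # two sets of change features, one per perspective
```

The actual DPF and the change context module (CCM) in DPCC-Net may differ substantially; this sketch only conveys the notion of deriving two perspective-specific sets of change features from the same bi-temporal pair.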