Multi-label feature selection aims to construct a reduced feature space for discriminating multi-label instances. Given the complex structures of the label and feature spaces, a critical issue that directly determines selection performance is how to induce consistent information from both spaces to steer feature selection. Existing approaches tackle this issue from various space-aware perspectives, without sufficiently accounting for the negative effects of irrelevant features and imbalanced neighbors on inferring space structure. Inspired by the strength of margin theory in assessing reliable space structure, in this paper we approach multi-label feature selection within a learning framework that preserves label-feature space consistency through a probabilistic margin. In contrast to existing approaches, our model assesses a weighted margin based on probabilistic nearest neighbors and preserves consistent margin information across the label and feature spaces. In this manner, label-feature space consistency is naturally achieved, which helps the model capture discriminative features suited to multi-label learning tasks while eliminating noisy features. Experimental results on multi-label data sets demonstrate the encouraging performance of the proposed model.
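To make the core idea concrete, the following is a minimal, hedged sketch of scoring features with a probabilistic-nearest-neighbor margin. It is not the paper's actual model: the softmax-style neighbor weights, the cosine label similarity used to split neighbors into label-consistent "hits" and label-inconsistent "misses", and the ReliefF-style per-feature margin are all illustrative assumptions standing in for the full formulation.

```python
import numpy as np


def probabilistic_margin_scores(X, Y, sigma=1.0):
    """Score features by a probabilistic-neighbor margin (illustrative sketch).

    X: (n, d) feature matrix; Y: (n, q) binary label matrix.
    Returns a (d,) array; larger scores suggest more discriminative features.
    """
    # Pairwise squared distances in feature space.
    D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # Probabilistic neighbor weights: closer instances are more likely neighbors.
    W = np.exp(-D / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    W /= W.sum(axis=1, keepdims=True)
    # Label-space similarity (cosine on binary label vectors) splits each
    # instance's neighbors into label-consistent "hits" and "misses".
    Yn = Y / np.maximum(np.linalg.norm(Y, axis=1, keepdims=True), 1e-12)
    S = Yn @ Yn.T
    hit = W * S
    miss = W * (1.0 - S)
    hit /= np.maximum(hit.sum(axis=1, keepdims=True), 1e-12)
    miss /= np.maximum(miss.sum(axis=1, keepdims=True), 1e-12)
    # Per-feature margin: expected feature difference to misses minus to hits,
    # averaged over all instances (ReliefF-style, with probabilistic weights).
    diff = np.abs(X[:, None, :] - X[None, :, :])  # (n, n, d)
    margin = (miss[:, :, None] * diff).sum(1) - (hit[:, :, None] * diff).sum(1)
    return margin.mean(axis=0)
```

On a toy data set where the first feature separates the labels and the second is noise, the first feature receives the higher margin score, illustrating how the probabilistic margin favors discriminative features over noisy ones.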