Although previous deep imputation methods (e.g., Generative Adversarial Network (GAN)-based methods) have been widely designed to impute missing values, they still suffer from a lack of both imputation diversity and generalization ability. In this paper, we propose a new GAN-based imputation method, namely Meta-GAIN, which investigates a new generator to achieve diverse imputation and generalization ability. Specifically, we employ the Kullback-Leibler (KL) divergence to achieve diversity of the imputed data by generating a continuous embedding space of the original data. We also design a task regularizer (i.e., a cross-entropy between the predicted results and the true labels) that pushes samples within the same class close together and samples in different classes far apart, thereby achieving generalization ability. Moreover, we theoretically prove that our proposed method achieves generalization ability. In addition, we design a new meta network to efficiently optimize our objective function. Experimental results on real datasets show that our proposed method outperforms all comparison methods under different missing mechanisms in terms of both imputation performance and downstream classification tasks.
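The abstract describes a generator objective that combines an adversarial term with a KL-divergence term (for imputation diversity via a continuous embedding space) and a cross-entropy task regularizer (for generalization). The sketch below illustrates how such a combined loss could be assembled; it is a minimal NumPy illustration under stated assumptions, not the paper's actual formulation. In particular, the Gaussian KL form, the function names (`kl_divergence`, `task_regularizer`, `meta_gain_generator_loss`), and the weighting coefficients `alpha` and `beta` are all hypothetical.

```python
import numpy as np

def kl_divergence(mu, logvar):
    """KL(N(mu, sigma^2) || N(0, I)) for a Gaussian embedding.

    Assumed form of the diversity term: regularizing the generator's
    embedding toward a standard normal yields a continuous latent
    space from which diverse imputations can be sampled.
    """
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)

def task_regularizer(probs, labels):
    """Cross-entropy between predicted class probabilities and true labels.

    probs: (n_samples, n_classes) rows summing to 1; labels: integer ids.
    """
    n = len(labels)
    return -np.mean(np.log(probs[np.arange(n), labels] + 1e-12))

def meta_gain_generator_loss(adv_loss, mu, logvar, probs, labels,
                             alpha=1.0, beta=1.0):
    """Hypothetical combined generator objective:
    adversarial term + alpha * KL diversity term + beta * task regularizer.
    """
    return (adv_loss
            + alpha * kl_divergence(mu, logvar)
            + beta * task_regularizer(probs, labels))
```

A standard-normal embedding (`mu = 0`, `logvar = 0`) makes the KL term vanish, and confident correct predictions drive the task regularizer toward zero, so the combined loss then reduces to the adversarial term alone.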