Large Language Models (LLMs) excel at generating synthetic data, but ensuring its quality and diversity remains challenging. We propose Genetic Prompt, a novel framework that combines genetic algorithms (GAs) with LLMs for synthetic data generation. Unlike previous GA approaches, Genetic Prompt implicitly represents semantic text attributes as gene sequences. Additionally, we implement an active learning scheme to optimize the GA process, eliminating traditional fitness evaluation and expanding the offspring search space. Experiments on four relation extraction datasets show that Genetic Prompt significantly outperforms state-of-the-art baselines. Furthermore, analysis reveals that Genetic Prompt generates more diverse data, with distributions closer to those of the gold dataset.
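To make the core idea concrete, the sketch below shows one way a prompt-level "crossover" step of this kind could look: two parent examples are handed to an LLM, which blends their implicit attributes (style, entities, relation type) into a new offspring example, rather than manipulating explicit gene encodings. This is an illustrative assumption, not the authors' implementation; `llm_generate` is a hypothetical stand-in for any chat/completions call, and the prompt wording is invented for illustration.

```python
import random
from typing import Callable, List


def crossover_prompt(parent_a: str, parent_b: str, task: str) -> str:
    # Treat each parent's semantic attributes as implicit "genes" and ask the
    # LLM to blend them into a single new, label-consistent example.
    return (
        f"Task: {task}\n"
        f"Parent example 1: {parent_a}\n"
        f"Parent example 2: {parent_b}\n"
        "Write one new example that combines attributes of both parents "
        "while remaining plausible and consistent with the task label."
    )


def generate_offspring(
    parents: List[str],
    task: str,
    llm_generate: Callable[[str], str],  # hypothetical LLM call (assumption)
    n_offspring: int = 10,
) -> List[str]:
    # Sample parent pairs and produce offspring examples via the LLM prompt.
    offspring = []
    for _ in range(n_offspring):
        parent_a, parent_b = random.sample(parents, 2)
        offspring.append(llm_generate(crossover_prompt(parent_a, parent_b, task)))
    return offspring
```

In a full GA-style loop, the parent pool would then be refreshed for the next generation; the abstract indicates the authors do this with an active learning scheme rather than an explicit fitness function, a detail the sketch above deliberately leaves out.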