[2]Hanqing Zhang, Haolin Song, Shaoyu Li, Ming Zhou, and Dawei Song. 2023. A survey of controllable text generation using transformer-based pre-trained language models. Preprint, arXiv:2201.05337.
[3]Jasper Dekoninck, Marc Fischer, Luca Beurer-Kellner, and Martin Vechev. 2024. Controlled text generation via language model arithmetic. In The Twelfth International Conference on Learning Representations.
[4]Xin Liu, Muhammad Khalifa, and Lu Wang. 2023. BOLT: Fast energy-based controlled text generation with tunable biases. Preprint, arXiv:2305.12018.
[5]Jonathan Pei, Kevin Yang, and Dan Klein. 2023. PREADD: Prefix-adaptive decoding for controlled text generation. In Findings of the Association for Computational Linguistics: ACL 2023, Toronto, Canada, July 9-14, 2023, pages 10018–10037. Association for Computational Linguistics.
[6]Timo Schick, Sahana Udupa, and Hinrich Schütze. 2021. Self-diagnosis and self-debiasing: A proposal for reducing corpus-based bias in NLP. Preprint, arXiv:2103.00453.
[7]Kevin Yang and Dan Klein. 2021. FUDGE: Controlled text generation with future discriminators. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL-HLT 2021, Online, June 6-11, 2021, pages 3511–3535. Association for Computational Linguistics.