AUTOMATIC GENERATION OF CULINARY RECIPES FROM A GIVEN SET OF INGREDIENTS

Authors

  • Branislav Anđelić

DOI:

https://doi.org/10.24867/18BE03Andjelic

Keywords:

recipe generation, natural language generation, sequence-to-sequence models

Abstract

With the increasing popularity of online recipe-sharing websites, the amount of widely available culinary data is larger than ever. People are constantly looking for ways to find a recipe and prepare their daily meals quickly. This paper presents a system for automatic recipe generation from the ingredients currently available to the user, using sequence-to-sequence models and the extraction of a suitable subset of the given ingredients. The paper shows that generating meaningful recipe text is feasible for any given set of ingredients, but it questions the practical usefulness of such recipes. With its new ideas for processing culinary data, this paper provides a solid basis for future work in the field.
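
For readers who want a concrete picture of the approach summarized above, the following is a minimal, illustrative sequence-to-sequence sketch in PyTorch that maps ingredient tokens to recipe-text tokens. The architecture, vocabulary sizes, and hyperparameters are assumptions made purely for illustration; they are not taken from the paper, which should be consulted for the actual model and for the ingredient-subset extraction step.

# Minimal encoder-decoder (sequence-to-sequence) sketch: ingredients -> recipe text.
# All module names, vocabulary sizes, and hyperparameters are illustrative assumptions,
# not the configuration used in the paper.
import torch
import torch.nn as nn

class IngredientEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, ingredient_ids):
        # ingredient_ids: (batch, num_ingredient_tokens)
        embedded = self.embed(ingredient_ids)
        _, hidden = self.rnn(embedded)            # hidden: (1, batch, hid_dim)
        return hidden

class RecipeDecoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, recipe_ids, hidden):
        # recipe_ids: (batch, num_recipe_tokens), fed with teacher forcing during training
        embedded = self.embed(recipe_ids)
        output, hidden = self.rnn(embedded, hidden)
        return self.out(output), hidden           # logits over the recipe vocabulary

# Toy usage: encode a batch of ingredient token ids, then score recipe tokens.
encoder = IngredientEncoder(vocab_size=5000)
decoder = RecipeDecoder(vocab_size=20000)
ingredients = torch.randint(0, 5000, (2, 10))     # 2 examples, 10 ingredient tokens each
recipe_prefix = torch.randint(0, 20000, (2, 15))  # 15 recipe tokens per example
logits, _ = decoder(recipe_prefix, encoder(ingredients))
print(logits.shape)                               # torch.Size([2, 15, 20000])

In practice, the ingredient-subset extraction described in the abstract would run before encoding, so that only a compatible subset of the user's available ingredients is passed to the encoder; the sketch above omits that step.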

Published

2022-07-04

Section

Electrotechnical and Computer Engineering