Recipe representation plays an important role in food computing for perception, recognition, recommendation, and other applications. Learning pretrained recipe embeddings is a challenging task, as there is a lack of high-quality annotated food datasets. In this paper, we provide a joint approach for learning effective pretrained recipe embeddings using both the ingredients and cooking instructions. We present Reciptor, a novel set-transformer-based joint model that learns recipe representations, preserves permutation invariance over the ingredient set, and uses a novel knowledge graph (KG) derived triplet sampling approach to optimize the learned embeddings so that related recipes are closer in the latent semantic space. The embeddings are further jointly optimized by combining similarity among cooking instructions with a KG-based triplet loss. We experimentally show that Reciptor's recipe embeddings outperform state-of-the-art baselines on two newly designed downstream classification tasks by a wide margin.
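The KG-derived triplet objective described above is, in spirit, a standard margin-based triplet loss: an anchor recipe embedding is pulled toward a related (positive) recipe and pushed away from an unrelated (negative) one. A minimal NumPy sketch of that idea follows; the toy embeddings, dimensionality, and margin value are illustrative assumptions, not the paper's actual configuration:

```python
import numpy as np

def triplet_margin_loss(anchor, positive, negative, margin=1.0):
    """Margin-based triplet loss on embedding vectors.

    Encourages dist(anchor, positive) + margin <= dist(anchor, negative),
    so related recipes end up closer in the latent space than unrelated ones.
    """
    d_pos = np.linalg.norm(anchor - positive)  # distance to related recipe
    d_neg = np.linalg.norm(anchor - negative)  # distance to unrelated recipe
    return max(0.0, d_pos - d_neg + margin)

# Hypothetical 4-d recipe embeddings for illustration only.
anchor   = np.array([1.0, 0.0, 0.0, 0.0])
positive = np.array([0.9, 0.1, 0.0, 0.0])  # e.g. a recipe in the same KG neighborhood
negative = np.array([0.5, 0.5, 0.0, 0.0])  # e.g. an unrelated recipe

print(triplet_margin_loss(anchor, positive, negative))
```

The `max(0, ...)` hinge means triplets that already satisfy the margin contribute no gradient, so training effort concentrates on pairs the embedding still confuses.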