Research Notes

Title: Reciptor: An Effective Pretrained Model for Recipe Representation Learning
Authors: Diya Li, Mohammed J. Zaki

Summary:
- A joint approach for learning effective pretrained recipe embeddings using both the ingredients and the cooking instructions.
- A novel Set Transformer-based joint model to learn recipe representations that preserve permutation invariance.

Framework Tags data: used for model validation..

Title: Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks
Authors: Juho Lee, Yoonho Lee, Jungtaek Kim, Adam R. Kosiorek, Seungjin Choi, Yee Whye Teh

0. Abstract
- Many ML tasks take a set as input, so the output should not depend on the order of its elements, i.e. the model should be *permutation invariant*; existing models fail to reflect this property.
- To address this limitation, the authors propose the Set Transformer, an attention-based neural network model.
- The Set Transformer is designed to capture interactions among the elements of the input set..
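As a toy illustration of why attention-based pooling is permutation invariant (this is only a single-head, parameter-free sketch, not the paper's actual PMA module, which uses multihead attention with learned projections and a feed-forward block), consider pooling a set with a fixed seed query vector:

```python
import math

def softmax(xs):
    # numerically stable softmax over a list of scores
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention_pool(seed, elements):
    """Pool a set of vectors into one vector via attention with a seed query.

    Each element's attention weight depends only on its own dot product
    with the seed, so reordering the set reorders (weight, element) pairs
    but leaves the weighted sum unchanged.
    """
    scores = [sum(s * e for s, e in zip(seed, x)) for x in elements]
    weights = softmax(scores)
    dim = len(seed)
    return [sum(w * x[d] for w, x in zip(weights, elements)) for d in range(dim)]

# hypothetical toy inputs, chosen only for the demonstration
seed = [0.5, -0.2, 0.1]
X = [[1.0, 2.0, 0.0], [0.3, -1.0, 2.0], [2.0, 0.5, 1.0]]

out1 = attention_pool(seed, X)
out2 = attention_pool(seed, list(reversed(X)))

# identical up to floating-point rounding, regardless of element order
assert all(abs(a - b) < 1e-9 for a, b in zip(out1, out2))
```

Permuting `X` changes nothing because both the softmax normalization and the weighted sum are symmetric functions of the set; this is the property the Set Transformer preserves while still modeling pairwise interactions through self-attention.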