Symbolic Knowledge Distillation: from General Language Models to Commonsense Models

Tags: conceptnet, NLP, transformers, graph representation, knowledge
Date: 11-04-2021 17:48 GMT

From a team at the University of Washington / Allen Institute for Artificial Intelligence.

Courtesy of Yannic Kilcher's YouTube channel.

General idea: use GPT-3 as a completion source given a set of prompts. There are only 7 linkage atoms (edges, so to speak) in these queries, but of course many actions / direct objects. The prompts are generated from the ATOMIC 2020 human-authored dataset and fed into the 175B-parameter DaVinci model, resulting in 165k examples across the 7 linkages after cleaning.
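The completion setup above can be sketched as a few-shot prompt builder: several human-authored (event, tail) pairs for one linkage relation are concatenated, then a new event is appended with the tail left blank for the model to complete. This is a minimal illustration, not the paper's actual templates — the relation verbalization and example events here are hypothetical.

```python
# Sketch of few-shot prompt construction for one linkage relation.
# The relation phrase and example events are hypothetical stand-ins
# for the paper's actual ATOMIC-derived prompts.

FEW_SHOT_EXAMPLES = [
    ("PersonX bakes bread", "to buy flour"),
    ("PersonX goes jogging", "to put on running shoes"),
]

def build_prompt(relation_phrase, examples, query_event):
    """Concatenate (event, tail) demonstrations for a single relation,
    then append the query event with the tail left for GPT-3 to complete."""
    lines = []
    for event, tail in examples:
        lines.append(f"Event: {event}. {relation_phrase} {tail}.")
    lines.append(f"Event: {query_event}. {relation_phrase}")
    return "\n".join(lines)

prompt = build_prompt("Before that, PersonX needed",
                      FEW_SHOT_EXAMPLES,
                      "PersonX writes a thesis")
print(prompt)
```

Sampling many completions per prompt across the 7 relations, then filtering the outputs, is what yields the large distilled corpus described above.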