Luca Cacini (talk | contribs)
Latest revision as of 14:12, 31 January 2024
What happens when language models become so pervasive that subsequent models are trained on language data largely produced by earlier models? The snake eats its own tail, and a self-collapsing feedback loop ensues.
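The feedback loop can be illustrated with a toy sketch (my own illustration, not a method from this page): fit a simple model, here a Gaussian, to some data, sample a new "training set" from the fitted model, and repeat. Each generation sees only the previous generation's outputs, and the estimated distribution tends to narrow over successive generations.

```python
import numpy as np

def collapse_demo(n_samples=50, n_generations=500, seed=0):
    """Repeatedly refit a Gaussian to its own samples and track the variance."""
    rng = np.random.default_rng(seed)
    # Generation 0: "human-written" data from a standard normal.
    data = rng.normal(loc=0.0, scale=1.0, size=n_samples)
    variances = [data.var()]
    for _ in range(n_generations):
        mu, sigma = data.mean(), data.std()      # "train" on the current corpus
        data = rng.normal(mu, sigma, n_samples)  # next corpus is model output only
        variances.append(data.var())
    return variances

if __name__ == "__main__":
    v = collapse_demo()
    print(f"variance at generation 0: {v[0]:.4f}")
    print(f"variance at generation {len(v) - 1}: {v[-1]:.4f}")
```

With a small sample size, the finite-sample estimation error compounds across generations and the variance tends to drift toward zero, a minimal analogue of the self-collapsing effect described above.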