Combined display of all available logs of wiki4print. You can narrow down the view by selecting a log type, the username (case-sensitive), or the affected page (also case-sensitive).
- 10:10, 31 January 2024 Luca Cacini created page File:Serpiente alquimica.jpeg (Ouroboros, drawing from a late medieval Byzantine Greek alchemical manuscript, 1478. Fol. 279 of Codex Parisinus graecus 2327, a copy, made by Theodoros Pelecanos (Pelekanos) of Corfu in Khandak, Iraklio, Crete, in 1478, of a lost manuscript of an early medieval tract attributed to Synesius of Cyrene (d. 412). The text of the tract is attributed to Stephanus of Alexandria (7th century).)
- 10:10, 31 January 2024 Luca Cacini uploaded File:Serpiente alquimica.jpeg (same summary as the page creation above)
- 09:39, 31 January 2024 Luca Cacini created page Luca comment 2 (Created page with "<!------------------------> <!-- do not remove this --> <div id="{{PAGENAME}}" class="comment"> <!------------------------> The resulting models may be narrow, entropic or homogeneous; biases may become progressively amplified; or the outcome may be something altogether harder to anticipate. What to do? Is it possible to simply tag synthetic outputs so that they can be excluded from future model training, or at least differentiated? Might it become necessary, conversel...")
- 09:33, 31 January 2024 Luca Cacini created page Luca comment 1 (Created page with "<!------------------------> <!-- do not remove this --> <div id="{{PAGENAME}}" class="comment"> <!------------------------> '' What happens when language models are so pervasive that subsequent models are trained on language data that was largely produced by other models’ previous outputs? The snake eats its own tail, and a self-collapsing feedback effect ensues.'' <!------------------------> <!-- do not remove this --> </div> Category:Content form - com...") (the feedback effect described in this comment is sketched in code after this log)
- 13:34, 16 January 2024 Luca Cacini moved page Luca to Luca - The Autophagic mode of production
- 13:15, 16 January 2024 Luca Cacini created page Luca (Created page with "<div class="metadata"> == The Autophagic mode of production: hacking the metabolism of AI. == '''Luca Cacini''' </div>In the metabolic process of content production, generative AI operates as an autophagic organism. Autophagy in biological systems can be summarized as “a natural process in which the body breaks down and absorbs its own tissue or cells” (''AUTOPHAGY | English Meaning - Cambridge Dictionary'', n.d.). In cells, this mechanism is dedicated to maintaining...") Tag: Visual edit
- 21:12, 8 January 2024 User account Luca Cacini was created
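The "self-collapsing feedback effect" described in the comment logged at 09:33, 31 January 2024 can be made concrete with a toy simulation. The sketch below is illustrative only and is not part of any wiki page: it assumes, as a deliberate simplification, that each model generation is fit solely to the previous generation's outputs and that a generative model undersamples the tails of its training data (modelled here as a 2-sigma cutoff). Under those assumptions, the diversity of the synthetic data shrinks generation by generation.

```python
import random
import statistics

# Toy model of recursive training: generation N is "trained" only on
# samples produced by generation N-1. The 2-sigma cutoff is an assumed
# stand-in for a generative model undersampling the tails of its
# training data; it is what drives the collapse in this sketch.

random.seed(0)
mean, stdev = 0.0, 1.0  # generation 0: the original "human" data

for generation in range(1, 11):
    # sample a synthetic corpus from the current model
    samples = [random.gauss(mean, stdev) for _ in range(1000)]
    # assumed tail loss: only "typical" outputs survive into the next corpus
    kept = [x for x in samples if abs(x - mean) < 2 * stdev]
    # refit the next generation's model on the truncated corpus
    mean = statistics.fmean(kept)
    stdev = statistics.stdev(kept)
    print(f"generation {generation:2d}: stdev = {stdev:.3f}")
```

Each pass loses some of the distribution's spread (truncating a Gaussian at plus or minus 2 standard deviations cuts its standard deviation to roughly 0.88 of the original), so after ten generations the synthetic data retains less than a third of the original spread: the snake eats its own tail.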