Better Language Models and Their Implications
GPT-2 displays a broad set of capabilities, including the ability to generate conditional synthetic text samples of unprecedented quality, where we prime the model with an input and have it generate a lengthy continuation. In addition, GPT-2 outperforms other language models trained on specific domains (like Wikipedia, news, or books) without needing to use these domain-specific training datasets. On language tasks like question answering, reading comprehension, summarization, and translation, GPT-2 begins to learn these tasks from the raw text, using no task-specific training data. While scores on these downstream tasks are far from state-of-the-art, they suggest that the tasks can benefit from unsupervised techniques, given sufficient (unlabeled) data and compute.
gpt2  ML  AI  text  text-generation  generative  OpenAI  2019 
2 days ago by zzkt
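The "prime with an input, generate a lengthy continuation" pattern described above is autoregressive sampling. As a minimal illustration (a toy order-2 character-level Markov model, a stand-in for GPT-2's transformer, not OpenAI's actual code), the loop looks like this:

```python
import random

# Toy stand-in for GPT-2's prime-then-continue loop: an order-2
# character-level Markov model trained on a tiny corpus. The real model
# is a transformer, but the sampling loop has the same shape:
# condition on the prompt, then repeatedly sample the next token.

def train(corpus, order=2):
    model = {}
    for i in range(len(corpus) - order):
        ctx = corpus[i:i + order]          # the conditioning context
        model.setdefault(ctx, []).append(corpus[i + order])
    return model

def generate(model, prompt, length=40, order=2, seed=0):
    rng = random.Random(seed)
    out = prompt
    for _ in range(length):
        ctx = out[-order:]                 # condition on the most recent output
        choices = model.get(ctx)
        if not choices:                    # dead end: context never seen in training
            break
        out += rng.choice(choices)         # sample the next character
    return out

corpus = "the model reads the text and the model writes the text again"
model = train(corpus)
print(generate(model, "the "))
```

The generated text always begins with the prompt and continues one token (here, one character) at a time, each step conditioned on everything produced so far; GPT-2 does the same over subword tokens with a learned distribution instead of raw counts.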
Iterograph - An iterative drawing tool
Let's explore the beauty of geometric art and minimalism. Discover the endless possibilities of iterative drawing and create wonderful abstract designs in no time!
art  generators  geometry  generative  webapp 
4 days ago by keynell
generated-space - generative algorithms
Generated Space is the result of a year-long endeavour to make computers do unexpected things.

It presents a wide range of different generative algorithms; from organic flow fields and particle systems to rigid fractals and grammar-based shapes. Some more serious than others.

All the code is open source and available on GitHub, so feel free to change and improve upon any sketch that interests you.

For any questions or inquiries, feel free to contact me. ~Kjetil
gallery  art  generative  interactive  images 
6 days ago by RBarnard
Hakim El Hattab
maker of good shit, including interactive generative art sketches
generative  design  webdesign  ui  ux  javascript  animation  art  webdev  portfolio 
7 days ago by inrgbwetrust