News
Child sex abuse images found in dataset training image generators, report says. Stable Diffusion 1.5 reportedly “tainted” by more than 1,000 child abuse images.
A new study finds that many popular image datasets used to train AI models are contaminated with test images or ...
This dataset, LAION-400M, contains 413M image-text pairs and has subsequently been used "in many papers and experiments." The new dataset, LAION-5B, was collected using a three-stage pipeline.
They described the dataset in “A crowdsourced dataset of aerial images with annotated solar photovoltaic arrays and installation metadata,” which was recently published in Scientific Data.
Researchers have found child sexual abuse material in LAION-5B, an open-source artificial intelligence training dataset used to build image generation models. The discovery was made by the ...
Tech Xplore on MSN: New dataset for smarter 3D printing released
Oak Ridge National Laboratory's Peregrine software, used to monitor and analyze parts created through powder bed additive ...
LAION has taken down its machine learning dataset, which Google uses, after researchers found it contained child sexual abuse material.
With the open dataset, Getty Images wants to address enterprises’ ML training woes and position itself as a credible data partner.
Paper: Stable Diffusion “memorizes” some images, sparking privacy concerns. But out of 300,000 high-probability images tested, researchers found a 0.03% memorization rate.