This new data-poisoning tool lets artists fight back against generative AI.

  • Additional Information
    • Abstract:
Nightshade is a new tool that lets artists add invisible changes to their artwork before uploading it online. These changes poison the training data used by AI models, causing models trained on it to produce unpredictable and chaotic outputs. The tool is intended to empower artists and to protect their copyrights and intellectual property from being used by AI companies without permission. Nightshade exploits a vulnerability in generative AI models and can manipulate them into learning incorrect associations between concepts (a minimal illustrative sketch of the perturbation idea follows below). While there is a risk of misuse, causing significant damage to powerful AI models requires a large number of poisoned samples. [Extracted from the article]
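The abstract describes the mechanism only at a high level: imperceptible per-pixel changes that later act as poisoned training samples. The sketch below is a hypothetical illustration of just the "invisible change" step, not Nightshade's actual algorithm; the file names, the EPSILON bound, and the random-noise perturbation are all assumptions made for illustration.

```python
# Hypothetical sketch: apply a small, bounded perturbation to an image so the
# edited copy stays visually indistinguishable from the original.
# This is NOT Nightshade's method; it only shows the bounded-perturbation idea.
import numpy as np
from PIL import Image

EPSILON = 4  # assumed max per-pixel change (out of 255), small enough to be imperceptible

def perturb(path_in: str, path_out: str, seed: int = 0) -> None:
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    rng = np.random.default_rng(seed)
    # Placeholder perturbation: random noise clipped to +/- EPSILON. A real
    # poisoning attack would optimize this perturbation against a target
    # concept instead of using noise.
    delta = rng.integers(-EPSILON, EPSILON + 1, size=img.shape, dtype=np.int16)
    poisoned = np.clip(img + delta, 0, 255).astype(np.uint8)
    Image.fromarray(poisoned).save(path_out)

perturb("artwork.png", "artwork_shaded.png")  # hypothetical file names
```

A real attack of the kind the abstract describes would replace the random noise with a perturbation optimized against the target model, which is what steers the model toward learning incorrect associations between concepts.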