Reddit CEO Steve Huffman is standing by Reddit’s decision to block companies from scraping the site without an AI agreement.

Last week, 404 Media noticed that search engines other than Google were no longer listing recent Reddit posts in results. This was because Reddit updated its Robots Exclusion Protocol file (robots.txt) to block bots from scraping the site. The file reads: “Reddit believes in an open Internet, but not the misuse of public content.” Since the news broke, OpenAI announced SearchGPT, which can show recent Reddit results.

The change came a year after Reddit began its efforts to stop free scraping, which Huffman initially framed as an attempt to stop AI companies from making money off of Reddit content for free. This endeavor also led Reddit to begin charging for API access (the high pricing led to many third-party Reddit apps closing).

In an interview with The Verge today, Huffman stood by the changes that led to Google temporarily being the only search engine able to show recent discussions from Reddit. Reddit and Google signed an AI training deal in February said to be worth $60 million a year. It’s unclear how much Reddit’s OpenAI deal is worth.

Huffman said:

Without these agreements, we don’t have any say or knowledge of how our data is displayed and what it’s used for, which has put us in a position now of blocking folks who haven’t been willing to come to terms with how we’d like our data to be used or not used.

“[It’s been] a real pain in the ass to block these companies,” Huffman told The Verge.

  • BorgDrone@lemmy.one · 7 points · 3 months ago

    Reddit used to be open source. There is still a copy of that source available on GitHub. It’s 7 years old, so it’s probably significantly different from what they are running now. Still, it gives some insight into the design.

    For example, deleted comments aren’t deleted, it just sets a deleted flag. There’s example code that shows this.

    I haven’t dug around the code enough to figure out how editing works; it’s Python code, so an unreadable mess. The database design also seems very strange. It’s like they built a database system on top of a database.
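The soft-delete pattern described above can be sketched like this. This is a hypothetical illustration using Python’s sqlite3 module, not Reddit’s actual schema or code; the table and column names are made up:

```python
import sqlite3

# Hypothetical schema illustrating soft deletion: rows are never
# removed; "deleting" just flips a flag.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE comments (
        id INTEGER PRIMARY KEY,
        body TEXT NOT NULL,
        deleted INTEGER NOT NULL DEFAULT 0
    )
""")
conn.execute("INSERT INTO comments (body) VALUES ('hello world')")

# "Delete" the comment: the original body stays on disk.
conn.execute("UPDATE comments SET deleted = 1 WHERE id = 1")

# Normal reads filter the flag out, so users see nothing...
visible = conn.execute(
    "SELECT body FROM comments WHERE deleted = 0"
).fetchall()
print(visible)  # []

# ...but the raw row, body included, is still there.
raw = conn.execute("SELECT body, deleted FROM comments").fetchone()
print(raw)  # ('hello world', 1)
```

The upside of this design is that moderation, undo, and thread integrity are easy; the downside, as discussed in this thread, is that “deleted” content remains recoverable by whoever holds the database.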

    • ChickenLadyLovesLife@lemmy.world · 2 points · 3 months ago

      For example, deleted comments aren’t deleted, it just sets a deleted flag.

      FWIW even when you properly delete something from a database table, the deleted row can be reconstructed from the audit tables. And even if that weren’t the case, databases are regularly backed up to tape drives or whatever - when people delete or munge all their comments, Reddit doesn’t go back into all the backups and make the same changes there. In fact, I would imagine that when they sell their shit to companies for AI training, they sell old pre-AI backups rather than a latest copy.
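The audit-table behavior mentioned above can be sketched with a delete trigger. This is a generic hypothetical example in sqlite3, not any specific product’s audit mechanism:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE comments (id INTEGER PRIMARY KEY, body TEXT);
    CREATE TABLE comments_audit (id INTEGER, body TEXT, action TEXT);

    -- Copy every row into the audit table before it is deleted.
    CREATE TRIGGER audit_delete BEFORE DELETE ON comments
    BEGIN
        INSERT INTO comments_audit VALUES (OLD.id, OLD.body, 'DELETE');
    END;
""")
conn.execute("INSERT INTO comments VALUES (1, 'my hot take')")

# A "proper" hard delete removes the row from the main table...
conn.execute("DELETE FROM comments WHERE id = 1")
assert conn.execute("SELECT COUNT(*) FROM comments").fetchone()[0] == 0

# ...but the audit table still lets you reconstruct it.
restored = conn.execute(
    "SELECT id, body FROM comments_audit WHERE action = 'DELETE'"
).fetchone()
print(restored)  # (1, 'my hot take')
```

Even without such triggers, as noted above, periodic backups preserve old row contents independently of anything done to the live tables.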

    • finley@lemm.ee · 1 up / 3 down · edited · 3 months ago

      This is not evidence that overwritten or deleted comments could be restored to their original state. Moreover, that points to the original source code of Reddit, not the current code of Reddit.

      This is also not evidence that deleted or overwritten comments have been restored. This is merely evidence that, at one time, this is how deleted comments used to be handled.

      All this is evidence of is that, as you put it, things are very strange in the code.

      • BorgDrone@lemmy.one · 5 points · 3 months ago

        I never claimed it was evidence of how it currently works, only that it gives some insight into how Reddit was designed. I would be very surprised if they changed this aspect of the design. It makes sense to not delete comments or edits for reasons I mentioned before. Unfortunately we won’t know for sure unless Reddit confirms it.