pubsub.slavino.sk / hackerfactor · Monday, 10 April, 2023 - 15:02
- Pornography is legal (1st Amendment, free speech). However, child pornography is illegal (18 U.S.C. 2251, 2252, 2258, etc.). Sites that permit porn have a bigger problem with child porn. By forbidding all porn, I don't have to report as much child porn. In addition, I don't want my administrators to have to spend any time looking at the picture in order to determine if it is a young adult or a child.
- Public sites that permit identity documents (passports, drivers licenses, etc.) end up being used for fraud. There are also huge issues related to everything from harassment and identity theft to reporting responsibilities in case of a server compromise. To mitigate these issues, we block as many of these documents as possible. By blocking as fast as possible, we actively discourage this kind of use.
- Public sites that are widely used for illegal activities (drug distribution, identity theft, human trafficking, child porn, etc.) are often raided by law enforcement. Even if the owners were not directly involved in the crime, they may be charged with helping facilitate these transactions. Thus, if we see it on the public FotoForensics service, we block it.
Hosting Sensitive Documents
Since the first day that the public site went live, the FAQ has warned against uploading any kind of sensitive images. For years, we've been actively banning people who upload drivers licenses, travel documents, bank statements, utility bills, and passport photos. However, after 11 years, we've had our first run-in with the widespread distribution of sensitive stolen documents.
Let me be blunt: FotoForensics is not Wikileaks. Wikileaks actively encourages people to upload stolen and highly sensitive documents for public distribution. In contrast, the public FotoForensics service does not solicit and does not want this type of content. Do not use the public FotoForensics service to evaluate or distribute known-stolen documents. Right now, I'm manually blocking access to a set of widely distributed documents that were clearly stolen. When I get a little more time, I'll automate this banning process.
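One way such banning is often automated is with an exact-match blocklist keyed on cryptographic digests. This is only a minimal sketch under my own assumptions (the function names and the digest set are hypothetical, not FotoForensics' actual code):

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests for known-banned documents.
# In a real deployment this would be persisted and populated by an admin.
BANNED_HASHES = set()

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of an uploaded file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def ban(data: bytes) -> None:
    """Record a document's digest so future identical uploads are rejected."""
    BANNED_HASHES.add(sha256_of(data))

def is_banned(data: bytes) -> bool:
    """True if this exact file has been banned before."""
    return sha256_of(data) in BANNED_HASHES
```

Note that an exact hash only catches byte-identical re-uploads; a resized or re-encoded copy changes every byte, so catching variants requires perceptual matching instead.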
What kind of documents triggered this most recent policy enforcement? As reported by NPR, "Top-secret Pentagon documents on Ukraine war appear on social media". The Pentagon has reported that these documents were stolen and leaked.
Why are people uploading them to FotoForensics? It's the "Streisand Effect". Specifically, the New York Times reported, "Military analysts said the documents appear to have been modified in certain parts." They used those magic words: "appear to have been modified." As soon as someone says a picture may be altered, people grab copies of the pictures and upload them to FotoForensics in order to check it themselves.
Again, let me reiterate: FotoForensics is not Wikileaks. Do not upload or distribute known-stolen documents via my public service. I don't care if some news outlets are showing the images. I'm already spending way too much time blocking porn and fraudulent documents. I don't want to deal with the legal headaches related to hosting potentially stolen content. Moreover, I don't want the reputation of running a site that permits hosting stolen content. There are just too many liability issues in this legal minefield.
Variants and Sources
I'm not going to link to the pictures or even discuss the content in the photos. (There are plenty of news outlets who are doing that.) Rather, I'm looking at the viral spread so that I can have a detector and solution in place the next time this happens. (Hopefully this will never happen again, but truthfully, I'm surprised it took 11 years.)
With these government images, the pictures went viral extremely fast. We've received over 300 unique copies of the documents in 3 days. Nearly all are variants of four base photos. They vary by dimensions (scaling), visual region (cropping), re-encoding (jpeg to jpeg), transcoding (png to jpeg to webp, etc.), augmenting with annotations, etc. I've seen variants of the images uploaded from 4chan, Twitter, Discord, Imgur, Reddit, the Washington Post, some Taiwanese news outlet, and more. It's everywhere. (And I'm doing my best to remove it from my servers.) If you do find some place hosting the pictures, be aware that the images are far from the camera originals. I doubt you'll find a version that is high enough quality to evaluate or that contains the original metadata.
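Grouping those scaled and re-encoded copies under one "base photo" is a perceptual-matching problem. A common technique is an average hash: downscale the image, threshold each pixel against the mean brightness, and compare the resulting bit fingerprints by Hamming distance. The sketch below is purely illustrative of that technique, not the site's actual detector:

```python
def average_hash(pixels):
    """Compute a simple average hash from an 8x8 grayscale block.

    pixels: an 8x8 list of brightness values (0-255), assumed already
    downscaled; a real system resizes the full image to 8x8 first.
    Returns a 64-bit integer fingerprint.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # One bit per pixel: is it brighter than the average?
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same base image."""
    return bin(h1 ^ h2).count("1")
```

Scaled or lightly re-encoded copies tend to land within a few bits of each other, while heavily cropped copies usually defeat a whole-image hash and need something more robust (e.g., hashing overlapping regions).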
The news first reported this information leak on 2023-04-06. However, the earliest upload to FotoForensics came from Sweden a day earlier: 2023-04-05. This doesn't mean that it originated there; only that a person using an IP address in Sweden had come across the pictures. Shortly after that, it was uploaded from a link at Discord, then a 4chan-like site, and then 4chan. It took a day to really go viral; that's when it spread to Twitter. 4chan and Twitter have been the two primary distribution channels for these images. (I suspect that the pre-Musk Twitter would have cracked down on the images, but Musk's Twitter does nothing to proactively deter this distribution.) From there, it spread to other 4chan-like sites as well as Telegram, Imgur, and a variety of news outlets (2023-04-06 and 2023-04-07). The uploads initially came from Northern and Eastern Europe, but quickly spread to the rest of Europe, the Middle East, and Asia. North America didn't become active until hours later (when Twitter began distributing the images).
Besides recording the IP address that identifies who did the upload to FotoForensics, we also record the timestamps from any web-based submissions. If you supply a URL to a picture, that web server does not just return a picture. It also returns the picture's timestamp on the web server. This often denotes when the file was uploaded to the server or became accessible from the server. Most of these pictures have web server timestamps that match the viral distribution (2023-04-05 or later, often minutes or hours before being submitted to FotoForensics). However, one URL at Discord has a web server timestamp that is nearly a month earlier: 2023-03-06 23:55:25 GMT. This means:
- The pictures were stolen, leaked, and quietly distributed via Discord at least a month before they went viral.
- Discord has many channels where people discuss topics. There's a user in Europe who knows how to find this older Discord channel. (I know he's in Europe because of his IP address, and he has to know the Discord channel since he knew the URL to provide to FotoForensics.)
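For web submissions, the timestamp a server returns for a file typically arrives in the HTTP Last-Modified response header. A minimal sketch of reading it (the function names are mine, and this is only one plausible way to record such timestamps, not necessarily how FotoForensics does it):

```python
from email.utils import parsedate_to_datetime
from urllib.request import urlopen

def parse_last_modified(header_value):
    """Convert an HTTP Last-Modified header (RFC 7231 date format,
    e.g. 'Mon, 06 Mar 2023 23:55:25 GMT') into a datetime object."""
    return parsedate_to_datetime(header_value)

def fetch_timestamp(url):
    """Fetch a URL and return its Last-Modified timestamp, if the
    server sent one; many servers omit it for dynamic content."""
    with urlopen(url) as resp:
        value = resp.headers.get("Last-Modified")
        return parse_last_modified(value) if value else None
```

This header often reflects when the file landed on the server, which is what makes a value a month older than the viral spread so interesting.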
Unusual Trends
I've previously written about my trend detector. It looks for widespread variants of the same pictures and where they are being uploaded from. Interestingly, these sensitive documents are not spreading like your typical viral pictures. For example:
- Typical viral images start at publicly accessible sites: media outlets, meme sites, 4chan, Twitter, Facebook, Instagram, etc. From there, copies of the pictures spread to other social media services. That's how the various floods at FotoForensics featuring One Direction, Taylor Swift, BTS, "Amazing Fried Rice Wave", Kim Jong Un, etc. have started. They rarely start at Discord. And if you think about it, this makes sense: Discord channels are closed forums. The public can't access the content, so the content rarely spreads virally.
- When pictures go viral, they usually have a single identifiable starting point and then blossom to other social media outlets. (Eleanor Calder's viral dog photo started on her Instagram feed. The flood of face photos coincided with the release of "This Person Does Not Exist".) The total time from the initial seed to the viral dissemination may be measured in hours or days (or maybe a week), but not much longer. It's very uncommon for a picture to exist on one social media service for a long time before suddenly going viral. However, the earliest timestamp I've seen with these pictures shows that they were on Discord for a month before being distributed.
- Facebook and Instagram usually play a big role in viral disseminations. However, I've only seen one of these sensitive pictures uploaded from Facebook, and that happened 3 days after the images went viral (2023-04-08). Either Meta (Facebook's parent company) is cracking down on the images or it's just not there.
- Unlike typical viral content, these sensitive pictures initially spread through the troll networks (4chan, kohlchan, endchan, etc.) and Twitter. There was not much delay between these sightings. It's almost as if one person said "Now!" and then the postings happened in a coordinated fashion.
- The countries uploading to FotoForensics are also not distributed like normal viral imagery. There initially were a lot of uploads from Sweden, Finland, and the Netherlands. (The Netherlands is a founding NATO member. Sweden is trying to join NATO, and Finland joined NATO the day before the viral spread.) The uploads also quickly appeared in other pro-Ukraine countries, including Poland and Germany. Then it spread through the Middle East and China. The widespread viral access from the UK and US only happened after the pictures showed up on Twitter and in the media. As I mentioned, the dissemination wave went from Europe to the Middle East and Asia before hitting the United States. Typical viral pictures go country-to-country with a timeline that follows the sunrise. That didn't happen here.
One More Thing...
When a high-volume trend hits FotoForensics, there are usually 3 separate waves. First come the visual copies (variants) of the pictures. Just as the first wave begins to ebb, the porn people start showing up. These are people who learned about FotoForensics from the previous viral news and then decided to upload prohibited pictures. The third and smaller wave has the child porn people who follow the porn people (who followed the first viral wave). These waves are each 2-3 days apart and are so predictable that I know when a wave of porn uploads will be coming. (This is why I built the trend detector.)
Oddly, the viral distribution of these sensitive photos coincided with an unexpected rise in pornography. There was no delay between these two waves and they followed the exact same volume curves. I can't help but wonder if sites that permit stolen and sensitive documents also have a larger problem with pornography. I tried to search for any public stats related to Wikileaks and pornography, but the Google results almost entirely referenced child pornography and related associations with Wikileaks. Even more unusual: the expected third wave of child porn never appeared at FotoForensics (whew).
In any case, this association between sensitive photos and prohibited (pornographic) imagery is yet another reason why I don't want that content on my servers. Here's to hoping that I don't have this problem again for another 11 years.
Tags: #Politics, #Forensics, #Network, #FotoForensics