Bleats

Spoiler: There Is No Incognito Mode For Your Saucy Video Viewing, Google And Facebook Know

Nothing pure can stay.

So, as a savvy internet user with a keen appreciation for the human form, you’ve been confidently swapping to incognito mode in Google Chrome whenever you feel the need to… um, explore your innermost imagination. Groinally, we mean.

He gets it.

And you’ve been quietly confident that your salty secrets have been safe, and you know what heartbreaking news is coming next.

Yes, it turns out that those companies which have lied to you about how secure your data is and whether or not they’re collecting and using it have been lying to you about the security of your data and collecting and using it. WHO COULD HAVE GUESSED???

A new study entitled Tracking sex: The implications of widespread sexual data leakage and tracking on porn websites is currently in pre-print, in which researchers from Microsoft, Carnegie Mellon, and the University of Pennsylvania looked at data tracking tools on porn sites and found there are loads of them, feeding data back to companies like Google, Facebook and Oracle.

And incognito mode does stop that data being stored on your computer, it’s true… but it doesn’t hide you from these data trackers.

And because few porn sites are encrypted, other parties can presumably exploit that weak security to access the tracker data themselves.

And why is this an issue? Let’s look at the study’s abstract:

“We identify three core implications of the quantitative results: 1) the unique/elevated risks of porn data leakage versus other types of data, 2) the particular risks/impact for vulnerable populations, and 3) the complications of providing consent for porn site users and the need for affirmative consent in these online sexual interactions.”

In other words: people can be outed, this can be devastating for particular groups of people depending on where they live, and this is being done without clear consent.

Honestly, people. VPNs. Get onto it.

Instagram's Latest Update Will Force You To Check Yourself Before You Wreck Yourself

It's like a best friend saying "hey, how about you *don't* post that?"

Cyberbullying is one of those wicked problems which have baffled legislators, parents and people who know nothing about the internet but have opinions they like to express regardless.

How can we restrict bullying while not accidentally blocking legitimate speech? How can an algorithm tell a joke from a threat? Is it better just to burn down the internet and return to the sea? No-one seems to know, beyond that Something Simply Must Be Done.

And now Instagram – aka The Legacy Social Media Platform Young People Actually Still Use – are taking active steps to make things less fraught with two changes which deserve widespread and sustained applause: one using AI, and one in the hands of users.

First up, there will shortly be a new option: restrict.

It’s a softer alternative for young people who are worried that blocking, unfriending or reporting bullies is a great way to make things escalate IRL.

Messages from restricted people will be moved to Message Requests, meaning that people can more easily ignore them, and restricted people won’t see whether you’ve read their messages or interacted with their posts, nor will they be alerted when you’re on the platform. Also, you’ll be able to delete comments and messages without having to read them.

And that’s great, but there’s also the AI-powered Comment Warning, which gives people a chance to reflect on whether the thing they’re about to post is actually worth posting – and it’s already been rolled out, so hopefully you haven’t tripped it in a flamewar already.

These are small changes with potentially huge positive effects, and they seem like something Facebook (who own Insta) should be adding to their platform asap.

Twitter, obviously, doesn’t need it since it’s already a textbook example of genteel good manners.

Twitter, today.

Deep Fakes Is The First Adult Tech With The Power To Kill Us All

You'll never hear the term "streaming video" in the same way again.

If you want to know the future of technology, then you need only look at what’s happening in pornography.

You know what standardised Super 8 as a format in the 1960s? Porn. You know what killed Beta as a video format? The greater availability of porn on VHS. 1-800 numbers, CD-ROM, pay TV, video on demand: they all started with some engineer with a dream, which involved seeing someone else with no clothes on.

Not necessarily this specific someone else.

And when the internet appeared a lot of the things you take for granted – secure online payments, encryption, video formats – were all pioneered by the industrious smut-bees of the porn biz.

Even the weird-ass “pregnancy belt” used by particularly empathetic fathers-to-be to experience the feelings of carrying a child is only possible because of the leaps in haptic engineering that the porn industry spearheaded for masturbatory tech – what is rather brilliantly called “teledildonics”. Streaming video, appropriately, started with porn.

Anyway: this brings us to Deepfakes, the technology which is about to end the world.

Well, at least it’s still creepy.

Deepfakes began, once again, from the simple dream of wanting to see people with their clothes off. Specifically, famous people.

The term was coined in 2017 – blending the machine-learning term “deep learning” with “fake” – but the exercise has existed ever since horny people realised they could use graphics programs to cut and paste a famous person’s face onto a digital centrefold.

And, like all technology, it rapidly left the porn suburbs and moved to the cultural big city, where it’s used to do things like create a creepy Princess Leia for Rogue One or have 70-year-old Samuel L Jackson perform as his 45-year-old self in Captain Marvel.

But, of course, now that tech is out there, anyone can use it to make anyone appear to do or say anything in a way that’s pretty convincing.

So how long until it’s used to have a world leader declare war, or a political opponent announce they’re a pedophile? It’s vaguely amazing it hasn’t happened yet.

At the time of writing, at least.

But it’s not only what deepfakes can create that’s of concern: the tech now gives a perfect rationale for denying the reality of anything caught on film.

That’s reasonable if it’s, say, Emma Watson denying a sex tape – but less so if it’s Putin denying war crimes, or Trump rejecting video of alleged urine-related events in Russian hotel rooms. When anything could be faked, nothing is reliably real.

And god, how embarrassing will it be if humans end up rendered extinct from a war started by a bad video of Chinese leader Xi Jinping grinding down on Vice President Mike Pence?

When we get to the afterlife, we’re going to look like such idiots.
