iOS 15

Facebook already know all about you even without an account

1 Like

Not with the blocking that I do, they don’t :wink:

2 Likes

I completely share your privacy concerns with WhatsApp - but you don’t need a Facebook account to use it.

1 Like

There are a lot of ways to prevent this if you want to, though. It’s just that most people don’t bother.

1 Like

I think if I was starting my internet life now, maybe I would be more careful. But I feel like that door has long since slammed shut!

3 Likes

Why? I don’t really understand how people can be so offended or triggered by a company. If a friend was not inclined to use Microsoft, would you feel rude setting up a Skype call?

If someone sent me a Google Hangouts request and it could be used through my iPhone, I’d just… use it. It’s a service, a means to an end.

I don’t get anyone who would feel it rude to be asked to use a service to contact them. Just say “I don’t have that, sorry - anything else we can use?”, or just download it and use it.

I’m sure I’m in the minority, but Google, Apple, Microsoft… it’s just a company. Use it or don’t.

4 Likes

Well, seems pretty obvious to me that if my friends are ‘inclined against’ Apple products, I wouldn’t want to send them a FaceTime request.

It’s just your number - no Facebook account required.

1 Like

Since there tend to be really good privacy-focussed discussions on this forum, and I particularly enjoy reading @N26throwaway’s takes on privacy, I am curious what people on here think of Apple’s recently announced child protection changes coming in iOS 15.

The internet seems to be up in arms about it, but Apple seem to be doubling down already.

I wouldn’t care too much, but as I admittedly don’t understand how hashing and AI stuff works, I’m not sure how valid the internet’s concerns about false positives are. Also, whether the scope could be expanded later on, as the internet says it could.

To be blunt: Apple are wrong on this one. It’s torn a rift internally, too.

Some of their actions lately seem to be at odds with their core values, as much as they’re trying to spin them as a good thing. I don’t like it, I can’t believe Tim Cook supports it, and I hope it kicks up enough attention that he’s made aware, becomes informed, and fires the guy responsible.

The fact they had the audacity to dismiss anyone with a cautious viewpoint as not understanding is downright tone deaf and insulting.

Think back to the FBI wanting to backdoor the iPhone so they could access content on a terrorist’s phone. Tim Cook called that the software equivalent of cancer. When it came to terrorism, privacy was worth protecting. When it comes to child pornography, it isn’t? This sets a precedent in my view, one that governments will now want to exploit. How can one reasonably justify not building a backdoor to prevent terrorism, when you can scan everyone’s phone to hopefully catch child pornography?

Tim Cook called a backdoor the software equivalent of cancer. This is stage 0 of said cancer.

I think there are some good things here though, like the reporting tools, and censoring potentially harmful content before it’s viewed. But the line is crossed when they scan the hashes of every photo stored on your device. This is also one of those solutions that only really affects law-abiding citizens; it eats away at our privacy. Paedophiles will find a way to circumvent it, and will want to, to protect themselves.

The issue isn’t what they’re doing with the AI CSAM stuff. It’s the precedent it sets for potentially more worrisome and invasive things later. I trust Apple not to want to go further, but I don’t trust governments not to compel them to do so on the basis of “well, you’ve done it for this, so you can do it for that”.
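For anyone wondering what “scanning the hashes” actually means in practice, here’s a rough sketch of the general idea. To be clear, this is not Apple’s real NeuralHash pipeline (that derives the hash with a neural network, and layers a blinded database and threshold secret sharing on top); the type, the database, and the threshold below are just my illustrative assumptions. The key property is that these are perceptual hashes, designed so that visually similar images produce similar fingerprints:

```swift
// A minimal sketch of perceptual-hash matching, assuming a 64-bit
// fingerprint. NOT Apple's actual NeuralHash code; the names, the
// database contents, and the threshold are illustrative assumptions.

typealias PerceptualHash = UInt64

// Hypothetical database of fingerprints of known images. In Apple's
// real design this arrives on the device in blinded form.
let knownHashes: Set<PerceptualHash> = [0x0123_4567_89AB_CDEF]

// Hamming distance: how many bits differ between two fingerprints.
func hammingDistance(_ a: PerceptualHash, _ b: PerceptualHash) -> Int {
    (a ^ b).nonzeroBitCount
}

// A photo "matches" if its fingerprint is within a small distance of
// any known hash. The fuzziness is deliberate, so re-encoded or
// resized copies still match - and it's also why false positives are
// possible at all, unlike an exact cryptographic hash comparison.
func matchesKnownImage(_ photoHash: PerceptualHash, threshold: Int = 4) -> Bool {
    knownHashes.contains { hammingDistance(photoHash, $0) <= threshold }
}
```

So whether an innocent photo ever gets flagged comes down to how well the hash function separates genuinely different images, and where that threshold sits - which is exactly the part none of us can audit.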

4 Likes

Massively agree with this - it sums up very succinctly why I’m against it.

And as you also say, this is not going to have any strong effect, as anyone with CSA images will just take one of many possible measures to get around it…

Very fucked precedent.

3 Likes

It’s clearly a response to the US and UK governments’ “If big tech doesn’t do something then we’ll force it on them” - i.e. they’ve jumped before being pushed. The fact that it also ensures that CSAM is kept off their servers is a bonus…

Offenders will just find somewhere else to store their stash, or keep it on the iPhone without uploading to iCloud.

1 Like

This is the issue. Right now it’s CSAM, but soon enough Russia and China will request it for something far more grey. And Apple will have no choice but to comply.

With encryption, Apple fell back on the argument that they physically couldn’t access the data. In this instance they’ve built the keys.

It’s a virtual slippery slope.

3 Likes

That won’t work. If you upgrade to iOS 15, you consent to Apple using on-device scanning of your photo library. Matches will then be sent off somewhere for investigation regardless of whether you use iCloud or not.

If this was purely an iCloud thing, I doubt the response would be as negative as it is.

Edit: my understanding is wrong here, and you’re correct. The scanning is done on device, but it only scans your on-device photos if you use iCloud. They clarified this yesterday, and I missed it until just now.

But yes, if it’s already that easy to circumvent, it’s no solution, just a weakening of privacy.

3 Likes

My understanding was that fingerprinting only occurs if you’re using iCloud Photo Library. If you’re not, there’s no issue.

1 Like

The problem is not just iCloud. Apple recently confirmed that they might (read: will) open this up to other, third-party apps too. So it’s iCloud today, but all your apps tomorrow. There’s nothing stopping Apple from making opting in an App Store approval criterion.

And as @ravipatel says, they could easily expand the scope to other, politically sensitive material in countries where they have a big user base and strong pressure from the government.

The intention is good, but this is bad, big time.

2 Likes

I agree it’s a real shame. But I’d also point out that if you are doing something that requires extreme sensitivity, you’ve probably already stopped uploading anything to iCloud or any other cloud service as part of your threat model. If you haven’t, maybe this will force people to reevaluate. It’s already the case that iCloud backups are a noted weak spot in iPhone security and a common place for information to be extracted.

This could affect the average user, though, and allow for mass surveillance. I guess that’s the worry here.

1 Like

Yeah I’m very much not coming at this from a position of having secret information that’s gonna get me killed or kidnapped if the government finds out - but more from a “here we go with even further mass surveillance than before”.

3 Likes

Indeed. There will always be a ‘justification’ for mass surveillance - terrorism, CSAM, ‘security’, etc. - but at the end of the day it never ends up being used only for the reason that justified it.

I think the examples of China are slightly spurious though, because they can already see everything their citizens are doing online. I’m more worried about even further encroachment in the West, especially in the U.K., where surveillance is already absurd.

2 Likes