@rqsd You can get a Pixel 6a at this point for extremely low cost, and these people often refuse not only because of money, but because they simply don't want to get another phone.

Whether it's cost-prohibitive or not, it's impossible to keep those devices secure and updated. That's just how it is. People are misled by many cultists into believing that the OS is the only thing you need to update, which is completely false.

@inference
It is false, but it's not unreasonable. What are the chances of encountering a threat targeted at specific hardware in the wild? If someone has physical access to your device, you're fscked anyway. And that's just security; their privacy is more often threatened by newer software, by things marketed as useful features that come built right into their ROM.
I don't disagree with you, but claiming older devices are a privacy nightmare is a bit of a stretch too 🤷

@m0xee @rqsd It's completely untrue that physical access is fatal, because verified boot exists, and Pixels take it further with an HSM.

Verified boot and a locked bootloader make it so that any tampering with the device will be detected and the user will be warned.
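The core idea behind verified boot can be illustrated with a toy hash chain: each boot stage only hands off to the next stage if its image matches a hash anchored in trusted (read-only) storage. This is a simplified sketch, not how any real bootloader is implemented; the stage names and images are made up for illustration.

```python
import hashlib

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def verify_chain(stages, trusted_hashes):
    """Return the first stage whose image doesn't match its trusted hash,
    or None if the whole chain verifies (toy model of verified boot)."""
    for name, image in stages:
        if sha256(image) != trusted_hashes[name]:
            return name  # tampering detected here; warn the user, halt boot
    return None

# Record the "trusted" hashes, then tamper with the system image.
stages = [("bootloader", b"bl-v1"), ("kernel", b"kernel-v1"), ("system", b"sys-v1")]
trusted = {name: sha256(img) for name, img in stages}
stages[2] = ("system", b"sys-evil")

assert verify_chain(stages, trusted) == "system"  # modification is detected
```

In a real device the root of trust (the first hash or public key) is burned into hardware, which is why a locked bootloader matters: with it unlocked, an attacker can simply replace the trusted values along with the images.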

@inference
Not completely! Secure boot and chain-of-trust stuff has been around for decades, but we still have jailbroken iPhones and all that. Yeah, I know, verified boot is different, okay-okay 😅
And we're only talking well-known exploits here; you can't prove there aren't any 0-day ones. There is no such thing as 100% secure, and with physical access the number of attack vectors is *always* higher. You just choose what security level is acceptable to you.

@m0xee @rqsd You can't know there are no zero-days, of course not, but using an EoL device means they can't be fixed at firmware level when they *are* discovered. You're basically allowing everyone to pwn you for the entire time you use that device from that point, and the OS can only do so much.

An example in any phone, regardless of OEM or OS, is the SoC TEE; an exploit there means apps could see other apps' keys, and even the system encryption keys. Only SoC firmware patches can fix that.
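Why a TEE compromise is so severe can be shown with a toy model: apps never hold raw key material, they only get handles, and the "secure world" performs crypto on their behalf. This is a hypothetical sketch of the trust boundary, not a real TEE or Android Keystore API.

```python
import hashlib
import hmac
import secrets

class ToyTEE:
    """Toy model of a TEE: key material lives only inside this object
    (the 'secure world'); apps outside only ever see opaque handles."""

    def __init__(self):
        self._keys = {}  # per-app key material, never exported

    def generate_key(self, app_id: str) -> str:
        self._keys[app_id] = secrets.token_bytes(32)
        return app_id  # the app receives a handle, not the key bytes

    def sign(self, handle: str, message: bytes) -> bytes:
        # Crypto happens inside the boundary; only the result leaves.
        return hmac.new(self._keys[handle], message, hashlib.sha256).digest()

tee = ToyTEE()
handle = tee.generate_key("app-a")
sig = tee.sign(handle, b"hello")

# A TEE exploit is the equivalent of reading tee._keys directly: it bypasses
# the handle boundary and exposes every app's key at once, and no OS update
# can repair that, since the boundary itself lives in SoC firmware.
```

The point of the sketch is that the OS and apps are *clients* of this boundary; if the boundary's firmware is exploitable and end-of-life, nothing running above it can restore the guarantee.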

@inference
> You're basically allowing everyone to pwn you for the entire time
Well, yeah! But if it is not a remote exploit, maybe it's an acceptable threat level for me? I don't want to get a new phone, but I consider physical access fatal so I don't have anything sensitive on my phone. You, being into infosec, have everything patched and up to date and may have more on your phone than me. Neither of us is crazy, let's not get dogmatic — that was my original point actually 😅

@m0xee @rqsd If you know what you're doing, you're probably fine. The issue I have is when normies, or even techies, are misled into believing that devices don't require proprietary firmware and code (which is what LineageOS and others do), or, even worse, when people state that it *decreases* security and privacy; that's outright malicious to state without an atom of evidence, and it's what the FSF and GNU do.

I don't lie about how things *actually* work, nor do I use ideology over fact, logic, and evidence. I will never join a cult like the FOSS cult.

@inference
> state that it *decreases* security and privacy when that's outright malicious to state
I think what they state is that if firmware were open, everyone could audit it, and eventually it would get more secure. Making firmware closed and harder to access does make vulnerabilities harder to find, but it also makes them impossible to fix by anyone other than the original developer. The idea that making something open makes it more secure by itself is just a widespread misconception.

@m0xee Finally, a reasonable response to source availability instead of the cultist response I typically receive from people on here.

Proprietary software at least has a monetary incentive behind it and is often properly signed, etc. It also has paid developers working on it.

Open source is auditable, but that means nothing if the codebase is massive or people aren't actually auditing it. I know I don't bother if the code is more than something like 100-200 lines long.

If someone thinks I'm going to audit Clang or Linux just because I'm a security researcher, they have another thing coming.