@inference
It is false, but it's not unreasonable. What are the chances of encountering a threat targeted at specific hardware in the wild? If someone has physical access to your device, you're fscked anyway. And that's just the security side; people's privacy is more often threatened by newer software, by stuff marketed as useful features that comes built right into their ROM.
I don't disagree with you, but claiming older devices are a privacy nightmare is a bit of a stretch too 🤷
@rqsd@borg.social
@inference
Not completely! Secure boot and chain-of-trust stuff has been around for decades, but we still have jailbroken iPhones and all that. Yeah, I know, verified boot is different, okay-okay 😅
And we're only talking about well-known exploits here; you can't prove there aren't any 0-day ones. There is no such thing as 100% secure, and with physical access the number of attack vectors is *always* higher. You just choose what security level is acceptable to you.
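The chain-of-trust idea mentioned above can be sketched in a few lines. This is purely my toy illustration, not how any real secure/verified boot implementation works: real ones use asymmetric signatures, rollback protection, etc., whereas this sketch uses bare SHA-256 digests, and the `build_chain`/`verify_boot` helpers are hypothetical names of mine. The point it shows: each stage's digest covers its code *and* the digest it pins for the next stage, so tampering with any stage breaks verification all the way back to the hardware root of trust.

```python
import hashlib

def stage_digest(image: bytes, next_digest: bytes) -> bytes:
    # A stage's identity covers both its code and the digest it pins
    # for the following stage, so neither can be swapped independently.
    return hashlib.sha256(image + next_digest).digest()

def build_chain(images):
    """Compute digests back-to-front; returns (root_digest, stages),
    where stages is a list of (image, pinned_next_digest) pairs."""
    next_d = b"\x00" * 32  # sentinel: the final stage pins nothing
    stages = []
    for image in reversed(images):
        stages.append((image, next_d))
        next_d = stage_digest(image, next_d)
    stages.reverse()
    return next_d, stages  # next_d is now the root-of-trust digest

def verify_boot(root_digest, stages):
    """Walk the chain the way a boot ROM would: refuse to run any
    stage whose digest doesn't match what the previous stage pinned."""
    expected = root_digest
    for image, next_d in stages:
        if stage_digest(image, next_d) != expected:
            return False  # tampered stage detected, halt boot
        expected = next_d
    return True

root, stages = build_chain([b"bootloader", b"kernel", b"system"])
print(verify_boot(root, stages))            # intact chain
stages[1] = (b"evil kernel", stages[1][1])  # tamper mid-chain
print(verify_boot(root, stages))            # chain now fails
```

With physical access an attacker can still attack the root digest's storage itself, which is why real designs burn it into fuses or ROM; that's the "always more attack vectors" point above.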
@rqsd@borg.social
@inference
> You're basically allowing everyone to pwn you for the entire time
Well, yeah! But if it is not a remote exploit, maybe it's an acceptable threat level for me? I don't want to get a new phone, but I consider physical access fatal so I don't have anything sensitive on my phone. You, being into infosec, have everything patched and up to date and may have more on your phone than me. Neither of us is crazy, let's not get dogmatic — that was my original point actually 😅
@rqsd@borg.social
@inference
> state that it *decreases* security and privacy when that's outright malicious to state
I think what they state is that if the firmware were open, everyone could audit it, and eventually it would get more secure. Making firmware closed and harder to access does make vulnerabilities harder to find, but it also makes them impossible for anyone other than the original developer to fix. The idea that making something open makes it more secure by itself is just a widespread misconception.
Proprietary firmware at least has a monetary incentive behind it, is often properly signed, etc. It also has paid developers working on it.
Open source is auditable, but that means nothing if the codebase is massive or nobody is actually auditing it. I know I don't bother if the code is more than something like 100-200 lines long.
If someone thinks I'm going to audit Clang or Linux just because I'm a security researcher, they have another thing coming.