Let's say OP takes a very different turn with their software that I am not comfortable with - say, reporting my usage data to a different country. I should be able to say "fuck that upgrade, I'm going to run the software that was on my phone when I originally bought it."
This change blocks that action, and from my understanding if I try to do it, it bricks my phone.
I don't understand what business incentive they would have to make "reduce global demand for stolen phones" a goal worth investing in.
We can't have nice things because bad people abused it. :(
Realistically, we're moving to a model where you'll have to have a locked down iPhone or Android device to act as a trusted device to access anything that needs security (like banking), and then a second device if you want to play.
The really evil part is things that don't need security (like say, reading a website without a log in - just establishing a TLS session) might go away for untrusted devices as well.
You've fallen for their propaganda. It's a bit off topic from the OnePlus headline, but as far as bootloaders go, we can't have nice things because the vendors and app developers want control over end users. The Android security model is explicit that the user, vendor, and app developer are each party to the process and can veto anything. That's fundamentally incompatible with my worldview, and I explicitly think it should be legislated out of existence.
The user is the only legitimate party to what happens on a privately owned device. App developers are to be viewed as potential adversaries that might attempt to take advantage of you. To the extent that you are forced to trust the vendor they have the equivalent of a fiduciary duty to you - they are ethically bound to see your best interests carried out to the best of their ability.
The model that makes sense to me personally is that private companies should be legislated to be absolutely clear about what they are selling you. If a company wants to make a locked down device, that should be their right. If you don't want to buy it, that's your absolute right too.
As a consumer, you should be given the information you need to make the choices that are aligned with your values.
If a company says "I'm selling you a device you can root", and people buy the device because that's advertised, they should be on the hook to uphold that promise. The nasty thing in this thread is the potential rug pull by OnePlus, especially as they have kind of marketed themselves as the alternative to companies that lock their devices down.
I think it would be far simpler and more effective to outlaw vendor controlled devices. Note that wouldn't prevent the existence of some sort of opt-in key escrow service where users voluntarily turn over control of the root of trust to a third party (possibly the vendor themselves).
You can already basically do this on Google Pixel devices today. Flash a custom ROM, relock the bootloader, and disable bootloader unlocking in settings. Control of the device is then held by whoever controls the keys at the root of the flashed ROM with the caveat that if you can log in to the phone you can re-enable bootloader unlocking.
With virtualization this could be done with the same device. The play VM can be properly isolated from the secure one.
It's funny, GP framed it as "work" vs "play" but for me it's "untrusted software that spies on me that I'm forced to use" vs "software stack that I mostly trust (except the firmware) but BigCorp doesn't approve of".
Can you explain it in simpler terms such that an idiot like me can understand? Like what would an alternative OS have to do to be compatible with the "current eFuse states"?
> The anti-rollback mechanism uses Qfprom (Qualcomm Fuse Programmable Read-Only Memory), a region on Qualcomm processors containing one-time programmable electronic fuses.
What nice, thoughtful people, to build such a feature.
That’s why you sanction the hell out of CPUs like China’s Loongson or Russia’s Baikal: they’re harder to disable than programmatically “blowing a fuse”.
As a consumer you may not want trusted computing and may root/jailbreak everything, but building it is not inherently evil.
Because in the case of smartphones, there is realistically no other option.
> For example if they don't trust it, they may avoid logging in to their bank on it.
Except when the bank trusts the system that I don't (smartphone with Google Services or equivalent Apple junk installed), and doesn't trust the system that I do (desktop computer or degoogled smartphone), which is a very common scenario.
I recently moved to Apple devices because they use trusted computing differently; namely, to protect against platform abuse, but mostly not to protect corporate interests. They also publish detailed first-party documentation on how their platforms work and how certain features are implemented.
Apple jailbreaking has historically also had a better UX than Android rooting, because Apple platforms are more trusted than Android platforms, meaning that DRM protection, banking apps and such will often still work with a jailbroken iOS device, unlike most rooted Android devices. With that said though, I don't particularly expect to ever have a jailbroken iOS device again, unfortunately.
Apple implements many more protections than Android at the OS level to prevent abuse of trusted computing by third-party apps, and give the user control. (Though some Androids like, say, GrapheneOS, implement lots that Apple does not.)
But of course all this only matters if you trust Apple. I trust them less than I did, but to me they are still the most trustworthy.
What do you mean by this? On both Android and iOS app developers can have a backend that checks the status of app attestation.
Users don't have a choice, and they don't care. BitLocker is cracked by the feds; iOS and Android devices can get unlocked or hacked with commercially available grey-market exploits. Push Notifications are bugged, apparently. Your logic hinges on an idyllic philosophy that doesn't even exist in security-focused communities.
https://arstechnica.com/information-technology/2024/10/phone...
https://peabee.substack.com/p/everyone-knows-what-apps-you-u...
I just don't know enough about Apple because I haven't seriously used their devices for years.
The carriers in the US were caught selling e911 location data to pretty much whoever was willing to pay. Did that hurt them? Not as far as I can tell, largely because there is no alternative and (bizarrely) such behavior isn't considered by our current legislation to be a criminal act. Consumers are forced to accept that they are simply along for the ride.
People would stop taking photos with their camera that they didn't want to be public.
If Google did something egregious enough legislation might actually get passed because realistically, if public outcry doesn't convince them to change direction, what other option is available? At present it's that or switch to the only other major player in town.
...and not because, in truth, they don't care?
How would we even know if people distrusted a company like Microsoft or Meta? Both companies are so deeply entrenched that you can't avoid them no matter how you feel about their privacy stance. The same goes for Apple and Google: there is no "greener grass" alternative to protest the surveillance of Push Notifications or vulnerability to Pegasus malware.
- Persistent bootkits trivial to install
- No verified boot chain
- Firmware implants survived OS reinstalls
- No hardware-backed key storage
- Encryption keys extractable via JTAG/flash dump
Modern Secure Boot + hardware-backed keystore + eFuse anti-rollback eliminated entire attack classes. The median user's security posture improved by orders of magnitude.

It's not that trusted computing is inherently bad. I actually think it's a very good thing. The problem is that the manufacturer maintains control of the keys when they sell you a device.
Imagine selling someone a house that had smart locks but not turning over control of the locks to the new "owner". And every time the "owner" wants to add a new guest to the lock you insist on "reviewing" the guest before agreeing to add him. You insist that this is important for "security" because otherwise the "owner" might throw a party or invite a drug dealer over or something else you don't approve of. But don't worry, you are protecting the "owner" from malicious third parties hiding in plain sight. You run thorough background checks on all applicants after all!
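To make the lock analogy concrete in boot-chain terms, here's a minimal sketch of the kind of check a first-stage bootloader performs; the function names and layout are illustrative assumptions, not any vendor's actual code:

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define SHA256_LEN 32

/* Hypothetical hardware/crypto primitives. */
extern void read_fused_pubkey_hash(uint8_t out[SHA256_LEN]); /* burned at factory */
extern void sha256(const uint8_t *data, size_t len, uint8_t out[SHA256_LEN]);
extern bool rsa_verify(const uint8_t *pubkey, size_t pubkey_len,
                       const uint8_t *image, size_t image_len,
                       const uint8_t *sig, size_t sig_len);

bool verify_next_stage(const uint8_t *pubkey, size_t pubkey_len,
                       const uint8_t *image, size_t image_len,
                       const uint8_t *sig, size_t sig_len)
{
    uint8_t fused[SHA256_LEN], actual[SHA256_LEN];

    /* 1. The public key shipped alongside the image must hash to the
       value burned into one-time-programmable fuses. */
    read_fused_pubkey_hash(fused);
    sha256(pubkey, pubkey_len, actual);
    if (memcmp(fused, actual, SHA256_LEN) != 0)
        return false;

    /* 2. The image itself must carry a valid signature from that key. */
    return rsa_verify(pubkey, pubkey_len, image, image_len, sig, sig_len);
}
```

Note what the fused hash buys: whoever holds the matching private key decides what the device boots, and nothing in this flow consults the person who paid for the hardware.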
See also:
https://github.com/zenfyrdev/bootloader-unlock-wall-of-shame
We just had the Google sideloading article here.
All I'm saying is that we have to acknowledge that both are true. And, if both are true, we need to have a serious conversation about who gets to choose the core used in our front door locks.
The fact that it's locked down and remotely killable is a feature that people pay for and regulators enforce from their side too.
At the very best, the supplier plays nice and allows you to run your own applications, remove whatever crap they preinstalled, and change the font face. If you are really lucky, you can choose to run a practically useless Linux distribution instead of a practically useful one, with their blessing. Blessing is a transient thing that can be revoked at any time.
Why not?
Obviously we don't have that. But what stops an open firmware (or even open hardware) GSM modem being built?
https://hackaday.com/2022/07/12/open-firmware-for-pinephone-...
Governments can ban this feature and ban companies from selling devices that have it.
I’m sure the CIA was not founded after COVID :-)
Any kind of device-unique key is likely rooted in OTP (via a seed or PUF activation).
The root of all certificate chains is likely hashed in fuses to prevent swapping out cert chains with a flash programmer.
It's commonly used for anti-rollback as well - the biggest news here is that they didn't have this already.
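As a rough illustration of the first two points, here's a sketch of deriving a device-unique key from an OTP-rooted seed; the primitives are hypothetical stand-ins (real SoCs typically use a hardware KDF or a PUF activation sequence):

```c
#include <stddef.h>
#include <stdint.h>

extern void read_otp_seed(uint8_t out[32]);            /* per-device, fused at factory */
extern void hkdf_sha256(const uint8_t *seed, size_t seed_len,
                        const char *label,
                        uint8_t *out, size_t out_len); /* any KDF works here */

void derive_storage_key(uint8_t key[32])
{
    uint8_t seed[32];
    read_otp_seed(seed);
    /* Different labels yield independent keys from the same fused seed,
       so one OTP block can root several unrelated secrets. */
    hkdf_sha256(seed, sizeof seed, "storage-encryption-v1", key, 32);
}
```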
If there's some horrible security bug found in an old version of their software, they have no way to stop an attacker from loading up the broken firmware to exploit your device? That is not aligned with modern best practices for security.
You mean an attacker with physical access to the device, plugging in some USB or UART? Or a hacker that downgrades the firmware to a version with a known exploit so they can use it?
The evil of this type of attack is that the firmware with the exploit would be properly signed, so the firmware update system on the chip would install it (and encrypt it with the PUF-based key) unless you have anti-rollback.
Of course, with a skilled enough attacker, anything is possible.
... which describes US border controls or police in general. Once "law enforcement" becomes part of one's threat model, a lot of trade-offs suddenly have the entire balance changed.
Most SoCs of even moderate complexity have lots of redundancy built in for yield management (e.g., anything with RAM expects some percentage of the RAM cells to be dead on any given chip) and use fuses to keep track of that. If you had to have a strap per RAM block, it would not scale.
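A minimal sketch of what those yield-management fuses might look like to boot code, with entirely hypothetical names:

```c
#include <stdbool.h>
#include <stdint.h>

#define NUM_RAM_BANKS 64

/* A fused bitmap, written at factory test: bit i set => bank i is dead. */
extern uint64_t read_bad_bank_fuses(void);

/* Boot code consults the fuse map to build the usable-memory layout
   around the banks that failed on this particular die. */
bool bank_is_usable(unsigned bank)
{
    return bank < NUM_RAM_BANKS &&
           !(read_bad_bank_fuses() & (1ULL << bank));
}
```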
I assume that's also why China is investing so heavily into open source risc-v
Basically, this breaks any kind of FOSS support or repairability, creating dead hardware bricks if the vendor ceases to maintain the device or ceases to exist.
When you really need it, like to download maps into the satnav, you can connect it to your home WiFi, or tether via Bluetooth.
OnePlus and other Chinese brands were modder-friendly until they suddenly weren't. I wouldn't rely on your car not getting more hostile at a certain point.
Shhh. Nobody tell him where his phone, computer, and vast majority of everything else in his house was made.
My ownership is proved by my receipt from the store I bought it from.
This vandalization at scale is a CFAA violation. I'd also argue it is a fraudulent sale, since not all rights were transferred at sale and it was misrepresented as a sale instead of an indefinite rental.
And it's likely a RICO Act violation, since the C-levels and board of directors likely knew and/or ordered it.
And damn near everything's wire fraud.
But if anybody does manage to take them to court and win, what would we see? A $10 voucher for the next OnePlus phone? As if we'd buy another.
Fabricated or fake consent, or worse, forced automated updates, indicates that the company is the owner and exerting ownership-level control. Thus the sale was fraudulently conducted as a sale but is really an indefinite rental.
If I buy a used vehicle for example, I have exactly zero relationship with the manufacturer. I never agree to anything at all with them. I turn the car on and it goes. They do not have any authorization to touch anything.
We shouldn't confuse what's happening here. The engineers working on these systems that access people's computers without authorization should absolutely be in prison right alongside the executives that allowed or pushed for it. They know exactly what they're doing.
Generally speaking and most of the time, yes; however, there are a few caveats. The following uses common law – to narrow the scope of the discussion down.
As a matter of property, the second-hand purchaser owns the chattel. The manufacturer has no general residual right(s) to «touch» the car merely because it made it. Common law sets a high bar against unauthorised interference.
The manufacturer still owes duties to foreseeable users – a law-imposed duty relationship in tort (and often statute) concerning safety, defects, warnings, and misrepresentations. This is a unidirectional relationship – from the manufacturer to the car owner – and covers product safety, recalls, negligence (on the manufacturer's part) and the like, irrespective of whether it was a first- or second-hand purchase.
One caveat is that if the purchased second-hand car has the residual warranty period left, and the second-hand buyer desires that the warranty be transferred to them, a time-limited, owner-to-manufacturer relationship will exist. The buyer, of course, has no obligation to accept the warranty transfer, and they may choose to forgo the remaining warranty.
The second caveat is that manufacturers have tried (successfully or not – depends on the jurisdiction) to assert that the buyer (first- or second-hand) owns the hardware (the rust bucket), and users (the owners) receive a licence to use the software – and not infrequently with strings attached (conditions, restrictions, updates and account terms).
Under common law, however, even if a software licence exists, the manufacturer does not automatically get a free-standing right to remotely alter the vehicle whenever they wish. Any such right has to come from a valid contractual arrangement, a statutory power, or consent – privity still applies and requires consent – all of which weakens the manufacturer's legal standing.
Lastly, depending on the jurisdiction, the manufacturer can even be sued for installing an OTA update, on the basis of the car being a computer on wheels and the OTA update being an instance of unauthorised access to the computer and its data, which is oftentimes a criminal offence. This hinges on the fact that the second-hand buyer has not entered into a consensual relationship with the manufacturer after the purchase.
A bit of a lengthy write-up, but legal stuff is always a fuster cluck and a rabbit hole of nitpicking and nuances.
I still sometimes ponder whether the OnePlus green-line fiasco was a failed hardware-fuse type of thing that got accidentally triggered during a software update. (Insert "can't prove it" meme here.)
I have, however, experienced an ISP writing to say you have a faulty modem (some Huawei device) and asking you not to use it anymore.
Not surprisingly, stolen phones tend to end up in those locations.
The effect on the custom OS community has me worried (I am still rocking my OnePlus 7T with crDroid, and OnePlus used to be the most geek-friendly). Now I am wondering if there are other ways they could have achieved the same thing without blowing a fuse, or been more transparent about this.
Google pushed a non-downgradable final update to the Pixel 6a.
I was able to install Graphene on such a device. Lineage was advertised as completely incompatible, but some hinted it would work.
You absolutely do not; this is an extremely healthy starting position for evaluating a corporation's behavior. Any benefit you receive is incidental: if they made more money by worsening your experience, they would.
This makes sense and is much less dystopian than some of the other commenters are suggesting.
I don't believe for a second that this benefits phone owners in any way. A thief is not going to sit there and do research on your phone model before he steals it. He's going to steal whatever he can and then figure out what to do with it.
Thieves don't do that research on specific models. Manufacturers don't like it if their competitors' models are easy to hawk on grey markets, because that means their phones get stolen too.
Thieves these days seem to really be struggling to even use them for parts, since these are also largely Apple DRMed, and are often resorting to threatening the previous owner to remove the activation lock remotely.
Of course theft often isn't preceded by a diligent cost-benefit analysis, but once there's a critical mass of unusable – even for parts – stolen phones, I believe it can make a difference.
Android's normal bootloader unlock procedure allows for doing so, but ensures that the data partition (or the encryption keys for it) gets wiped, so that a border guard at the airport can't just Cellebrite the phone open.
Without downgrade protection, the low-level recovery protocol built into Qualcomm chips would permit the attacker to load an old, vulnerable version of the software, which has been properly signed and everything, and still exploit it. By preventing downgrades through eFuses, this avenue of attack can be prevented.
This does not actually prevent running custom ROMs, necessarily. This does prevent older custom ROMs. Custom ROMs developed with the new bootloader/firmware/etc should still boot fine.
This is why the linked article states:
> The community recommendation is that users who have updated should not flash any custom ROM until developers explicitly announce support for fused devices with the new firmware base.
Once ROM developers update their ROMs, the custom ROM situation should be fine again.
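To spell out the check described above: the dangerous old firmware is correctly signed, so the signature check alone passes; only the version comparison against the fuses stops the downgrade. A minimal sketch with illustrative names:

```c
#include <stdbool.h>
#include <stdint.h>

extern bool signature_valid(const void *image, uint32_t image_len);
extern uint32_t image_security_version(const void *image);
extern uint32_t fused_min_version(void); /* monotonic: fuses only burn upward */

bool allow_boot(const void *image, uint32_t image_len)
{
    /* Passes even for the old, vulnerable firmware: it is properly signed. */
    if (!signature_valid(image, image_len))
        return false;
    /* This is the line that actually defeats the downgrade attack. */
    return image_security_version(image) >= fused_min_version();
}
```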
Sophisticated actors (think state-level actors like a border agent who insists on taking your phone to a back room for "inspection" while you wait at customs) can and will develop specialized tooling to help them do this very quickly.
They don't want the hardware to be under your control. In the mind of tech executives, selling hardware does not make enough money, the user must stay captive to the stock OS where "software as a service" can be sold, and data about the user can be extracted.
Give ROM developers a few weeks and you can boot your favourite custom ROMs again.
To be fair, they are right: the vast majority of users don't give a damn. Unfortunately I do.
Specifically, GrapheneOS on Pixels signs its releases with its own keys, and implements rollback protection without blowing any fuses.
I know that all these restrictions might make sense for the average user who wants a secure phone.. but I want an insecure-but-fully-hackable one.
OnePlus just chose the hardware way, versus Apple the signature way
Whether for OnePlus or Apple, there should definitely be a way to let users sign and run the operating system of their choice, like any other software.
(still hating this iOS 26, and the fact that even after losing all my data and downgrading back to iOS 18, it refused to re-sync my Apple Watch until iOS 26 was installed again - shitty company policy)
There is a good reason to prevent downgrades -- older versions have CVEs and some are actually exploitable.
What exactly is it comparing? What is the “firmware embedded version number”? With an unlocked bootloader you can flash boot and super (system, vendor, etc) partitions, but I must be missing something because it seems like this would be bypassable.
It does say
> Custom ROMs package firmware components from the stock firmware they were built against. If a user's device has been updated to a fused firmware version & they flash a custom ROM built against older firmware, the anti-rollback mechanism triggers immediately.
and I know custom ROMs will often say “make sure you flash stock version x.y beforehand” to ensure you’re on the right firmware, but I’m not sure what partitions that actually refers to (and it’s not the same as vendor blobs), or how much work it is to either build a custom ROM against a newer firmware or patch the (hundreds of) vendor blobs.
The abl firmware contains an anti-rollback version that is checked against the eFuse version.
The super partition is a bunch of LVM-like logical partitions on top of a single physical partition. Of these, system is the main root filesystem, which is mounted read-only and protected with dm-verity device mapping. The root hash of this verity rootfs is also stored in the signed vbmeta.
Android Verified Boot also has an anti-rollback feature. The vbmeta partition is versioned, and the minimum version value is stored cryptographically in a special flash partition called the Replay Protected Memory Block (RPMB). This prevents rollback of boot and super, as vbmeta itself cannot be rolled back.
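Putting the two layers side by side as a rough sketch (names are illustrative, not the actual abl/AVB code); the practical difference is that the eFuse floor is one-way, while the RPMB floor lives in replay-protected but rewritable storage managed by trusted software:

```c
#include <stdbool.h>
#include <stdint.h>

/* Layer 1: SoC-level. abl carries a version; Qfprom fuses hold the floor. */
extern uint32_t abl_version(void);
extern uint32_t qfprom_min_abl_version(void);     /* one-time programmable */

/* Layer 2: AVB. vbmeta carries rollback indexes; RPMB holds the floor. */
extern uint64_t vbmeta_rollback_index(void);
extern uint64_t rpmb_stored_rollback_index(void); /* replay-protected storage */

bool rollback_checks_pass(void)
{
    if (abl_version() < qfprom_min_abl_version())
        return false;  /* fused floor: can never be lowered */
    if (vbmeta_rollback_index() < rpmb_stored_rollback_index())
        return false;  /* RPMB floor: protects boot/super via vbmeta */
    return true;
}
```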
This doesn't make sense unless the secondary boot stage is signed and there is a version somewhere in the signed metadata. The primary boot stage checks the signature, reads the version of the secondary stage, and loads it only if that version is not lower than what the write-once memory (fuse) requires.
If you can self-sign or disable signature checking, then you can run whatever boot stage you want, as long as its metadata satisfies the version requirement.
From the article:
> Any subsequent attempt to install older firmware results in a permanent "hard brick" - the device becomes unusable
This implies that not only does an older custom ROM not work, but neither does attempting to recover by installing a newer ROM.
Edit: It seems that this does apply to OxygenOS too: https://xdaforums.com/t/critical-warning-coloros-16-0-3-501-...
This is ultimately about making the device resistant to downgrade attacks. This is what discourages thieves from stealing your phone.
Not just "there should be some phone brands that cater to me", but "all phone brands, including the most mainstream, should cater to me, because everyone on earth cares more about 'owning their hardware' than evil maid attack prevention, Cellebrite government surveillance, theft deterrence, accessing their family photos if they forget their password, revocable code-signing with malware checks so they don't get RATs spying on their webcam, etc, and if they don't care about 'owning their hardware' more than that, they are wrong".
It is objectively extremist and fanatical.
As time goes on, the options available to those who require such sovereignty seem to be thinning, to the point that (at least absent significant disposable wealth) the remaining options will require lifestyle changes comparable to high-cost religious practices and social withdrawal, and likely without the legal protections afforded to those protected classes. Given big tech's general hostility to user agency, and its contempt for values that don't consent to being subservient to its influence peddling, an intense emotional reaction to the loss of already diminished traditional allies seems like something that would reasonably be viewed with compassion rather than hostility.
None of the situations you mentioned are realistic or even worth thinking about for the vast majority of the population. They're just an excuse to put even more control into the manufacturer's hands.
I don't care if they can downgrade the device, just that I boot into a secure verified environment, and my data is protected.
I also think thieves will just grab your phone regardless; they can still sell the phone for parts, or just sell it anyway as a scam, etc.
There's over a 10x difference in fence price between a locked and unlocked phone. That's a significant incentive/deterrent.
It has some increasing timer for auth, and if you try to factory reset it, it destroys all the data?
As I said, it's less important that the thief can boot a new OS; the security of my data is more important. How is that compromised?
It feels like a thief is just going to opportunistically grab a phone from you rather than analyse what device it is.
If so, was this 'fuse' pre-planned in the hardware? My understanding is that cell phones take 12 to 24 months from design to market. So the initial deployment of the model where this OS can trigger the 'fuse', minus one year, is how far back the company decided to be ready to do this?
You can still change the IMEI on many phones if you know how to.
https://service.oneplus.com/us/search/search-detail?id=op588
They make it clear that this feature is unsupported and that it's possible to mess things up. The reason why it's an ideal and not an expectation is that flashing alternate operating systems is done at one's own risk and is unsupported. They have already told users that they bear no responsibility for what may go wrong if they flash the wrong thing onto that device. Flashing operating systems onto the device requires people to be careful, and in this case proper care to ensure compatibility was not taken before going through with the flash.
https://www.eag.com/services/engineering/fib-circuit-edit-de...
Costs are what you'd expect for something of this nature.
But vendors wouldn't be able to say the device runs "Android", as it's trademarked. AVB is therefore mandatory, and in order for AVB to be enforced, you can't really control the device - unlocking the bootloader gives you only partial control; you can't flash your own "abl" to remove AVB entirely.
But I don't want AVB, and I can't buy such a device for money.. this isn't a free market, this is a Google monopoly..
But it's not just about that; it's about the fact that I can't flash my own "abl" or the software running in the TrustZone there at all, as I don't control the actual signing keys (not custom_avb_key) and I'm not "trusted" by my own device. There were fuses blown, as is evident from examining abl with its fastboot commands - many refuse to work, saying I can't use them on a "production device". Plus, many of those low-level partitions are closed-source proprietary blobs.
Yes yes - I DO understand that for most people this warning is something positive, otherwise you could buy a phone with modified software without realizing it and these modifications could make it impossible to restore the original firmware.
I do agree it's far from ideal, though. But there are so many much worse offenders that use these fuses to actually remove features, and others that don't allow installing a different OS at all. The limited effort should probably be spent on getting rid of those first.
Of course there are bigger problems in the ecosystem, like Play Integrity which actively attempt to punish me for buying open hardware. Unfortunately that's the consequence of putting "trusted" applications where they IMO don't belong - there are smartcards with e-ink displays and these could be used for things like banking confirmations, providing the same security but without invading my personal computing devices. But thanks to Android and iOS, banks/governments went for the anti-user option.
But to answer your question: we know iPhones have a foolproof kill switch, it's a feature. Just mark your device as lost in Find My and it'll be locked until someone can provide your login details. Assuming it requires logging in to your Apple account (which it does, AFAIK; I don't think logging in to a local account is enough), this is the same as a remote kill switch; Apple could simply make a device enter this locked-down state and then tweak their server systems to deny logins.
Millions of fully working Apple devices are destroyed because of that; Apple won't unlock them even with proof of ownership.
Realize that many of these manufacturers sell their hardware in, and employ people in, highly policed societies. Just the fact that they are allowed to continue to operate implies that they are playing ball and may well have to perform a couple of favors. And that's assuming they are fully aware of what they are shipping, which may not always be the case.
I don't think it is a bad model at all to consider any cell phone to be compromised in multiple ways even though you don't have hard proof.
Pre-prod (etc.) devices will also have different fuses burnt.
Using eFuses is a popular way of implementing downgrade prevention, but also for permanently disabling debug flags/interfaces in production hardware.
Some vendors (AMD) also use eFuses to permanently bond a CPU to a specific motherboard (think EPYC chips for certain enterprise vendors).
At the moment they're 'older' and would class as a rollback, which this fuse prevents.
Storing it in the firmware would mean every user has the same key. Storing it in EEPROM means a factory reset will clear it. This allows me to ship hardware with the default key on a sticker on the side, and lets a non-technical user reset it back to that if they need to.
It gives you a 256-bit block to work with - https://docs.espressif.com/projects/esp-idf/en/stable/esp32/...
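For what it's worth, a minimal ESP-IDF sketch of that pattern, assuming the per-device default key was burned into the eFuse user block (BLK3 on the original ESP32) at the factory; error handling omitted:

```c
#include <string.h>
#include "esp_efuse.h"
#include "nvs.h"

#define KEY_LEN 32  /* the 256-bit user block */

/* Load the active key: the NVS copy if the user has set one, otherwise
   the per-device default burned into eFuse BLK3 (the value printed on
   the sticker). Erasing NVS on factory reset falls the device back to
   the sticker key automatically. */
void load_key(uint8_t key[KEY_LEN])
{
    nvs_handle_t h;
    size_t len = KEY_LEN;

    if (nvs_open("cfg", NVS_READONLY, &h) == ESP_OK) {
        if (nvs_get_blob(h, "user_key", key, &len) == ESP_OK && len == KEY_LEN) {
            nvs_close(h);
            return;
        }
        nvs_close(h);
    }
    /* Fall back to the eFuse user block. */
    esp_efuse_read_block(EFUSE_BLK3, key, 0, KEY_LEN * 8);
}
```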
They are no different than some shit ransomware, except there is no demand for money. However, there is demonstrable proof of degradation and destruction of property in all these choices.
Frankly, criminal AND civil penalties should be levied. Criminally, the C-levels and boards of directors should all be in scope for encouraging/allowing/requiring this behavior. The RICO Act as well, since this smells like a criminal conspiracy. Let them spend time in prison for mass destruction of property.
Civilly, start dissolving assets until the people are made whole with unbroken (and un-destroyed) hardware.
The next shitty silly-con valley company that thinks about running this scam of 'customer-bought but forever company-owned' will think long and hard about the choices of their network and cloud.
There is when the device becomes hard bricked and triggers an unnecessary need for a new one.
Now I have to consider my device dead re: updates, because if I haven't already gotten the killing update, I'd rather avoid it. The first thing I did was unlock the bootloader, and I intend to root/flash it at some point. I'll be finding another brand whenever I'm ready to upgrade again.