The technical details of the reverse engineering are from last year [1] (or here [2], directly); this post just covers the announcement of a big fine, plus some rambling.
[1] - https://reversing.works/posts/2023/12/mobile-reverse-enginee...
[2] - https://media.ccc.de/v/37c3-12133-mobile_reverse_engineering...
A COMPUTER CAN NEVER BE HELD ACCOUNTABLE
THEREFORE A COMPUTER MUST NEVER MAKE A MANAGEMENT DECISION
- However, the entity that owns it can be held accountable.
Several of today's "business leaders" and the various judges appointed by the ascendant kleptocracy seem to constantly argue otherwise. What's the actual plan for holding those entities accountable?
I don't think the right is terribly unified on this. A populist with a strong angle might be able to build enough of a movement to push some version of this through. I have trouble imagining this populist figure, but it might resemble a "cross of AI" (à la Bryan's "Cross of Gold").
Since this is a cat-and-mouse game, would the app owner's reaction be to shift the computation to methods that are not discernible by dissection of the app?
For example having the app be purely a data collection tool which then streams it to the server to do all computation?
No. Any sane engineering team would have built it that way in the first place, so they almost certainly don't have the competence to change it now, or possibly even to understand your question.
I think that would be the case if the employer was doing this on purpose.
I would bet this is more a case of business goals being met by dev teams in the quickest and easiest way possible, without anyone providing legal or regulatory oversight to ensure the implementation complies with the relevant laws.
That's not any kind of justification or excuse though!
The reverse engineering is really secondary to the regulatory regime. The company in this story had already been investigated and fined before anyone had tried to reverse-engineer their app.
It's an offence under GDPR to fail to cooperate with a supervisory authority, and there are extensive record-keeping and transparency requirements. Trying to play cat-and-mouse is itself illegal and likely to be visible to the regulator.
https://gdpr-info.eu/art-30-gdpr/
The elephant in the room: how many of these tactics are used by US companies, considering (I presume) our relatively lax data protection laws?
Not so fast: Frida can be detected. You need to deal with those detection vectors first.
I'd not be surprised if the next version of the app included an "integrity protection" feature, added officially to "protect couriers' security". These can be bypassed, but it shows that exposing your tools is not always a wise move.
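Two well-known detection vectors are scanning the process's own memory maps for Frida artifacts and probing frida-server's default TCP port (27042). A minimal sketch of the idea, in Python for readability (real checks are usually done in obfuscated native code, and the marker strings here are illustrative):

```python
# Sketch of two common Frida detection vectors as an app might implement
# them on Linux/Android. Illustrative only; marker strings and the port
# probe reflect Frida defaults, which hardened tooling can change.
import socket

def maps_mention_frida(maps_text: str) -> bool:
    """Scan /proc/self/maps content for frida-agent/frida-gadget artifacts."""
    return any(marker in line
               for line in maps_text.splitlines()
               for marker in ("frida", "gum-js-loop", "linjector"))

def frida_server_listening(host: str = "127.0.0.1", port: int = 27042) -> bool:
    """Probe frida-server's default port; a listener there is a strong hint."""
    try:
        with socket.create_connection((host, port), timeout=0.2):
            return True
    except OSError:
        return False

# Example: a maps excerpt from an instrumented process trips the check.
clean = "7f0000000000-7f0000001000 r-xp 00000000 08:01 123 /system/lib64/libc.so"
hooked = clean + "\n7f4242424000-7f4242434000 r-xp 00000000 08:01 456 /data/local/tmp/frida-agent-64.so"
print(maps_mention_frida(clean), maps_mention_frida(hooked))  # False True
```

Both checks are trivially defeatable (rename the agent, rebind the port), which is exactly why this stays a cat-and-mouse game.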
How can a static disassembler be detected? It doesn't actually run the process it disassembles.
Frida is more like a debugger, or a very fancy Action Replay. But one can generally just patch out the detection mechanism given sufficient time and motivation.
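"Patching out" can be as crude as overwriting the detection routine's entry point with a stub that always reports "not detected". A toy sketch, assuming an x86-64 target where `xor eax, eax; ret` encodes "return false" (the blob and offset below are made up for the example):

```python
# Toy illustration of patching out a detection routine: overwrite the
# function's first bytes with "xor eax, eax; ret" (x86-64 for "return 0"),
# so a boolean is_frida_present-style check always reports false.
# The blob and the 0x10 offset are invented for this example.

STUB = bytes.fromhex("31c0c3")  # xor eax, eax ; ret

def patch_out(binary: bytes, func_offset: int) -> bytes:
    """Return a copy of `binary` with the function at func_offset stubbed out."""
    return binary[:func_offset] + STUB + binary[func_offset + len(STUB):]

# Pretend the detection function lives at offset 0x10 in this blob.
blob = bytes(range(64))
patched = patch_out(blob, 0x10)
assert patched[0x10:0x13] == STUB   # stub written in place
assert len(patched) == len(blob)    # file size unchanged
```

In practice you'd locate the function via a disassembler first, and signed or integrity-checked binaries need the check itself defeated too; the same arms race, one level up.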