Apple was for a while rumored to be planning to launch AI-assisted features for the iPhone (emails, texts, summaries and so on, including via Siri), to be announced at WWDC 2024.
AI Narration of this post:
https://askwhocastsai.substack.com/p/aiphone-by-zvi-mowshowitz
"The point of emojis is that they are pictograms, that there is a fixed pool and each takes on a unique meaning. If you generate new ones all the time, now all you are doing is sharing tiny pictures. Which is fine if you want that, but seriously, no, stop it." Spend some time in Twitch chat, with all the engagement around emotes (BTTV, 7TV, etc.), and the plain, fixed set of emojis becomes pretty dull. Don't get me wrong, the staples of Kappa, LUL, KEKW, and POG have their role as those "fixed cultural touchpoints," but I've found the constantly expanding set is still a net positive experience.
Twitch is an interesting case because the point is that each channel has its own set of fixed touchpoints; you can't make up new ones on your own. I think that points to what makes a good set of options...
> 11. You can customize your iPhone home screen icons a little now.
I don't think this link goes to the right place FYI!
"My essential decision is that I trust Google. It’s not that I don’t trust Apple at a similar level, it’s that there has been no reason strong enough to risk a second point of failure."
I disagree with the first sentence here. I think Google is currently better modeled as a large group of engineers with a broadly late-90s / early-aughts Slashdot ethos (i.e., basically how computer scientists have always been), compatible with a desire for strong privacy guarantees and robust safeguards, managed by an increasingly profit-focused executive and upper-management class who will constantly undermine attempts by engineering teams and sympathetic lower/middle managers to implement them.
The basic problem is one of incentive gradients. While Google products often reflect reasonably good-faith attempts to be reasonably transparent about what they're doing (I would, for example, put them above median in terms of "responsible behavior towards users providing sensitive data"), they're going to be subject to constant enshittification pressure, because ultimately Google's raison d'être is to sell ads, and their basic business model is monetizing user behavior and user data (and user time and eyeballs) in order to do so. Privacy safeguards are ultimately directly opposed to Google's basic profit-making business model, and as a result we should expect them to be some combination of weak and weakening at all times.
Apple's goal is to sell expensive hardware, with a secondary revenue stream from its walled-garden ecosystem of software and services. Like Google's, this is subject to observable enshittification failure modes -- most obviously, never including enough storage or RAM in base-model hardware and then charging $400 for $5 worth of additional parts on a soldered board, or dropping the headphone jack to sell more AirPods. But importantly, these enshittification pressures are largely orthogonal to privacy preservation, so unlike Google, Apple's fairly robust privacy-respecting track record isn't directly opposed to its revenue streams.
To be clear I was describing a practical decision (that Google failing could do a lot of damage and I'm not making major sacrifices to mitigate that). And I do prefer what they offer to what Apple does, no matter how enshittified a lot of it is.
“OpenAI promises not to retain data”
Are they training models on your data? If so, isn’t that just as bad for privacy and security as storing it?
My understanding is that if it isn't stored they aren't able to train on it.
All that of course is assuming that they are telling the truth about not storing it. My fear is that they’re going to store it, they’re going to train models on it to help solve the “not enough data” hurdle, and then when someone blows the whistle they’ll just have to pay a fine and say “Oopsie!” without any major consequences.
Pros: this sounds convincing. Cons: this is Apple, and they always walled-garden until stopped; also privacy is a myth that doesn't and shouldn't matter.