58 Comments

Thanks for this informative post.

I wrote a post with a possible "happy ending" for the social media generation. Here's an excerpt:

"Still, I think it would be highly unfortunate if in our discourse we condemned the “social media generation” to an inevitable and permanently higher prevalence of misery and “stupidity” by assuming this is their fate and repeatedly telling them so. In fact, it would not surprise me if, contrary to all assumptions, this generation emerged into maturity with a greater sense of both reality and resilience. Perhaps they will discover at a much earlier age the value of the stoic approach–––we can’t control most events, but we can control our reaction to them."

Full post linked below (short, 1,000 words), about my own personal experience as a child and as a parent of now-adult children, in case anyone has an interest.

Title: My Youthful Distractions, American Teens and Social Media, Marcus Aurelius on Binge-Watching

https://robertsdavidn.substack.com/p/my-youthful-distractions-american

There are a lot of big advantages to the new world - I am in no way giving up, and things might turn out great. The experiment is also about to get hella-confounded by AI.

Nice post! One small nitpick: I don't know if "less bullying" is the correct conclusion to draw from the CDC data. The only category they mark as green is "bullied at school", where the rate, after sitting at 19-20% for 8 years in a row, suddenly drops to 15% in 2021. I'd assume that is a pandemic effect of not being *at school*, and might revert to the extremely stable 2011-2019 trend.

Jason Pargin has talked/written somewhere about the shift that occurred to websites such as his Cracked.com in 2012 with the advent of smartphones. The shift was, of course, complete collapse and panic, plus severe constraints on the type of content that succeeds on social media platforms as opposed to websites.

Marshall McLuhan would be fascinated by the tribalizing effects of moving from the detached contemplation of a screen to a tactile environment you control with your finger. A profane, sequential world of specialists giving way to a discontinuous mosaic of communally created content designed to incentivize deep involvement and participation.

As a palliative measure I would suggest “Art & Games” in the McLuhanian sense: providers of exact information about how to rearrange one's psyche in order to anticipate the next blow from our own extended faculties.

That is, the artist can show us how to "ride with the punch," instead of "taking it on the chin." It can only be repeated that human history is a record of "taking it on the chin."

Yeah, I totally kept meaning to talk about that aspect and just forgot. So many angles.

David Chapman attributes this (I think) – "the shift that occurred to websites ... with the advent of smartphones" – to "recommender engines" instead of smartphones, or 'social media' overall.

It seems very plausible to me, i.e. that that aspect might NOT have been so terrible had 'social media' remained mostly variations on 'feed aggregators' instead of becoming outrage/controversy generators. It seems pretty clear that 'The Algorithm' is almost always referring to "recommender engines" specifically.

Relevant bit from his recent 'web book' discussing this: https://betterwithout.ai/AI-already-at-war

This is a good reason for having multiple kids, not too far apart, and encouraging strong ties between them, so even if they're isolated from the world they have each other.

I endorse this. I strongly believe my kids are happier because they have each other.

I have 6 kids, ages 4 to 18, homeschooled. This lets them have a much, much better childhood compared to their peers. No tests, no grades, no phones, no crazy social pressures and zero-sum games. Lots of playing together, across ages.

Also, graduating from college at 18 while living at home (1 of the 6 so far; the others are on track).

Hi, late to the party, enjoying this comments section. @Magus, by the way, I graduated from college at 16 while living at home (after homeschooling) and now consider it to have been a serious mistake. You may be able to provide a better environment, and my experience is of course individual, but for me, I felt that going to college cut off my education to focus instead on grades and tests. My younger siblings are attending top engineering schools at normal ages and are the best in their classes, able to take advantage of things like having peers and reading papers with their professors. I wish that I had started college at age 15-18 rather than 12. When I was starting, I imagined it would confer a career advantage, but that effect was marginal at best – it's certainly impressive, but it was a huge waste of time and energy to spend my adolescence pretending to learn, and it caused a lot of bad habits.

It is a truth universally acknowledged that email is one of the Cardinal Evils of Our Times. This doesn't seem to match my own experience; am I just very lucky, or am I missing out on some core (horrible) use of email? In case the demographics are relevant, I'm 36F, married with a preschooler and a dog, have a PhD in math, work in tech, and have moved around in the past but live in the suburbs of Minneapolis. My uses of email are (sorted, approximately, from most to least pleasant / useful):

* At work (on the entirely-separate work account): a mixture of relevant, somewhat relevant, outright corporate spam, outright external spam. Typically easy to sort out which is which. Entirely circumscribed to working hours.

* Personal communication, formerly known as letters, though few people I know are willing to hand-write those anymore. Not a large fraction of my inbox, because there are few long-distance friends who like the format.

* Coordination of in-person or online activities, split between email and text. The most inconvenient feature here is that it's split, meaning I might have to search two different places for details that didn't get propagated into the calendar immediately.

* Newsletters that I read (including this one).

* Tickets, confirmations, appointment reminders, charity receipts, etc.

* Some notes to self; admittedly, not an ideal place for them.

* Spam that I (usually) don't read, typically easy to identify on the rare occasions that Google doesn't filter it out for me. Emails from Twitter, LinkedIn, Facebook, "someone liked your comment," store advertisements, etc. fall into this category.

This is undoubtedly a slightly rosy picture of my inbox, but I don't think it's grossly distorted, and it doesn't seem especially unhealthy (even when email is present on the phone). What universal, horrifyingly bad experience of using email am I missing?

I did NOT list emails as one of the horrible things, nor am I looking to reduce use of email in general, so I don't think you are missing anything.

I DO think that checking for new email continuously on your phone or while at your computer, and letting it interrupt you constantly, is a real problem. Having email notifications on your phone I find pretty toxic. But email itself I think is mostly one of the good techs.

Fair enough, although I do still think that it's usually lumped in with the evils. I think my divergence with you might be (a) I get less email, period -- most of the time, there's 0 unread emails in my inbox that I intend to read, except maybe newsletters; and (b) I find that notifications are what *stops* me checking. I imagine the dynamics would be different if e.g. most of the time you do have unread email to get to, in which case being told that the pile just grew wouldn't be particularly useful.

Yeah. Inbox 0 is a distant dream. I could try, but... yeah, good luck with that. I also use 'unread' as code for 'needs eventual attention' rather than pure unread.

_If_ you can handle the (average) 'flow' of emails, inbox zero should be doable.

Handling the initial (or periodic) 'stock' of unread emails is 'easy' enough – 'just' 'declare email bankruptcy' and mark (all/almost-all/most) unread emails as read.

I've done okay at moving unread emails that indicate 'needs eventual attention' thru subsequent 'tiers' of my 'productivity system'; in my case a more regular TODO list and then something more like 'someday-maybe-TODO cold storage'.

I mean Inbox zero unread declared by fiat doesn't mean anything. What matters would be actually dealing with all my emails - to me that means inbox 0 including what's been read, or damn near it.

Like regular legal/financial bankruptcy, "inbox zero unread declared by fiat" can mean 'I have too much to get caught up on so I'm just going to (mostly) start over from here'. Also like 'real bankruptcy', it's not literally all or nothing either. Maybe marking every unread email older than six months, or a year, would suffice to let one get to and stay at/around inbox zero.
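For the curious, here's a minimal sketch of what 'bankruptcy by fiat' could look like mechanically – marking every unread message older than a cutoff as read over standard IMAP. The host, credentials, and cutoff below are placeholders, not a recommendation for any particular provider:

```python
# Rough sketch only: 'declare email bankruptcy' by marking all unread
# messages older than a cutoff date as read, via standard IMAP.
# HOST, USER, PASSWORD, and CUTOFF are placeholders.
import imaplib

HOST, USER, PASSWORD = "imap.example.com", "me@example.com", "app-password"
CUTOFF = "01-Sep-2022"  # IMAP date format: DD-Mon-YYYY

with imaplib.IMAP4_SSL(HOST) as conn:
    conn.login(USER, PASSWORD)
    conn.select("INBOX")
    # Find unread messages received before the cutoff.
    _, data = conn.search(None, f'(UNSEEN BEFORE "{CUTOFF}")')
    msg_ids = data[0].split()
    for msg_id in msg_ids:
        # Adding the \Seen flag marks a message as read.
        conn.store(msg_id, "+FLAGS", "\\Seen")
    print(f"Marked {len(msg_ids)} old unread messages as read.")
```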

I think a big part of 'practical inbox zero' is NOT using one's email inbox (i.e. unread emails) as a task/TODO/project list. As part of reading ('processing') one's emails, one should create tasks/TODOs/projects – elsewhere. That frees one's email to function as a (mostly) 'pure communication channel'.

The main benefit of inbox zero is that your emails can serve as a 'well behaved queue', i.e. you have some kind of (rough) 'guarantees' that you will see every email (if not read them entirely or in depth). From the perspective of 'system design', decoupling the basic 'message processing loop' from (long-running) 'tasks' to be potentially performed on receipt of some messages allows one to independently scale the different kinds of work in the system. It's much easier to delegate or assign a specific concrete task in a task/TODO list/system, or a project in some kind of project/issue/ticket system, than it would be to grant access to a subset of one's email inbox.
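A toy sketch of that decoupling, with purely illustrative names – the only point is that the message loop never does the work itself, it just feeds a separate task store:

```python
# Toy model of 'inbox as pure communication channel': the processing
# loop drains every message, but actual work goes to a separate task
# list. All names here are illustrative.
from collections import deque
from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    subject: str
    needs_action: bool

inbox = deque()   # the 'well behaved queue'
todo_list = []    # the separate task/TODO system

def process_inbox():
    """Drain the inbox; never do long-running work here."""
    while inbox:
        msg = inbox.popleft()  # 'read' the message...
        if msg.needs_action:
            # ...and push any real work into the task system instead.
            todo_list.append(f"Follow up: {msg.subject} ({msg.sender})")
    # Inbox zero: the channel is empty again; tasks live elsewhere.

inbox.extend([
    Message("alice@example.com", "Quarterly report", needs_action=True),
    Message("news@example.com", "This week's links", needs_action=False),
])
process_inbox()
print(todo_list)  # ['Follow up: Quarterly report (alice@example.com)']
```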

That written, I currently have ~150 unread emails. Fortunately, almost all of those emails 'need' less than a minute of processing before I'm happy to mark them as read. I am about 1-2 hours from 'inbox zero', and I'm only in this 'email debt' because of an unexpected mostly-overwhelming/urgent 'personal project'. Most of the time, I reach literal inbox zero regularly (e.g. once a day, sometimes several times a day, and usually, at worst, once every few days or once a week).

Where I think it makes more and more sense to get to "inbox zero unread declared by fiat" is when one has thousands (or more) of unread emails. That seems to me to indicate a situation where someone's (unread) emails consist of tasks/TODOs, projects, but also lots of 'someday/maybe' tasks/projects – as well as a considerable amount of potential work that has been obviated by the passage of time. (Or it indicates that read/unread is haphazard and doesn't itself signify anything in particular, other than which emails one might have happened to 'open' at some time.)

Most people with email inboxes like this never seem to 'reach zero', or make any serious effort to do so. That's (potentially) fine! (_I_ often find emailing these people annoying and it makes sense why – I can't reasonably expect that they'll even see my emails to them.)

I've known exactly one person that – in retirement – has made a serious effort at reaching inbox zero starting from thousands (or even tens-of-thousands) of unread emails. I somewhat recently received a reply from them to an email I'd sent more than _a decade ago_. Regardless of whether one aspires to inbox zero, replying to an email sent more than a decade ago (generally) seems pretty silly!

The average inflow problem is what leads to e-mail despair, in my case. When there is too much inflow and a cultural norm that the really urgent stuff is allowed to be an email (and not text or phone) it is a recipe for email PTSD.

This is where I despise my particular situation's old Outlook system and lack of smart AI options, combined with the abhorrence most people feel at human personal assistants. If you have lots and lots of incoming email, but are not in a place to choke that off, and some large portion is actionable (or action required)... that's where I'm coming from when I lump e-mail in with "the evils". An AI tool or human that (for example) daily compiled all the day's various meeting requests, let me categorize them into yes/no/maybe, and suggested times would save me absurd amounts of time & stress.

Hmmm – I'm not sure how what you describe is a fault of email.

"Too much inflow" is a problem for any kind of 'queue'/inbox/pipe/road/etc..

Being able to batch emails is, for me personally, FAR superior to being interrupted by texts, phone calls, or similar, e.g. chat messages/notifications. But then a lot – almost all – of my own work is 'high context', i.e. context switching is VERY expensive.

I don't know that it's true that there's ubiquitous "abhorrence ... at human personal assistants". It seems perfectly fine for people to have personal assistants. I can understand why many people don't have them, e.g. they're expensive, but having too much "urgent stuff" to personally handle seems like a _really_ good reason to have one!

I will readily admit tho that I rarely participate in (let alone lead or even organize) meetings, and when I do they're almost always one-on-one meetings. But even at previous jobs when I did have meetings, I had a very low opinion of their utility.

I think I get what you mean by email being one of "the evils" but I can't disentangle what you describe from the inevitable/expected consequences of 'working somewhere terrible' (or just poorly run).

I don't think everything generalizes outside the US/developed countries [I speak from Brazil here]. We do, however, have a serious issue with smartphones and social media. I also have 3 kids, but my eldest is already 13. He is pretty much the only kid in his group at school who does not have social media. My middle child is 6; she complains to me that all her friends have tablets and phones and she doesn't; the little one is 3 and is blissfully unaware of any such things.

I think part of the problem is, yes, our generation and the generation before. Most people my age [30-50] are what used to be called "extremely online"; they are usually checking their phones, scrolling social media. It is unavoidable for that behavior to be passed on to their kids. Now, the previous generation, my kids' grandmothers? They not only use their phones all the time, but stick the bloody thing in front of the kids! Candy Crush and photo apps and whatnot! I would be surprised if I didn't grow up watching TV every morning and every night, with zero supervision of the kind of games I was allowed to play.

We too struggle with grandmas who will shove a tablet or phone into our 2 year old's face at the slightest whimper.

Doom scrolling is a big problem. There should be a way on social media to dial in a minimum amount of positivity in what you read: "I want my feed to be at least 85% positive messages" or "keep negative messages confined to weekdays between 7:00 and 8:00 PM". Identifying positive vs negative posts is something AI can help with. Similarly, everyone posts about the best days of their lives, and it makes it seem like you are losing the game of life. There should be a way to put a control on that. If we give people more videos of puppies playing and fewer videos of people verbally attacking each other, we can create more positivity and reverse the drift towards depression.
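A rough sketch of what such a 'positivity dial' might look like, assuming some sentiment model provides a score per post – the stub below is a stand-in for that hard part, and every name is illustrative:

```python
# Sketch of a 'minimum positivity' feed filter. sentiment_score() is a
# placeholder for a real classifier; the rest is illustrative.
def sentiment_score(post: str) -> float:
    """Stand-in for an actual sentiment model, range [-1, 1]."""
    return 1.0 if "puppies" in post.lower() else -1.0

def build_feed(candidates: list, min_positive: float = 0.85) -> list:
    """Admit posts in order, skipping a negative post whenever adding
    it would push the positive share below the threshold."""
    feed, positives = [], 0
    for post in candidates:
        if sentiment_score(post) > 0:
            feed.append(post)
            positives += 1
        elif positives / (len(feed) + 1) >= min_positive:
            feed.append(post)  # enough positives banked to afford it
    return feed

posts = ["Puppies playing in the park!"] * 7 + ["People yelling at each other"]
print(build_feed(posts))  # 7/8 positive = 87.5%, so the negative post fits
```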

Also, give parents control of the positivity/negativity ratio their kids see on social media.

I find this a little less obvious; I thought one of the major narratives about why teenage girls might get depressed by social media was the social comparison ("everyone else seems so much more successful than me"), so it might not help as much?

It seems like the "recommender engines" (a variety of AI) are what is generating the Doom to be scrolled, and it also seems like they are providing exactly the metrics the social media sites want.

It seems like many/most people preferred social media 'pre-The-Algorithm', i.e. when those sites/apps were more like 'feed aggregators'. It was much easier (or, sadly, even possible) to manage the 'positivity' of one's feeds then, but (very sadly) the 'negativity' is much more of what the social media companies want (e.g. to maximize engagement).

The only things still like this, AFAIK, are feed readers, but they won't help you follow anything that doesn't have something like an RSS feed.
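To make that contrast concrete, a minimal sketch – the data shapes are hypothetical and neither function is any real platform's code:

```python
# 'Feed aggregator' vs. 'recommender engine', in miniature. Posts are
# dicts with hypothetical fields; purely illustrative.
from operator import itemgetter

def aggregate(followed_feeds):
    """Pre-'The-Algorithm' model: merge only what the user chose to
    follow, newest first. Easy to reason about and to manage."""
    merged = [post for feed in followed_feeds for post in feed]
    return sorted(merged, key=itemgetter("timestamp"), reverse=True)

def recommend(candidate_posts):
    """'The Algorithm' model: rank anything by predicted engagement,
    which in practice often rewards outrage and controversy."""
    return sorted(candidate_posts,
                  key=itemgetter("predicted_engagement"), reverse=True)
```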

It even happens in neutral ways. I periodically scroll the news headlines Google picks for me in the browser. Because I read so little, I get a lot of "mainstream stuff" (about celebrities and such). Then there's a combination of things I've randomly searched for and occasionally discussed out loud in the presence of my phone. When I switch to searching or browsing in another language for long enough, I start getting recs in that language. Sometimes I get a weird combination of headlines, say about ballet and Game of Thrones, and almost nothing else – and these remind me that everything is algorithmically picked. It's not really a "curated collection of headlines to give a good overview" or anything like that.

Moreover, if I click a particular kind of article – say, the disaster in East Palestine – I'm bound to start getting more such articles. Just like after I've clicked a few local restaurant and venue review articles, I get more and more of these in my recommendations. It doesn't mean that the frequency of such articles, or of the types of events I'm seeing, is actually increasing: but it can seem that way.

I think the idea is that what you describe as "neutral ways" aren't actually neutral. They're not trying to manipulate you to believe anything in particular; 'just' to get you to 'engage' with those systems (and thus see their ads).

But you're totally right that this 'curation' isn't a good overview of anything! I think that's a much older problem, e.g. yellow journalism, but I'd expect the new 'AIs' (i.e. "recommender engines") are more efficient/optimized for what is mostly the same purpose.

It's always kinda weird when all the blogs I read put out posts at the same time about the same topical topic...which is +1 for "no I don't miss out on Big News for not using MSM, trust me I'll hear about it anyway". Naturally this was the most rigorous of the takes.

I've always followed a version of your Ten Phone Commandments...the devices just aren't that interesting to me, there's not much to get addicted to in the first place. 95% of "phone usage" is just to play music*, the rest is incidental/being too lazy to do something The Old Fashioned Way on laptop. (Totally agree with you that gating some essential apps and services to mobile-only is a terrible practice - this makes it more acceptable to slack off on quality web design since Everybody Knows you gotta optimize for mobile instead...so even stuff that nominally works on a PC works less well in general. Beware Trivial Inconveniences.) But, boy, trying to convince someone who didn't start with such an apparently-rare disposition is hard...it's very much like weaning someone off an actual drug habit. Years of patient debate, showing friends I'm advocating because I care about them, not out of misguided Luddite tendencies. "This seems to make you unhappy, have you tried Doing Something Which Is Not That? I'm not living some weird hermit life, it's fine!"

But of course that approach doesn't scale, and I really do worry about the network-effect collapse for someone already not-doing-well. Social media might be a shitty simulacrum of actual human socialization, but if it's all someone has...so much better to not get hooked in the first place. It's such a perfect collective action problem, really. I used to joke with a former boss that we'd have a much more productive workplace if we could install some cell signal jammers around the building (likely true!), and yet. (Ironically, of course, the best way to scale modeling a social media-free life...would be by being an influencer for such on social media. Are there similar approaches? Why haven't comedians and famous celebrities managed to pull this off already - lack of critical mass?)

*I think quality does matter here, as with the other Acceptable Uses. Aesthetics are a whole confounder, but in general opting for Absorbing Content that discourages low-effort multitasking is probably good...so like, music that one gets lost in, not BGM For IRL. Anything where the experience would be cheapened by also scrolling at the same time. The beat just absorbs all the space: https://www.youtube.com/watch?v=DcJFdCmN98s

Yeah, all the takes. I liked Maximum Truth (Maxim Lott). He had units on the y-axis! The suicide rate is 6 per 100,000 – no wonder I haven't heard of any teen suicides. There's one thing I like about everyone looking at their phones: I've always loved to have my nose in a book, but I would 'stand out' as the guy in the corner reading. Now I've got my Kindle and I don't stand out.

The thing about books vs. TV vs. social media gets me thinking along the lines of the whole "the medium is the message" thing. The structure of information delivery itself shapes the information that is delivered. A lot of books are worthless, but something about the effort required to read a book, and the social expectations around books, leads to an accrual of quality over time. The exact opposite is true of social media, where the structural incentives just promote the delivery of bad memetics and the accrual of nothing of value.

I don't know how to escape the current Molochian paradigm, but I doubt government regulation would help a lot. One observation I've made is that YouTube has a better (not nearly perfect, but better) garbage-to-useful ratio than most social media, and the incentives are better aligned. Part of it is the expectation that a "good" YouTube video will last 5 to 30 minutes and be skillfully edited. This is in stark contrast to TikTok or Instagram content, which is expected to last 30s or less. So, creating incentives for companies to shift their models towards a YouTube-like paradigm might be helpful at the margins.

*shrug*

OK, and I'm sitting here thinking the scariest AI is the one setting your YouTube or phone feed to keep you clicking away on the next thing. Scary, and yet I love listening to podcasts, and if some AI would look at the podcasts I listen to and then suggest others I might like... well, I would like that.

Extremely naive question, driven by seeing that the '90s cohorts were struggling too. (This is NOT mutually exclusive with the phones hypothesis, to be clear).

Is it possible that there's something about the Silent Generation – Gen X – Gen Z sequence that doesn't apply to the Greatest Generation – Baby Boomer – Millennial sequence in the same way?

I'm aware that maybe I'm just letting correlation/bucketing drive causation here, and that generations are messy buckets at best, but honestly, look at the _set of names_ that we have for the latter sequence. You start off with a generation that, in the words of the Ninth Doctor, had "Lots to do. Save the world. Beat the Germans. And don't forget the welfare state!" Then you get the Boomers, who grow up at the peak of American confidence and deal themselves every benefit they can (e.g., Prop 13). The Millennials don't get as great a deal, but they grow up before things start to feel really constrained.

By contrast, you have the generational sequence that is unbeloved by American history and society in every way. The Silent Generation doesn't even get a memorial for the Korean War until 15 years after the Vietnam War memorial opens. Gen X grows up at the end of history, in a society built on a myth of endless frontiers. Gen Z gets, well, a future that (even if better) feels awful.

Maybe kids growing up after 2011 have the bad luck of being in the relatively traumatized family generational sequence, instead of the relatively happy one.

I gotta push back on one thing here: Prop 13 was not Boomer self-dealing; the median Boomer was still in college when it passed. The motivated population was GGs with paid-up mortgages who wanted to keep their retirement-age housing costs at pre-boom levels during a boom. Shunting the tax burden onto first-time buyers was a disaster for first-time buyers: that is, precisely, late Boomers. (I mean, take one look at a picture of Howard Jarvis in 1977 and tell me that's a Boomer.)

I think that we might be in the midst of a positive change in social media habits with the growing popularity of Twitch + Discord. Small streams with <100 people feel intimate, like hanging out in a way that social media almost never does. I think this will trickle down to teens eventually, and make hanging out come back.

This is really a great summary, investigation and essay. Thank you.

I think if you wanted to explain the 90's misery bump along the same lines as the more recent bump, you could in fact add in the "end of the world" narratives that were cropping up. I remember the late 80s and early 90s as a time when a real "we need to act to save the world!" urgency was starting to become more common on TV. Stories about how the world was doomed due to global cooling/warming, with claims that everyone would die in a few years unless people acted now, were becoming common, and were taught in school, for instance. I remember a lot of people who took that at face value. Not as much as today, but the ubiquity was ramping up, combined with more serious application of mass-media TV and cable. Along with that catastrophizing was an increase in "look how good your life could/should be" type shows, although I don't know how much of an impact that had.

(I was outside the cable bubble, as we didn't have access to the 4 broadcast channels till my parents got a satellite dish in '99, but I recall my friends watching very different shows, and some of my cohort seemed to watch a LOT of TV while rural kids like me had options along the lines of "golf tournament, or go do something else.")

I recall, though, that by the time I was in college around the turn of the century, people didn't care as much about catastrophes and were more excited about online possibilities and doing stuff together. I don't remember people bringing up e.g. climate stuff much at all then. All the woke stuff seemed to bubble up more seriously by the late aughts and early teens, but I might just not have noticed as much.

My guess for why the "Youth Risk Behavior Survey ... collected every two years among a nationally representative sample of U.S. high school students" fails to contain stats on completed suicide attempts has to do with the notoriously low survey response rates among people who have previously killed themselves.

To add to your theory re: video games in the 90s/2000s: online games were starting to come...online right around 2000. I was already playing MUDs as early as '97 but they were always extremely niche. The first non-MUD online game I really got into the community around was Tribes: Aerial Assault on PS2 c. 2002 and by then stuff like Everquest was also already out. Original Xbox was 2001 and the most gaming I did with strangers in high school was Halo LAN parties in some dude's basement c. 2004. Definitely feel like you could make the case that gaming became much more social right around the year 2000.

OTOH, video games broadly were still kind of niche; I don't think anyone else in my main high school friend group was into games, though I also had several random other friends who were. There's also a lot to be said about how the various forms of matchmaking that came into vogue in the mid-2010s tended to dampen the formation of new communities and friend groups around games. I personally find it hard to enter new communities spontaneously (although this is especially difficult with an adult schedule now), and my regular group of "gaming friends" is a group of guys I played WoW with for a few months in like 2006 that happened to stick.

Also, Occupy Wall Street was late 2011, which is the point where I'd argue social justice went mainstream and the modern iteration of protest culture took off. If I knew more about the Tumblr trend of if-you're-not-mad-about-thing-of-the-week-you're-literally-the-worst I'd add in some speculation there as well. My internal reference for this was a couple SSC posts from around 2014 so it was already a thing by at least that point. #Kony2012 was 2012.

> …and make concerted efforts to see people in person as often as possible

I'll be in New York around the end of Passover (April 12-21); let me know if you want to meet up then.

I've lost track of the number of times people on ACX and related places have asserted that things are "objectively better" today than in the past, by which they mean certain economic metrics have ostensibly improved, and everything else is irrelevant.
