Of Snobbery and Boredom

A thought about my most recent review:

I don’t tend to like things unless they stand out from the herd. Call that elitism, call it snobbery; I don’t care. I can’t pretend to like things I don’t like.

One thing I feel obligated to point out is that I detest snobbery. Snobbery is close-minded, passive bullying. Snobbery is adopting a categorical rule that X kind of story, told by Y kind of people, cannot possibly be good. It is a sweeping (possibly hasty) generalization, a fallacy of relevance.

That line about not liking things unless they stand out from the herd sounds snobby as hell, but I don’t really mean it like that. I’m not holding myself above the Great Unwashed and their low-brow tastes. I’m fine with common tastes and basic stuff. They can be a positive tonic.

What I mean by that is I get bored of seeing the same kind of stories over and over. Everyone likes to dump on Hallmark movies this time of year for their cookie-cutter plots, but the truth is almost every genre has tropes that it regularly employs. This is true of so-called “Prestige Television” as well. No one who sat through the Game of Thrones finale could have escaped how obligatory that ending felt.

Certain kinds of stories appeal to me more than others. That’s personal taste. What snobs do is conflate their personal taste with universal aesthetic truths. A story may not interest me, but it would be wrong to say that a story is bad because I’m not interested in it.

So you should never take my grumbles about Nothing to See at the Theater as serious aesthetic judgements. That’s just me being bored, and venting spleen accordingly. 80% of my prejudices are “Good Lord, this again…” That doesn’t prevent me from overcoming them.

Generations Are Arbitrary. Act Accordingly.

[Photo: the punk band Generation X. Members of this band were all Boomers.]

The idea of generations, especially as demographers use them, is overrated. I’ve said so before, and I’ll likely go on saying it.

Being born at the same time as others gives you a set of shared cultural memories and not much else. Now, those shared cultural memories can be powerful, especially given the rate of pop culture decay, but they aren’t as determinative as people like to believe.

I have some more to say on this topic, over on Contena.com, which is a writer’s resource that’s added a blogging feature. I like to try out blogging features, so I penned Dead X.

The idea of “Generation X” was coined by the Canadian writer Douglas Coupland, and he was referring to people born around his time, the late fifties to early sixties, who came of age in the seventies. Too young to really be involved in the great Sixties upheaval, they lived with its immediate consequences. Today we would call these folks “Late Boomers.”

Now, this suggests a provocative idea if we apply the 15-year cycle that demographers are fond of using today. What if, instead of this:

  • Baby Boom: 1946-1964
  • Generation X: 1965-1980
  • Millennials: 1981-1995
  • Generation Z: 1996-2010
  • Generation Alpha: 2011-2025

We borrowed from Coupland’s original notion, and went with this:

  • Baby Boom: 1946-1960
  • Generation X: 1961-1975
  • Generation Y/Xennials: 1976-1990
  • Generation Z/Millennials: 1991-2005
  • Generation Alpha: 2006-2020

This setup has the virtue of a) recognizing that postwar birthrates started to decline in the early ’60s, when birth control became a reality, b) using the original conception of Generation X, c) moving the group called “Millennials” to those born around the actual millennium, and d) giving the “Xennial” identity an actual demographic.

Of course, it would shift me from Generation X to Generation Y, but it would also put me in the same generation as my wife, so… I can live with it.

Here’s the link to Dead X on my Contena profile again, that you may Read the Whole Thing.

 

We Have to Be Liked

Bret Easton Ellis, writing in his essay collection White, on the social-corporate demand for inclusivity:

Most people of a certain age probably noticed this when they joined their very first corporation. Facebook encouraged its users to “like” things, and because this platform is where they branded themselves on the social Web for the first time, their impulse was to follow the Facebook dictum and present an idealized portrait of themselves — or a nicer, friendlier, duller self. And this is where the twin ideas of likability and “relatability” were born, which together began to reduce all of us, ultimately, to a neutered clockwork orange, enslaved to yet another corporate version of the status quo. To be accepted, we had to follow an upbeat morality code under which everything had to be liked and everybody’s voice had to be respected, and anyone who held negative or unpopular opinions that weren’t inclusive — in other words, a simple dislike — would be shut out of the conversation and ruthlessly shamed. Absurd doses of invective were often hurled at the supposed troll, to the point where the original “offense” or “transgression” or “insensitive dickish joke” or “idea” seemed negligible by comparison. In the new post-Empire age we’re accustomed to rating TV shows, restaurants, video games, books, even doctors, and we mostly give positive reviews because nobody wants to look like a hater. And even if you aren’t one, that’s what you’re labeled as if you steer away from the herd.

I like this because it’s the obverse of the usual complaint about the internet and social media: that they’re a festering boil of rage and uncouthery. I myself have described Twitter as “both the tape recorder and the riot”. But this suggests that everything is really pushing the other way, as social media purges those lacking social credit, as the Chinese put it. Skynet turned out to be far more seductive than we thought.


It Appears Disney Is Going All-In on Streaming

So suggests Variety.

This explains what I’ve been wondering about, both here on the blog and on my podcast: Disney’s current strategy of “nothing but live-action reboots and Frozen 2.” It also jibes with the news that they’re done making Star Wars films for the time being (and with the rumors that the test screenings for Episode 9 have not been going well).

In a nutshell, movies are too expensive and risky to make when there’s a cheaper way to get content onto the screen.

If you spend $250 million on a movie, you’ve got a single run in theaters plus home-video sales to make your money back. If the movie doesn’t find an audience, that’s a bunch of sales you aren’t going to get.

But if you have a captive audience of people throwing down $7 a month for all of your content, you don’t need to worry so hard. They’re paying, and as long as you keep putting up halfway decent content, they’ll be happy.

So it appears that the Spielbergs and Scorseses are wrong: this is what cinema is going to be. If a powerhouse like Disney decides it’s not worth it to put Star Wars movies in theaters, then movie theaters are going the way of the dodo.

Planned Obsolescence Update

Everything has moved on from my previous computing issue. As it turns out, putting in 16 gigs of RAM when you previously had 4 improves things nicely (also, getting the dust out from inside of there… yeesh). So I’ve upgraded my Scrivener and started on the short pieces for the next issue of UJ. I’m also planning on doing some recording and practicing with the new version of GarageBand, as I’m pretty sure it lets you make your own drum loops and fills. That will be fun.

The difficulty always lies in using the computer as a tool rather than a distraction. Sitting down with a goal in mind, even if that goal is just “write 500 words,” focuses the mind nicely, I’ve found. Scott Adams may disagree, but small, doable goals are a way to build toward “systems.”

Escaping from Planned Obsolescence

I bought my current home desktop computer, a Mac Mini, in 2012. I bought it on Father’s Day, and I sprung extra for the newly released version. It has reliably served my needs. I write, blog, pay my taxes, record podcasts and music, and play copious amounts of Crusader Kings 2 on this machine. It has been a solid investment.

This past week, I nearly destroyed it.

The new iOS on my phone is now sufficiently advanced that I cannot upload photos to my machine from it (I don’t use the cloud. I don’t trust the cloud. The cloud is a lie). In order to fix this, I needed to install the new macOS, Catalina. Which I did.

I immediately regretted this decision.

In the first place, a bunch of apps that I regularly use won’t work with Catalina. Like Scrivener, which I use for writing. Like Microsoft Word. Like GarageBand, which I use to record podcasts.

Scrivener I can upgrade at a discounted rate by sending the developer an email with my receipt from the Mac App Store, where I bought it. Annoying, but doable. Microsoft Word I can download for free, but they want $70 a year for an Office subscription. I feel like I can do better.

GarageBand took some figuring out, but by deleting the old app I was able to download the new one for free and use it to record a podcast on Saturday. So far, so good.

But

the

computer

is

now

so

slow.

[Reaction image: are you kidding me?]

Now, a new Mac Mini costs $800. And my wife’s computer is older, so she gets dibs on a replacement. Me shelling out that kind of cash because I let Apple shove me down the obsolescence treadmill is… well, it’s just dumb.

But, that generation of Mac Minis was designed to be user-upgradable (I know, is it even an Apple product?). So upgrading the memory will be really easy. I can do it myself for the price of some new RAM, which is currently on its way via Amazon.

And if that doesn’t work? I can try installing a new hard drive, which would actually be even cheaper. I didn’t go with that first because there’s plenty of disk space on my existing drive, so I figured RAM was the problem.

The point is, I have options. I can gain expertise and understanding. I can do for myself instead of just chorfing the next part of the product cycle. I can, in short, act like an actual rational human.

And be silently thankful that Apple actually made choosing that option easy.

Dan Simmons Demonstrates There’s No Such Thing as Bad Publicity

Apparently he committed thoughtcrime by criticizing Little Angry Climate Girl, whereupon the usual gang of Two-Minute-Haters jumped up and down, whereupon his best-known book shot up to #1 on Amazon. Larry Correia has the details.

Now, logically speaking, we must stipulate that Correlation is not Causality, so it’s entirely possible that the Legions of Woke were not the cause of Dan Simmons’ thirty-year-old book getting purchased by everyone who wearies of the Legions of Woke.

But if something else were the cause, then that might be even worse for the Neo-Puritans. Because that means their *INTERNET RAGE* had no power to derail … whatever that cause was. Incompetence or irrelevance, take your pick.

This reminds us that, absent a real armed struggle, the perpetually angry only have the power that you grant them. And once people realize that, realize that there are plenty of others just as sick of the endless noise as they are, the noise retreats accordingly. As Rotten Chestnuts has it:

once the revolutionary fervor passed away with the first generation of fanatics, Puritanism was unsustainable. In Massachusetts, for example, they were hanging witches in 1693; by 1698 Cotton Mather was being openly mocked, and by 1700 everyone was pretending that the whole sordid business never happened.

Stand Your Ground seems to be the operative principle.