But the relief was short-lived, the hope was fleeting, and we are amid another surge. A surge that is fueled by a highly transmissible variant and those unvaccinated. My experiences in the ICU these past weeks have left me surprised, disheartened, but most of all, angry.
I am angry that the tragic scenes of prior surges are being played out yet again, but now with ICUs primarily filled with patients who have chosen not to be vaccinated. I am angry that it takes me over an hour to explain to an anti-vaxxer full of misinformation that intubation isn’t what “kills patients” and that their wish for chest compressions without intubation in the event of a respiratory arrest makes no sense. I am angry at those who refuse to wear “muzzles” when grocery shopping for half an hour a week, as I have been so-called “muzzled” for much of the past 18 months.
I cannot understand the simultaneous decision to not get vaccinated and the demand to end the restrictions imposed by a pandemic. I cannot help but recoil as if I’ve been slapped in the face when my ICU patient tells me they didn’t get vaccinated because they “just didn’t get around to it.” Although such individuals do not consider themselves anti-vaxxers, their inaction itself is a decision — a decision to not protect themselves or their families, to fill a precious ICU bed, to let new variants flourish, and to endanger the health care workers and immunosuppressed people around them. Their inaction is a decision to let this pandemic continue to rage.
And meanwhile, immunocompromised people, for whom vaccines don’t generate much immunity, are desperately waiting for herd immunity. I have no way to comfort my rightfully outraged transplant patients who contracted COVID-19 after isolating for over a year and getting fully vaccinated as soon as they could. With angry tears, these patients tell me it’s not fair that there are people who are choosing to endanger both themselves and the vulnerable people around them. They feel betrayed by their fellow citizens and they are bitter and angry. I cannot blame them.
I am at a loss to understand how anyone can look at these past months of the pandemic — more than 600,000 lives lost in the U.S. and more than 4 million worldwide — and not believe it’s real or take it seriously. But the unhappy truth is that there are people who do not. They did not in the beginning and many are doubling down now.
look... sometimes being extremely on the nose is good! and I really appreciate FIF's commitment to calling out the sadistic hypocrisy of the American religious right-wing.
Let's keep them separate; melanin just can't succeed. "Give me liberty… or something." It's better if you just don't read. Crank your phasers up to "slaughter," turn your wine back into water. When you play this song, Al Qaeda wins, and Jesus was American!
The United State of Amnesia, make us numb, make it dumb, anesthesia. Cut the cord, close the door, we don't know 'ya. This is zen and the art of xenophobia.
i haaaaaaate that every icon has the same silhouette now!!!! I CANNOT LOCATE APPS IN THE DOCK OR SWITCHER ANYMORE BY SHAPE-BASED INTUITION AND IT MAKES ME VERY SAD / wastes my time and disorients me constantly
remember when “iconic” meant that something was instantly recognizable?
"baffled and bewildered" — removing the deck chairs from macOS
I really appreciate this piece by Riccardo Mori entitled, "Habits, UI changes, and OS stagnation". He works through the arguments around how to tell whether something feels bad just because it's new, or because it's truly worse in some way. It can be a hard thing to put your finger on, especially quickly! But in light of the ongoing cascade of bad design decisions coming out of Apple right now, I really like his rubric for changes to macOS specifically.
The argument “Is this really bad UI, or is it just you who are averse to change?” will never go away, huh? A change in a user interface can be disruptive, but it’s usually easy to see if it’s disruptive-beneficial or disruptive-confusing or ‑frustrating after a while.
You can see when change brings more thoughtfully-designed UI details. Saying that “You just need some time to get used to it” is in itself indicative that the new UI is problematic. You can completely redesign an app, but if the new UI is well-designed, people will figure it out.
When change ultimately brings UI rearrangement for UI rearrangement’s sake, then you just offer something that is user-hostile. Changing habits can be healthy if it brings improvement.
If users have a poor reaction to having to relearn your non-intuitive changes just because you felt the need to ‘refresh’ your app, it doesn’t mean people are lazy or change-averse. It means they’re annoyed at your lack of respect for their productivity and their time.
"Lack of respect" is precisely what I feel when looking at Apple's software over the last 5+ years!!! I also fully agree with these points:
The two major things I find especially misguided about Mac OS are:
The fact that Apple considers it a product that needs to look cool and be shown off, instead of a utility that runs computers.
The fact that Apple feels the need to release a new version of it every year.
Followed by a nice summary of the goals of OS X at its release, emphasizing its usefulness, not its marketability. Then he describes nicely the feeling of being subjected to useless or impeding changes in tools you use every day, with my emphases in bold:
This insistence around the most superficial aspects of a graphical user interface — the look — often reminds me of the constant redesign iterations of some third-party apps in an attempt to make them more alluring to customers and to increase sales. The hyperfocus on always looking new and fresh can sometimes lead to harsh breaks in an app’s ‘usability continuum’ (as I like to call it). I’m sure you’ve experienced it more than once if you have been using Mac and iOS apps for the past several years. The developer triumphantly announces the ‘significant visual overhaul’ in the app’s changelog, and after the (often inescapable) app update you are presented with something that has changed so much, its controls completely rearranged, that it becomes unrecognisable and essentially forces you to relearn how to use the app as proficiently as before.
Both for work reasons and for personal research, I’ve had a lot of experience dealing with regular, non-tech-savvy users over the years. What some geeks may be shocked to know is that most regular people don’t really care about these changes in the way an application or operating system looks. What matters to them is continuity and reliability. Again, this isn’t being change-averse. Regular users typically welcome change if it brings something interesting to the table and, most of all, if it improves functionality in meaningful ways. Like saving mouse clicks or making a multi-step workflow more intuitive and streamlined.
But making previous features or UI elements less discoverable because you want them to appear only when needed (and who decides when I need something out of the way? Maybe I like to see it all the time) — that’s not progress. It’s change for change’s sake. It’s rearranging the shelves in your supermarket in a way that seems cool and marketable to you but leaves your customers baffled and bewildered.
I love that a little further on, he refers to Windows as "mastodontic." What a great word!
Microsoft may leave entire layers of legacy code in Windows, turning Windows into a mastodontic operating system with a clean surface and decades of baggage underneath. Apple has been cleaning and rearranging the surface for a while now, and has been getting rid of so much baggage that they went to the other extreme. They’ve thrown the baby out with the bathwater, and Mac OS’s user interface has become more brittle after all the changes and inconsistent applications of those Human Interface Guidelines that have informed good UI design in Apple software for so long.
This act of ‘reinventing the wheel over and over’ has been incredibly stifling and has, in my opinion, largely led to operating system stagnation. Roughly since Mac OS X 10.7 Lion onward, Mac OS has gained a few cool features, but it has been losing entire apps and services, and certain facilities — like Disk Utility — have been dumbed down. Meanwhile the system hasn’t really gone anywhere.
An operating system is something that shouldn’t be treated as an ‘app’, or as something people should stop and admire for its æsthetic elegance, or a product whose updates should be marketed as if it’s the next iPhone iteration. An operating system is something that needs a separate, tailored development cycle. Something that needs time so that you can devise an evolution plan about it; so that you can keep working on its robustness by correcting bugs that have been unaddressed for years, and present features that really improve workflows and productivity while building organically on what came before. This way, user-facing UI changes will look reasonable, predictable, intuitive, easily assimilable, and not just arbitrary, cosmetic, and of questionable usefulness.
Here's a reminder that, when confronted with a terrible, tiny-text page like this, you can click the Reading View button in most browsers to make it show the text at a better size! or just hit cmd + = a few times on a Mac to make stuff bigger!
But rather than question whether Poe’s findings are verified by the facts, shouldn’t we rather ask whether he manages to elicit in his readers the feeling of surprise that accompanies discovery? In literature, proximity to discovered facts is far less important than adherence to the internal laws of discovery itself. In other words, it’s a question of forming a hypothesis and then seeing to what extent you can erect a new system of rules, utterly different from the existing rules of our everyday lives.
Maybe what we call the everyday is just thought without hypotheses. Or rather hypotheses exist, but they cling so stubbornly to phenomenal reality that they have already lost their function. When a fresh hypothesis is brought in, the everyday is suddenly destabilized and begins to take on strange new forms. It becomes activated, objectified, and our consciousness is roughly shaken.
Ghost Stories that do not Believe in Ghosts. If I am correct above, then the “s” of sf need not stand for science. Not only would semi-science do just as well, but one could use anything that made an effective hypothesis, even something without any appearance of the scientific. The observation I made earlier—that the term science fiction today is used to include horror and fantasy—is not something due simply to the fledgling state of the genre. Rather, it is because of the fundamental nature of science fiction that this conflation occurs.
Actually, it is said that Poe’s initial motive for writing these science fiction-like works was a desire to ironize or parody the taste for the grotesque that was sweeping society at the time. From their very inception, then, science fiction and horror shared a common lineage.
Nonetheless, science fiction and horror are not the same thing. The difference, obviously, lies in whether the monster is simply a monster, or whether it represents a hypothesis with which to plumb reality. For example, Poe’s creatures are certainly hypothetical beings, but that quality of hypothesis is diminished in the monsters of E.T.A. Hoffmann. So while we can refer to Poe as a pioneer in science fiction, it is rather more difficult to call Hoffmann a science fiction author. Perhaps one could say that a ghost story writer produces ghost stories that believe in their own ghosts, while the science fiction author writes ghost stories with no such belief.
I'm into this framing!
Then there's a whole section about Frankenstein that's great. Just a couple of excerpts:
Thanks to the movies, Frankenstein has achieved the rank of a horror superstar, but in the films he never moves beyond the status of a horror to believe in—in other words a ghost story monster rather than a hypothetical one. The screen versions are less science fiction movies than horror movies. [... In the book,] this horrible monster is actually nothing less than a hypothesis for plumbing the depths of human love and solitude. Unlike the films, this is straight science fiction. If movies don’t rapidly develop to the point where they can treat the monster hypothetically, there will never be a science fiction movie in the true sense of the term.
And then the second interview is very applicable 60 years later, during the Disneyfication of the fantastic in film, and the franchising of compulsory, meaningless heroism in games:
There is a passage somewhere—in a Thomas Mann novel, I think—to the effect that long ago, before a lion was called a lion, it was a supernatural being to be feared like a demon; but once it had been given the name “lion,” it became just another wild animal that could be overcome by humans. There is no question that unknown objects are much more disturbing than those we know; they also have a far greater potential energy. An unknown monster X in the forest is far more frightening than the familiar lions on view at just about every zoo in the country. This is the common ground of the mystery, science fiction, and ghost-story genres. Although each is written in a different way, all are firmly rooted in the same quest for things unknown, the same world of enigma.
I suspect, however, that science fiction, which used to stand for challenging the unknown, has recently come to side instead with the ready-named. The mysterious fascination and vigor that it should be the explorer’s privilege to seek has rather taken on the air of a circus lion.
The question “What is literature?” is impossible to answer, and for that very reason—like the definition of mankind, or being, or the world—it is able to be an eternal question.
We need the same degree of modesty when it comes to science fiction. It is high time that people stopped throwing tantrums like so many spoiled children the moment anyone portrays the genre in a way different from their own conceptions of it.
What use is it inoculating readers against sf or putting it on a leash? But turning sf into a performing lion means just that. By the same token, it is no wonder that sf, as the number one monster-hunter among the genres of literature, is a bigger monster than its quarry. And that is why I am among those who value it as the very literature among literatures. When I look at the state of Japanese literature, a dressed-up herbivore fawning on weak-kneed pseudo-lion tamers called critics, my hopes rest all the more fervently on the monstrousness of science fiction.
And then, particularly for Noam:
Seeing that our sf faction, through regular contact with aliens, has accumulated a wealth of experience with patterns of invasion, I suggest that we use our expertise to infiltrate. Since my strategy is secret, I am unable to disclose any operational details, but I am sure that simply writing this will inspire all the readers on our side to plenty of merry tactics: seeking out new dimensions, shape-shifting, time travel, and operations involving metamorphosis, uncertainty, parody, etc.
Indeed, we may have gone into action already. The monster sf is not to be gauged by common sense, and it may already have taken up the fight without our knowing it. Surely, then, it is all part of a plan that the science fiction boom never actually progressed beyond fandom, or that SF Magazine has still not achieved the biggest circulation in Japan. Yes, as long as science fiction continues to be unnameable, we need not give up hope.
And as an addendum, here's a page Ryan showed me from the afterword to A Wizard of Earthsea.
I love this. And all of this circles around to a growing frustration I have with games that insist on using tired, medieval settings — whether set in the past, future, or another world, almost all games manage to be medieval, if you know what I mean!! — to tell empty stories about power grounded in nothing, simply to facilitate comfort(?) or approachability(?) or simply an extreme shortage of imagination????? I'm absolutely exhausted by it, and trying my best to come up with other paths for games that can still be mechanically compelling or even familiar, while refusing to simply inhabit any of the shells littering the beachhead of culture.
Arthur Koestler on how bad Behaviorist psychology is
I've been dipping in and out of a bunch of books recently! Last night took me into Koestler's The Ghost in the Machine from 1967. I've read bits of a few of Koestler's books; I don't know much about his intentions or broader perception, but I tend to like his casual tone while challenging widely-held conventions.
Here are a few bits I highlighted while reading, as he goes through how strange it is that the dominant form of psychological research in the early 20th century was dedicated to ignoring consciousness. It's still relevant-feeling now because of the widespread application of Pavlovian/Behaviorist ideas and mechanisms throughout our world, particularly in software/game development and their deliberate lean into psychological manipulation.
By far the most powerful school in academic psychology, which at the same time determined the climate in all other sciences of life, was, and still is, a pseudoscience called Behaviourism. Its doctrines have invaded psychology like a virus which first causes convulsions, then slowly paralyses the victim.
On the strength of this doctrine, the Behaviourists proceeded to purge psychology of all 'intangibles and unapproachables'. The terms 'consciousness', 'mind', 'imagination' and 'purpose', together with a score of others, were declared to be unscientific, treated as dirty words, and banned from the vocabulary. In Watson's own words, the Behaviourist must exclude 'from his scientific vocabulary all subjective terms such as sensation, perception, image, desire, purpose, and even thinking and emotion as they were subjectively defined'.
Psychology used to be defined in dictionaries as the science of the mind; Behaviourism did away with the concept of mind and put in its place the conditioned-reflex chain.
In his standard work Science and Human Behaviour the hopeful student of psychology is firmly told from the very outset that 'mind' and 'ideas' are non-existent entities, 'invented for the sole purpose of providing spurious explanations. . . . Since mental or psychic events are asserted to lack the dimensions of physical science, we have an additional reason for rejecting them'. By the same logic, the physicist may, of course, reject the existence of radio waves, because they are propagated through a so-called 'field' which lacks the properties of ordinary physical media. In fact, few of the theories and concepts of modern physics would survive an ideological purge on Behaviourist principles — for the simple reason that the scientific outlook of Behaviourism is modelled on the mechanistic physics of the nineteenth century.
The attempt to reduce the complex activities of humanity to the hypothetical 'atoms of behaviour' found in lower mammals produced next to nothing that is relevant — just as the chemical analysis of bricks and mortar will tell you next to nothing about the architecture of a building. Yet throughout the dark ages of psychology most of the work done in the laboratories consisted of analysing bricks and mortar in the hope that by patient effort somehow one day it would tell you what a cathedral looked like.
The unique attributes of humanity, verbal communication and written records, science, art, and so forth, are considered to differ only in degree, not in kind, from the learning achievements of the lower animals — once more epitomised, for Hull as for Skinner, in the bar-pressing activities of the rat. Pavlov counted the number of drops which his dogs salivated through their artificial fistulae, and distilled them into a philosophy of man; Professors Skinner, Hull and their followers took an equally heroic short cut from the rat in the box to the human condition.
Skinner did not intend to write a parody.* He meant it seriously.
Both are engaged in question-begging on a heroic scale, apparently driven by an almost fanatical urge to deny, at all costs, the existence of properties which account for the humanity of the human and the rattiness of the rat.
[...] the crude slot-machine model, in its modernised, more sophisticated versions, has had a profounder influence on them — and on our whole culture — than they realise. It has permeated our attitudes to philosophy, social science, education, psychiatry. Even orthodoxy recognises today the limitations and shortcomings of Pavlov's experiments; but in the imagination of the masses, the dog on the laboratory table, predictably salivating at the sound of a gong, has become a paradigm of existence, a kind of anti-Promethean myth; and the word 'conditioning', with its rigid deterministic connotations, has become a key-formula for explaining why we are what we are, and for explaining away moral responsibility.
At first its intention was merely to exclude consciousness, images and other non-public phenomena as objects of study from the field of psychology; but later on this came to imply that the excluded phenomena did not exist.
This is one of the most frustrating things to me about the reductive scientific worldview, which persists here almost 60 years later. Thinking exists, it happens, you're doing it right now. What it is, exactly, is up for discussion, but it's trash semi-intellectualizing to assert that the sum of human (and other animal!) consciousness can be reduced to on-off electricity in a salt barrier, as one smarmier-than-thou scientist asserted to me a few years ago. Those things might be observable, but they don't add up to anything approaching the whole!
Werner Heisenberg, one of the greatest living physical scientists, has laconically declared: 'Nature is unpredictable'; it seems rather absurd to deny the living organism even that degree of unpredictability which quantum physics accords to inanimate nature.
It is impossible to arrive at a diagnosis of humanity's predicament — and by implication at a therapy — by starting from a psychology which denies the existence of mind, and lives on specious analogies derived from the bar-pressing activities of rats. The record of fifty years of ratomorphic psychology is comparable in its sterile pedantry to that of scholasticism in its period of decline, when it had fallen to counting angels on pin-heads — although this sounds a more attractive pastime than counting the number of bar-pressings in the box.
And finally, a side note as he acknowledges he's made some of these points in earlier books:
It is embarrassing to have to repeat, over and again, that two half-truths do not make a truth, and two half-cultures do not make a culture.