— Berserk, Chapter 5
i haaaaaaate that every icon has the same silhouette now!!!! I CANNOT LOCATE APPS IN THE DOCK OR SWITCHER ANYMORE BY SHAPE-BASED INTUITION AND IT MAKES ME VERY SAD / wastes my time and disorients me constantly
remember when “iconic” meant that something was instantly recognizable?
I really appreciate this piece by Riccardo Mori entitled, "Habits, UI changes, and OS stagnation". He works through the arguments around how to tell whether something feels bad just because it's new, or because it's truly worse in some way. It can be a hard thing to put your finger on, especially quickly! But in light of the ongoing cascade of bad design decisions coming out of Apple right now, I really like his rubric for changes to macOS specifically.
The argument “Is this really bad UI, or is it just you who are averse to change?” will never go away, huh? A change in a user interface can be disruptive, but it’s usually easy to see if it’s disruptive-beneficial or disruptive-confusing or ‑frustrating after a while.
You can see when change brings more thoughtfully-designed UI details. Saying that “You just need some time to get used to it” is in itself indicative that the new UI is problematic. You can completely redesign an app, but if the new UI is well-designed, people will figure it out.
When change ultimately brings UI rearrangement for UI rearrangement’s sake, then you just offer something that is user-hostile. Changing habits can be healthy if it brings improvement.
If users have a poor reaction to having to relearn your non-intuitive changes just because you felt the need to ‘refresh’ your app, it doesn’t mean people are lazy or change-averse. It means they’re annoyed at your lack of respect for their productivity and their time.
"Lack of respect" is precisely what I feel when looking at Apple's software over the last 5+ years!!! I also fully agree with these points:
The two major things I find especially misguided about Mac OS are:
- The fact that Apple considers it a product that needs to look cool and be shown off, instead of a utility that runs computers.
- The fact that Apple feels the need to release a new version of it every year.
Followed by a nice summary of the goals of OS X at its release, emphasizing its usefulness, not its marketability. Then he describes nicely the feeling of being subjected to useless or impeding changes in tools you use every day, with my emphases in bold:
This insistence around the most superficial aspects of a graphical user interface — the look — often reminds me of the constant redesign iterations of some third-party apps in an attempt to make them more alluring to customers and to increase sales. The hyperfocus on always looking new and fresh can sometimes lead to harsh breaks in an app’s ‘usability continuum’ (as I like to call it). I’m sure you’ve experienced it more than once if you have been using Mac and iOS apps for the past several years. The developer triumphantly announces the ‘significant visual overhaul’ in the app’s changelog, and after the (often inescapable) app update you are presented with something that has changed so much, its controls completely rearranged, that it becomes unrecognisable and essentially forces you to relearn how to use the app as proficiently as before.
Both for work reasons and for personal research, I’ve had a lot of experience dealing with regular, non-tech-savvy users over the years. What some geeks may be shocked to know is that most regular people don’t really care about these changes in the way an application or operating system looks. What matters to them is continuity and reliability. Again, this isn’t being change-averse. Regular users typically welcome change if it brings something interesting to the table and, most of all, if it improves functionality in meaningful ways. Like saving mouse clicks or making a multi-step workflow more intuitive and streamlined.
But making previous features or UI elements less discoverable because you want them to appear only when needed (and who decides when I need something out of the way? Maybe I like to see it all the time) — that’s not progress. It’s change for change’s sake. It’s rearranging the shelves in your supermarket in a way that seems cool and marketable to you but leaves your customers baffled and bewildered.
I love that a little further on, he refers to Windows as "mastodontic." What a great word!
Microsoft may leave entire layers of legacy code in Windows, turning Windows into a mastodontic operating system with a clean surface and decades of baggage underneath. Apple has been cleaning and rearranging the surface for a while now, and has been getting rid of so much baggage that they went to the other extreme. They’ve thrown the baby out with the bathwater, and Mac OS’s user interface has become more brittle after all the changes and inconsistent applications of those Human Interface Guidelines that have informed good UI design in Apple software for so long.
This act of ‘reinventing the wheel over and over’ has been incredibly stifling and has, in my opinion, largely led to operating system stagnation. Roughly since Mac OS X 10.7 Lion onward, Mac OS has gained a few cool features, but it has been losing entire apps and services, and certain facilities — like Disk Utility — have been dumbed down. Meanwhile the system hasn’t really gone anywhere.
An operating system is something that shouldn’t be treated as an ‘app’, or as something people should stop and admire for its æsthetic elegance, or a product whose updates should be marketed as if it’s the next iPhone iteration. An operating system is something that needs a separate, tailored development cycle. Something that needs time so that you can devise an evolution plan about it; so that you can keep working on its robustness by correcting bugs that have been unaddressed for years, and present features that really improve workflows and productivity while building organically on what came before. This way, user-facing UI changes will look reasonable, predictable, intuitive, easily assimilable, and not just arbitrary, cosmetic, and of questionable usefulness.
do you think, when mario wakes up and needs to p-block in the middle of the night, he makes yoshi carry him?
Here are a few quotes from the aforementioned essays by Kôbô Abe about sci-fi's relationship to facts and the joy of discovery.
Here's a reminder that, when confronted with a terrible, tiny-text page like this, you can click Reading View in most browsers to show the text at a better size! Or just hit cmd + = a few times on a Mac to make stuff bigger!
But rather than question whether Poe’s findings are verified by the facts, shouldn’t we rather ask whether he manages to elicit in his readers the feeling of surprise that accompanies discovery? In literature, proximity to discovered facts is far less important than adherence to the internal laws of discovery itself. In other words, it’s a question of forming a hypothesis and then seeing to what extent you can erect a new system of rules, utterly different from the existing rules of our everyday lives.
Maybe what we call the everyday is just thought without hypotheses. Or rather hypotheses exist, but they cling so stubbornly to phenomenal reality that they have already lost their function. When a fresh hypothesis is brought in, the everyday is suddenly destabilized and begins to take on strange new forms. It becomes activated, objectified, and our consciousness is roughly shaken.
Ghost Stories that do not Believe in Ghosts. If I am correct above, then the “s” of sf need not stand for science. Not only would semi-science do just as well, but one could use anything that made an effective hypothesis, even something without any appearance of the scientific. The observation I made earlier—that the term science fiction today is used to include horror and fantasy—is not something due simply to the fledgling state of the genre. Rather, it is because of the fundamental nature of science fiction that this conflation occurs.
Actually, it is said that Poe’s initial motive for writing these science fiction-like works was a desire to ironize or parody the taste for the grotesque that was sweeping society at the time. From their very inception, then, science fiction and horror shared a common lineage.
Nonetheless, science fiction and horror are not the same thing. The difference, obviously, lies in whether the monster is simply a monster, or whether it represents a hypothesis with which to plumb reality. For example, Poe’s creatures are certainly hypothetical beings, but that quality of hypothesis is diminished in the monsters of E.T.A. Hoffmann. So while we can refer to Poe as a pioneer in science fiction, it is rather more difficult to call Hoffmann a science fiction author. Perhaps one could say that a ghost story writer produces ghost stories that believe in their own ghosts, while the science fiction author writes ghost stories with no such belief.
I'm into this framing!
Then there's a whole section about Frankenstein that's great. Just a couple of excerpts:
Thanks to the movies, Frankenstein has achieved the rank of a horror superstar, but in the films he never moves beyond the status of a horror to believe in—in other words a ghost story monster rather than a hypothetical one. The screen versions are less science fiction movies than horror movies. [... In the book,] this horrible monster is actually nothing less than a hypothesis for plumbing the depths of human love and solitude. Unlike the films, this is straight science fiction. If movies don’t rapidly develop to the point where they can treat the monster hypothetically, there will never be a science fiction movie in the true sense of the term.
And then the second interview is very applicable 60 years later, during the Disneyfication of the fantastic in film, and the franchising of compulsory, meaningless heroism in games:
There is a passage somewhere—in a Thomas Mann novel, I think—to the effect that long ago, before a lion was called a lion, it was a supernatural being to be feared like a demon; but once it had been given the name “lion,” it became just another wild animal that could be overcome by humans. There is no question that unknown objects are much more disturbing than those we know; they also have a far greater potential energy. An unknown monster X in the forest is far more frightening than the familiar lions on view at just about every zoo in the country. This is the common ground of the mystery, science fiction, and ghost-story genres. Although each is written in a different way, all are firmly rooted in the same quest for things unknown, the same world of enigma.
I suspect, however, that science fiction, which used to stand for challenging the unknown, has recently come to side instead with the ready-named. The mysterious fascination and vigor that it should be the explorer’s privilege to seek has rather taken on the air of a circus lion.
The question “What is literature?” is impossible to answer, and for that very reason—like the definition of mankind, or being, or the world—it is able to be an eternal question.
We need the same degree of modesty when it comes to science fiction. It is high time that people stopped throwing tantrums like so many spoiled children the moment anyone portrays the genre in a way different from their own conceptions of it.
What use is it inoculating readers against sf or putting it on a leash? But turning sf into a performing lion means just that. By the same token, it is no wonder that sf, as the number one monster-hunter among the genres of literature, is a bigger monster than its quarry. And that is why I am among those who value it as the very literature among literatures. When I look at the state of Japanese literature, a dressed-up herbivore fawning on weak-kneed pseudo-lion tamers called critics, my hopes rest all the more fervently on the monstrousness of science fiction.
And then, particularly for Noam:
Seeing that our sf faction, through regular contact with aliens, has accumulated a wealth of experience with patterns of invasion, I suggest that we use our expertise to infiltrate. Since my strategy is secret, I am unable to disclose any operational details, but I am sure that simply writing this will inspire all the readers on our side to plenty of merry tactics: seeking out new dimensions, shape-shifting, time travel, and operations involving metamorphosis, uncertainty, parody, etc.
Indeed, we may have gone into action already. The monster sf is not to be gauged by common sense, and it may already have taken up the fight without our knowing it. Surely, then, it is all part of a plan that the science fiction boom never actually progressed beyond fandom, or that SF Magazine has still not achieved the biggest circulation in Japan. Yes, as long as science fiction continues to be unnameable, we need not give up hope.
And as an addendum, here's a page Ryan showed me from the afterword to A Wizard of Earthsea.
I love this. And all of this circles around to a growing frustration I have with games that insist on using tired, medieval settings — whether set in the past, future, or another world, almost all games manage to be medieval, if you know what I mean!! — to tell empty stories about power grounded in nothing, simply to facilitate comfort(?) or approachability(?) or simply an extreme shortage of imagination????? I'm absolutely exhausted by it, and trying my best to come up with other paths for games that can still be mechanically compelling or even familiar, while refusing to simply inhabit any of the shells littering the beachhead of culture.
And see also:
Haven't read this, but it was mentioned in this interesting interview with Kobo Abe. But look at this god damn cover:
I've been dipping in and out of a bunch of books recently! Last night took me into Koestler's The Ghost in the Machine from 1967. I've read bits of a few of Koestler's books; I don't know much about his intentions or broader perception, but I tend to like his casual tone while challenging widely-held conventions.
Here are a few bits I highlighted while reading, as he goes through how strange it is that the dominant form of psychological research in the early 20th century was dedicated to ignoring consciousness. It's still relevant-feeling now because of the widespread application of Pavlovian/Behaviorist ideas and mechanisms throughout our world, particularly in software/game development and their deliberate lean into psychological manipulation.
By far the most powerful school in academic psychology, which at the same time determined the climate in all other sciences of life, was, and still is, a pseudoscience called Behaviourism. Its doctrines have invaded psychology like a virus which first causes convulsions, then slowly paralyses the victim.
On the strength of this doctrine, the Behaviourists proceeded to purge psychology of all 'intangibles and unapproachables'. The terms 'consciousness', 'mind', 'imagination' and 'purpose', together with a score of others, were declared to be unscientific, treated as dirty words, and banned from the vocabulary. In Watson's own words, the Behaviourist must exclude 'from his scientific vocabulary all subjective terms such as sensation, perception, image, desire, purpose, and even thinking and emotion as they were subjectively defined'.
Psychology used to be defined in dictionaries as the science of the mind; Behaviourism did away with the concept of mind and put in its place the conditioned-reflex chain.
In his standard work Science and Human Behaviour the hopeful student of psychology is firmly told from the very outset that 'mind' and 'ideas' are non-existent entities, 'invented for the sole purpose of providing spurious explanations. . . . Since mental or psychic events are asserted to lack the dimensions of physical science, we have an additional reason for rejecting them'.  By the same logic, the physicist may, of course, reject the existence of radio waves, because they are propagated through a so-called 'field' which lacks the properties of ordinary physical media. In fact, few of the theories and concepts of modern physics would survive an ideological purge on Behaviourist principles — for the simple reason that the scientific outlook of Behaviourism is modelled on the mechanistic physics of the nineteenth century.
The attempt to reduce the complex activities of humanity to the hypothetical 'atoms of behaviour' found in lower mammals produced next to nothing that is relevant — just as the chemical analysis of bricks and mortar will tell you next to nothing about the architecture of a building. Yet throughout the dark ages of psychology most of the work done in the laboratories consisted of analysing bricks and mortar in the hope that by patient effort somehow one day it would tell you what a cathedral looked like.
The unique attributes of humanity, verbal communication and written records, science, art, and so forth, are considered to differ only in degree, not in kind, from the learning achievements of the lower animals — once more epitomised, for Hull as for Skinner, in the bar-pressing activities of the rat. Pavlov counted the number of drops which his dogs salivated through their artificial fistulae, and distilled them into a philosophy of man; Professors Skinner, Hull and their followers took an equally heroic short cut from the rat in the box to the human condition.
Skinner did not intend to write a parody.* He means it seriously.
Both are engaged in question-begging on a heroic scale, apparently driven by an almost fanatical urge to deny, at all costs, the existence of properties which account for the humanity of the human and the rattiness of the rat.
[...] the crude slot-machine model, in its modernised, more sophisticated versions, has had a profounder influence on them — and on our whole culture — than they realise. It has permeated our attitudes to philosophy, social science, education, psychiatry. Even orthodoxy recognises today the limitations and shortcomings of Pavlov's experiments; but in the imagination of the masses, the dog on the laboratory table, predictably salivating at the sound of a gong, has become a paradigm of existence, a kind of anti-Promethean myth; and the word 'conditioning', with its rigid deterministic connotations, has become a key-formula for explaining why we are what we are, and for explaining away moral responsibility.
At first its intention was merely to exclude consciousness, images and other non-public phenomena as objects of study from the field of psychology; but later on this came to imply that the excluded phenomena did not exist.
This is one of the most frustrating things to me about the reductive scientific worldview, which persists here almost 60 years later. Thinking exists, it happens, you're doing it right now. What it is, exactly, is up for discussion, but it's trash semi-intellectualizing to assert that the sum of human (and other animal!) consciousness can be reduced to on-off electricity in a salt barrier, as one smarmier-than-thou scientist asserted to me a few years ago. Those things might be observable, but they don't add up to anything approaching the whole!
Werner Heisenberg, one of the greatest living physical scientists, has laconically declared: 'Nature is unpredictable'; it seems rather absurd to deny the living organism even that degree of unpredictability which quantum physics accords to inanimate nature.
It is impossible to arrive at a diagnosis of humanity's predicament — and by implication at a therapy — by starting from a psychology which denies the existence of mind, and lives on specious analogies derived from the bar-pressing activities of rats. The record of fifty years of ratomorphic psychology is comparable in its sterile pedantry to that of scholasticism in its period of decline, when it had fallen to counting angels on pin-heads — although this sounds a more attractive pastime than counting the number of bar-pressings in the box.
And finally, a side note as he acknowledges he's made some of these points in earlier books:
It is embarrassing to have to repeat, over and again, that two half-truths do not make a truth, and two half-cultures do not make a culture.
I just wanna write these down so that I can stop cursing them silently!
This is SO WEIRD, and frustrates me all the time. I have an iPhone and an iPad, and I read books on both. Let's say I'm on page 20 when I start reading on my phone. Around page 40, I lock the phone or switch to another app without tapping the "back arrow" in the upper left, because why would I do that? So, at some point, whether a few minutes or hours later, if I pick up the iPad and open the book there, what page should I be on? Invariably, no matter how much time has passed, the iPad thinks I'm still on page 20. Even worse: it now sets the "synced position" to page 20, because that's the most recent page iCloud has seen me reading!
This happens especially when I've been reading the same book in both places, because the text is "open" in both. But... it's such a tiny bit of metadata. When I pick up the iPad with the book already "open," why in hell doesn't it check the server and move to page 40? If I manually close the book before I put down the iPad (why would I do this), and manually close the book before I put down my iPhone (again, this isn't how these devices are used), then the next time I open the book on either device, it will open to the most recent page.
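The fix seems so simple that it's worth sketching what "check the server" could look like. This is a minimal, hypothetical sketch — not how Apple's Books sync actually works — assuming a last-write-wins model where each device stamps its reading position with the time of the last page turn; `ReadingPosition` and `reconcile` are made-up names for illustration:

```python
from dataclasses import dataclass

@dataclass
class ReadingPosition:
    page: int
    updated_at: float  # Unix timestamp of the last page turn on that device

def reconcile(local: ReadingPosition, remote: ReadingPosition) -> ReadingPosition:
    """On book open (or app foreground), adopt whichever position is newer,
    instead of blindly trusting the locally cached one."""
    return local if local.updated_at >= remote.updated_at else remote

# The bug described above: the iPad holds page 20 from hours ago, while
# iCloud has page 40 from the phone. Comparing timestamps picks page 40.
ipad = ReadingPosition(page=20, updated_at=1000.0)
cloud = ReadingPosition(page=40, updated_at=2000.0)
assert reconcile(ipad, cloud).page == 40
```

The point is that the comparison keys off *when* each position was recorded, not which device most recently opened the book — which is exactly the distinction the current behavior seems to miss.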
Anyway! This frustrates me!
I can start dragging and then move down the right side of the screen to highlight all the lines in between... except for the last line of a paragraph. It just... doesn't highlight. If I move a bit farther down, it highlights to the end of the next paragraph's first line. Wha....? So then I remember that I have to move my finger to the letters at the end of the paragraph, and that having my finger past those characters means they don't get highlighted. I just... what the fuck? I'm pretty sure it used to work the sensible way.
I hold the controversial opinion that software should be developed by people who actually use that software, which increasingly doesn't feel like the case with Apple's apps. Further, it should incorporate external feedback and research, which also doesn't seem to be in style over there. Even further, the same people should be working on the same app for more than 9-12 months, so their experience can be carried into the next round of iterations. Reports from inside Apple suggest this is hardly ever the case; engineers get shuffled around to different projects every year or so. Which helps explain why their apps get totally rebuilt with fewer features every couple of years!
Hhhhhhhhhhngh! These are two very small examples of a thousand other instances of this kind of, "nobody took the time to understand how this should really work," all the way across Apple's software. Not to mention all the ways the software is just straight up broken, or being redesigned to do less and take up more visual space. I remain impressed with their hardware, but my Apple software frustration level is really getting dire.
I'm not familiar with the publication and am not a Christian, but I saw this paragraph linked on kottke.org and appreciate its stark framing:
Americans are, of course, the most thoroughly and passively indoctrinated people on earth. They know next to nothing as a rule about their own history, or the histories of other nations, or the histories of the various social movements that have risen and fallen in the past, and they certainly know little or nothing of the complexities and contradictions comprised within words like “socialism” and “capitalism.” Chiefly, what they have been trained not to know or even suspect is that, in many ways, they enjoy far fewer freedoms, and suffer under a more intrusive centralized state, than do the citizens of countries with more vigorous social-democratic institutions. This is at once the most comic and most tragic aspect of the excitable alarm that talk of social democracy or democratic socialism can elicit on these shores. An enormous number of Americans have been persuaded to believe that they are freer in the abstract than, say, Germans or Danes precisely because they possess far fewer freedoms in the concrete. They are far more vulnerable to medical and financial crisis, far more likely to receive inadequate health coverage, far more prone to irreparable insolvency, far more unprotected against predatory creditors, far more subject to income inequality, and so forth, while effectively paying more in tax (when one figures in federal, state, local, and sales taxes, and then compounds those by all the expenditures that in this country, as almost nowhere else, their taxes do not cover). One might think that a people who once rebelled against the mightiest empire on earth on the principle of no taxation without representation would not meekly accept taxation without adequate government services. But we accept what we have become used to, I suppose. 
Even so, one has to ask, what state apparatus in the “free” world could be more powerful and tyrannical than the one that taxes its citizens while providing no substantial civic benefits in return, solely in order to enrich a piratically overinflated military-industrial complex and to ease the tax burdens of the immensely wealthy?
Our cruel, inefficient, and monstrously expensive health system makes this obvious.
And there are some great passages addressing the incredibly-obtuse ways people treat the word "socialism":
Moreover, just because a totalitarian regime happens to call itself socialist—or, for that matter, a republic, or a union of republics, or a people’s republic, or a people’s democratic republic—we are under no obligation to take it at its word. What we call “democratic socialism” in the United States is difficult to distinguish from the social-democratic traditions of post-war Western Europe, and there we find little evidence that a democracy becomes a dictatorship simply by providing such staples of basic social welfare as universal health care. At least, it is hard not to notice that the social-democratic governments of Europe have always gained power only by being voted into office, and have always relinquished it peacefully when voted out again. None of them has ever made war on free markets, even in attempting (often all too hesitantly) to impose prudent and ethically salutary regulations on business. Rather than gulags, death camps, secret police, arrests without warrant, summary executions, enormous propaganda machines, killing fields, and the like, their political achievements have been more in the line of the milk-allowances given to British children in the post-war years, various national health services, free eyeglasses and orthodonture for children, school lunches, public pensions for the elderly and the disabled, humane public housing, adequate unemployment insurance, sane labor protections, and so forth, all of which have been accomplished without irreparable harm to economies or treasuries.
I suppose a social-democratic state could begin to gravitate toward true authoritarianism, in the way that any political arrangement can lead to just about any other. The Third Reich, after all, was born out of a functioning parliamentary democracy. The 2016 U.S. election proved that, even in a long-established democratic republic, just about anyone or anything, no matter how preposterously foul, can achieve political power if enough citizens are sufficiently credulous, cowardly, and vicious. In just the past few years, we have seen bland American neoconservatism rapidly evolving into populist, racist, openly fascist, mystical nationalism. Anything is possible.
All this being true, the classical social democrat or democratic socialist might be forgiven for thinking that Americans are curiously deluded regarding their own supposed inalienable liberties. He or she might contend, at any rate, that a state that uses its power chiefly to dilute consumer and environmental protections in the interests of large corporations and private investors, while withholding even the most basic civil goods that taxpayers have a right to expect (such as a well-maintained infrastructure or decent public transport), is no smaller—and certainly no less invasive and dictatorial—than one that is actually obliged by the popular will and the social contract to deliver services in exchange for the taxes it collects. He or she might think that a government whose engorged military budget is squandered on wasteful (because profitable) redundancy, but whose public services are minimal at best, presides over a far more controlled economy—and a far more coercive redistribution of wealth—than does a government forced to return public funds to its citizens in the forms of substantial civic benefits. He or she might even have the temerity to see social democracy, properly practiced, not as an enlargement of the state’s prerogatives, but quite the opposite: a democratic seizure of power from both state and corporate entities, as well as a greater democratic control over public policy, taxation, production, and trade.
After all, though we often speak as if the centralized state and corporate “free” enterprise were antagonists, they are in fact mutually sustaining. [...] Without the support of an omnicompetent, vastly prosperous, orderly, and violent state, global corporate capitalism could not thrive. Without corporations, the modern state would lack the resources necessary to perpetuate its supremacy over every sphere of life.
Finally, again as someone raised "Baptist" but firmly opposed to American Christianity in all its forms, I agree wholeheartedly with this assessment:
Contrary to conventional wisdom, Christianity has never really taken deep root in America or had any success in forming American consciousness; in its place, we have invented a kind of Orphic mystery religion of personal liberation, fecundated and sustained by a cult of Mammon.
The full article: Three Cheers for Socialism
The Dear Hunter — Wait
"I stood in lines to bow my head. I'd fold my hands and speak in tongues, to whisper worries to the dead. But I could tell no apparition heard a single word I said. But I'd still call my fear in to the air... then I said, wait."