Color Science

Consider a woman, Mary, who is color blind. She is a neuroscientist, and knows absolutely everything there is to know about color, all the wavelengths and all the effects of light on the eye, etc. There is nothing she doesn’t know about it. But she has never, herself, experienced it. One day, a surgical procedure is developed, allowing her to see color for the first time. Does Mary gain any new knowledge?

So is formulated one of the modern classic philosophical arguments, and one that, honestly, is built on such a false premise that it’s difficult to believe it’s gained as much traction as it has. Rather than go through all the logical argumentation, I’ll simply rewrite the analogy, and the flaws should become glaringly obvious.

Consider a young man, Gary, who is a virgin. He is eighteen years old, and knows absolutely everything there is to know about sex. Thanks to the internet, there is nothing he does not know about it. But he, himself, has never actually had sex. One day, he goes on a date with a woman, and after dinner and a movie, they make love. Does Gary gain any new knowledge?

Obviously, Gary could not have full and complete knowledge of sex. Any claim as such is nonsense. There is no refutation of physicalism, because he never had the knowledge in the first place; he does not have all the physical knowledge, so his gain of new knowledge is nothing significant. He could imagine it, he could try to synthesize what it would be like, he could try to create a simulacrum via masturbation, etc., but any claim to absolute knowledge of the subject is necessarily false.


A hoax that got out of hand

GET FUCKING VACCINATED YOU IDIOTS

I was angry with Alan Sokal back in 1996. He missed the point, to say the least, and, subsequently, did a great job providing everyone who didn’t want to think with an easy piece of shorthand to gesture at when they wanted to dismiss entire schools of thought without engaging with them (see also: Dawkins, Richard re: religion, whose blistering insights about the Problem of Evil and the Teleological Argument had only been debated for the ~1500 years before he was born; a spectacular case of getting everything right but missing the point completely).

And so, in the grand tradition of academia, I decided to get back at him. And how better than to beat him at his own game? A hoax of my own. It would take time, planning, effort. It would require coordination with someone just unscrupulous enough to want to see a proud man taken down a peg. And we wouldn’t go after some podunk little journal with no readers and even less renown. No, we’d hit one of the big-time journals, one of the unassailable, unimpeachable bastions of integrity that represent the industry as a whole. The kind that lay people have heard of.

So, my friend and partner-in-crime Andy and I started cooking up a paper that, clearly, had to be bullshit. There was no way anyone could read it and think that we were serious, that we had found a correlation. We stacked on as much garbage as we could, putting in as many spurious claims as we could link, each trying to outdo the other in a marathon session that involved many bottles of wine and many scratch pads, trying to come up with the most ridiculous theory possible. It would be the “Naked Came the Stranger” of science publications.

“Autistic Enterocolitis” was the name we settled on.

We then stapled together some photos of sad kids and their upset parents, cribbed together some sentences about MMR from other papers we found lying around Andy’s office, copied the index out of an old Vladivostok telephone directory, and sent it off.

For the first year, nothing. If you don’t know what waiting for peer review is like, it’s an agonizing period of hurry up and wait. You are scrutinized by an anonymous jackass whose main concerns are advancing their own career and making sure that they’ve been cited enough times in the bibliography that they feel like a part of the intellectual community. You will be given this person’s lowest possible priority, your paper lost underneath some student work, the novel they started last year, a coffee-ringed calendar with numerous other “important” events they are blowing off, and the three complimentary textbooks they’re considering reviewing for next semester. But, if you’re lucky, some intern will knock the stack over when they rush in with multiple vials of blood that they’ve mislabeled and are hoping (praying?) that there’s some test that can be performed to identify the people they belong to so that they don’t have to do 18 more draws from even more donors in order to complete the experiment, and your (now bloodstained) paper will end up on top when the student finishes picking everything up through tears and goes running down the hall, wondering why they thought they could ever do anything right. Then you get a glance through to check that their name is somewhere in there, and a few notes to make it look like they did something. (“Nice use of commas, too many semi-colons. One is enough per paper to make the author look like s/he knows English grammar. Could have written more about mathematical diabetes, but otherwise acceptable for publication.”)

Imagine our surprise when, on 28 February 1998, The Lancet published “Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children”.

We were flabbergasted. Dumbstruck. It worked! We jumped up and down like teenage girls who’d just been asked to prom, hands clasped, the pictures on the walls of Andy’s apartment shaking each time we landed. How could they have taken the bait? Vaccines cause autism, which presents itself as a bowel disease we invented whole cloth? Who’d believe that garbage? And a sample size of only 12, a third of whom hadn’t presented autistic symptoms? We’d done it. All that was left was to call Social Text and let them in on the joke. They deserved it, after all they’d been through. They could drop the story and we’d let it spread like wildfire. We figured we’d let it stew for a week or two, then announce that we’d fabricated the whole thing — just enough time for the praise and adulation to start rolling in, but not enough that people would start actually acting on it. We didn’t want people to get hurt, after all. I was just wishing I could be there when Alan got the news that he’d been punked back.

But then, a funny thing happened. Not funny “ha ha”. More, funny “sad clown is going to hang himself and is on his way to the store to buy rope, and slips on a banana peel and falls off a bridge to his death”.

Somehow, people believed us. People really believed us. The newspapers and the TV didn’t bother to read the paper, they just ran with it. Those sorts of people emerged. Granola people. “I’m a Christian and a Mother and I Vote” people. Survivalists. Christian Scientists. The people who think the fluoride in the water contains mind control drugs. And these people convinced other people. It spread like a virus, like one of Dawkins’ memes, through the populace. It was too late. We’d let it loose, and there was no way to get it back into the cage.

And then Andy got weird; he started believing it too. We had an angry phone call one night, and we haven’t spoken since. He now denies that he ever knew me. It hurt, losing a close friend like that. That was 17 years ago. But he has famous friends now. Jenny McCarthy loves him. So does Charlie Sheen. Alicia Silverstone. Donald Trump. One of the Kennedy kids. What use would he have for me?

It spiraled and snowballed, growing worse and worse. More and more people got in on the hoax. Old diseases came back, and came back with a vengeance. Children were disfigured. Babies born with horrific defects. Corpses piled up, needlessly. The Lancet finally retracted the paper, but the djinn was out of the bottle, the monkey’s paw had already closed one of its sinister fingers. I never wanted to hurt anyone. I never wanted to cause the extinction of the human race. I just wanted to rib back someone who’d ribbed us.

Gotcha, Alan. Ha ha?

Audio Mixing

How a song is mixed, and the levels at which the different instruments are recorded, edited, and balanced, produce different effects.
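
Mechanically, a mix is little more than a gain-weighted sum of the recorded tracks; EQ, panning, and compression are refinements on top of that. Here is a minimal sketch of the idea in Python; the signals and gain values below are invented for illustration, not taken from any real session:

```python
import numpy as np

def mix(stems, gains):
    """Scale each stem by its gain, sum into one track, and guard against clipping."""
    out = sum(g * s for g, s in zip(gains, stems))
    peak = np.max(np.abs(out))
    return out / peak if peak > 1.0 else out

# Two invented one-second mono "stems" at 44.1 kHz: a bass-like sine
# and a percussive, conga-like decaying tone.
sr = 44100
t = np.linspace(0, 1, sr, endpoint=False)
bass = np.sin(2 * np.pi * 55 * t)
congas = np.sin(2 * np.pi * 220 * t) * np.exp(-8 * t)

# Same material, different levels: one blurry, one clear.
muddy = mix([bass, congas], [0.9, 0.9])  # everything loud, parts blend together
clear = mix([bass, congas], [0.8, 0.5])  # levels carved out, each part audible
```

The point of the sketch is only that “levels” are arithmetic: change the gains and the very same performances either blur together or separate.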

Observe:

Studio version, with rather muddy and otherwise flat mixing. The song is understandable, but so much of the detail is lost. The instruments blend together, the volumes on many things are wrong, and many of the subtleties are lost. Did you hear David Byrne humming near the end? How many different drummers are there? Can you hear Tina Weymouth’s bass clearly throughout?

You probably won’t notice it until you hear a differently mixed version, so now listen to this one:

From the first few seconds, you can hear the difference in Abdou M’Boup’s congas and Tina Weymouth’s bass. They’re clearer. It’s like you’re in an empty room with them. They haven’t been cut down and flattened out. And, mind, you’re probably listening to this through a pair of laptop speakers or cheap headphones. You don’t need a fancy system or expensive gear to hear the difference.

It’s different from a live recording, though. A bit more polished and pure. Live shows have their own energy and vibe. Like so:

(Sadly, Talking Heads broke up shortly after Naked was released, so there are only recordings of David Byrne playing the song, not the full band.)

It’s not as perfect, because you only have one shot to get it right. But that’s alright too. You aren’t going to a live show to see perfection, at least not with this style of music. The performer coming out, holding up a CD player, and hitting “play” would be a statement of its own, but I doubt it would please the audience much. Just as important is seeing the band enjoy itself, or not enjoy itself, depending on the music — some people want KISS or Bowie style spectacle, others want a gas station attendant looking at his shoes the entire time, and a lot of folks want somewhere in between.

V.

“A phrase (it often happened when he was exhausted) kept cycling round and round, preconsciously, just under the threshold of lip and tongue movement: “Events seem to be ordered into an ominous logic.” It repeated itself automatically and Stencil improved on it each time, placing emphasis on different words—“events seem”; “seem to be ordered”; “ominous logic”—pronouncing them differently, changing the “tone of voice” from sepulchral to jaunty: round and round and round. Events seem to be ordered into an ominous logic.”


“A schlemihl is a schlemihl. What can you “make” out of one? What can one make out of himself? You reach a point, and Profane knew he had reached it, where you know how much you can and cannot do. But every now and again he got attacks of acute optimism.”

[cover: 1963 advance copy]

“Some of us are afraid of dying; others of human loneliness. Profane was afraid of land or seascapes like this, where nothing else lived but himself.”

[cover: 1990 Perennial paperback]

“For that moment at least they seemed to give up external plans, theories, and codes, even the inescapable romantic curiosity about one another, to indulge in being simply and purely young, to share that sense of the world’s affliction, that outgoing sorrow at the spectacle of Our Human Condition which anyone this age regards as reward or gratuity for having survived adolescence.”


“Time of course has showed the question up in all its young illogic. We can justify any apologia simply by calling life a successive rejection of personalities. No apologia is any more than a romance—half a fiction—in which all the successive identities taken on and rejected by the writer as a function of linear time are treated as separate characters. The writing itself even constitutes another rejection, another “character” added to the past. So we do sell our souls: paying them away to history in little installments. It isn’t so much to pay for eyes clear enough to see past the fiction of continuity, the fiction of cause and effect, the fiction of a humanized history endowed with “reason.””


“It takes, unhappily, no more than a desk and writing supplies to turn any room into a confessional. This may have nothing to do with the acts we have committed, or the humors we do go in and out of. It may be only the room–a cube–having no persuasive powers of its own. The room simply is. To occupy it, and find a metaphor there for memory, is our own fault.”


“What of Thought? The Crew had developed a kind of shorthand whereby they could set forth any visions that might come their way. Conversations at the Spoon had become little more than proper nouns, literary allusions, critical or philosophical terms linked in certain ways. Depending on how you arranged the building blocks at your disposal, you were smart or stupid. Depending on how others reacted they were In or Out. The number of blocks, however, was finite.
“Mathematically, boy,” he told himself, “if nobody else original comes along, they’re bound to run out of arrangements someday. What then?” What indeed. This sort of arranging and rearranging was Decadence, but the exhaustion of all possible permutations and combinations was death.”


“Could we have been so much in the midst of life? With such a sense of grand adventure about it all?”


“Life’s single lesson: that there is more accident to it than a man can ever admit to in a lifetime and stay sane.”

 

Intellectual Laziness

Something that has existed since, well, probably forever (even though this paragraph originally started ‘Something that’s become more and more of a problem…’, it’s almost certainly been around forever), is the problem of intellectual laziness.

If you’ve only glanced at a complicated topic, something that people have doctorates in, have written long books about, have done extensive research in, etc. etc., you probably don’t understand it very well, and any criticism you make of it is going to be rather surface level, merely questioning a few of the basic assumptions made by the field of study, as though said bases have never been questioned before.

For example:

  • Why do people think God exists when something would have had to make God, and also Evil exists?
  • Math has no use in the real world, so why am I bothering to learn this?
  • We should get the government out of things, because all those regulations do is make it harder for people.
  • That’s not art, it’s just a bunch of crap thrown on a canvas. My kid could do that.
  • Postmodernism is just a bunch of gibberish.
  • Postmodernism is just a bunch of really simple ideas dressed up in fancy terminology.
  • Science has been wrong before, so why should I trust it now?

I could go on, but you get the idea. It seems to go in three stages:

  1. A negative gut reaction to whatever is being presented.
  2. A refusal to actually engage with the material, which might provide evidence counter to the gut reaction.
  3. Repetitions of the same tired criticisms that everyone else makes, especially dismissal of anyone who cares enough to really be invested in “that crap”.

Odds are, if your criticisms can be found in the first two links of a Google search for Anti-[whatever], and those aren’t from .edu sites, or are being shouted on YouTube by a man with an ill-chosen pseudonym, you haven’t engaged deeply enough.

(A parable: a fellow was writing a story about a leprechaun, and doing some research into the origins of the mythical figure — what they represented, why they endured as symbols of Irish culture and heritage, how their depictions had changed over time, how the stories told about them gave differing moral lessons. While out drinking one night, a friend of his, one of those folks prone to outbursts and moods, yelled “Your work is shit. You believe in nothing. Leprechauns don’t exist. I don’t need a degree in leprechaunology to know that!” Obviously, thought the fellow, but he didn’t bother saying anything out loud, because you know how those types can be once they’re in their cups.)

A better method, albeit one that requires effort and opens one up to actual criticism, is Dennett’s “Steelmanning”. Described in his book Intuition Pumps and Other Tools for Thinking, the Steelman is the opposite of a Strawman. You attempt to present the other person’s argument in the strongest terms possible, giving them the most charitable interpretations, making an actual case for them being correct, and demonstrating that you understand them completely. Then, and only then, do you begin any criticism.

Now this, of course, requires more than skimming the Wikipedia article on a given subject, and then making up what you think someone (some idiot?) might think about this (stupid) topic. Books are involved. Knowledge of the different schools of thought within a discipline. Replying to actual assertions, rather than simply the existence of what you assume the thing is.

A short example contrasting the two:

“John Searle is stupid. His stupid Chinese Room doesn’t prove anything about learning or AI, because a person isn’t a computer. Some guy with a bunch of books wouldn’t be able to translate Chinese as well as a super computer, and therefore the other person would notice, and the Turing test would fail. What an idiot.”

vs.

“Searle’s Chinese Room thought experiment, in which it is asserted that there is no difference between a computer interpreting commands and a person executing commands in a language they do not understand, is an interesting thought problem. The basic conclusion is that no artificial intelligence will be capable of contemplating itself, or understanding its own actions, just as the person manually executing the ‘program’ will not understand the language they are working in. There will be no ‘Strong AI’, to use Searle’s term. However, many objections have been raised to this analogy, and the one which I find the most compelling is that Searle’s conclusion (“Therefore there is no Strong AI”) does not follow from his premises. He assumes a dualism between Strong and Weak AI, and, because his experiment seems to demonstrate that there is not a Strong AI, he assumes it must be Weak. This does not follow. It merely proves that, in this particular instance, thought is not just computation. It does nothing to positively identify criteria for thought, nor to establish that computers are incapable of it. It does not prove that, simply because computational processes and their output can occur in the absence of a cognitive state, thought is not occurring in this instance. Is there any way for Searle to prove to me that he himself is thinking, and not simply interpreting and executing external commands which his unconscious interior does not actually understand? Even more generally, is there a difference between the real thing and a perfect simulacrum? That is a question far too broad for discussion here. Suffice it to say, despite its numerous flaws, the Chinese Room is an interesting thought problem that has entertained philosophers and AI researchers for years.”
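
For what it’s worth, the core of the scenario is easy to caricature in code. Here is a toy sketch (the rulebook entries are invented for illustration): a program that produces fluent-looking Chinese replies by pure symbol lookup, with nothing in the loop that could be said to understand them.

```python
# A toy Chinese Room: replies come from pure symbol lookup.
# The rulebook entries are invented for illustration; nothing in
# this system "understands" Chinese, yet the output looks fluent.
RULEBOOK = {
    "你好吗？": "我很好，谢谢。",    # "How are you?" -> "I'm fine, thanks."
    "你会思考吗？": "当然会。",      # "Can you think?" -> "Of course."
}

def operator(symbols: str) -> str:
    """The person in the room: match shapes, copy out the listed response."""
    return RULEBOOK.get(symbols, "请再说一遍。")  # "Please say that again."

print(operator("你会思考吗？"))  # a fluent answer, no comprehension anywhere
```

Searle’s wager is that making the rulebook astronomically larger changes nothing in kind; the objection sketched in the steelman above asks whether that conclusion actually follows.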

As much as I might disagree with John Searle, and find many of his ideas based on incorrect premises, I would never call him a stupid person, or think that he should stop writing. It is chiefly because he is such an intelligent person that he is capable of producing such brilliant (if wrong) things as the Chinese Room. And I assume he’s writing in good faith, because he’s a professor of philosophy at UC Berkeley.

Try it out next time you feel tempted to, say, claim that Islam is horrible because of the actions of a tiny minority of Wahhabists, or that Feminism is a cancerous political movement because of Andrea Dworkin rather than a multifaceted approach to cultural theory through which any number of subjects can be interpreted, or if you’re about to type “That’s Economics 101!” while not realizing that there is a 102, a 301, a 505, and other much more complicated classes that expand on and systematize the dumbed-down and simplified explanations given in 101 classes so that students aren’t overwhelmed and can grasp basic concepts (by analogy, they don’t cover friction when calculating motion in Physics 101. Does friction exist?).

Update: There will be further discussion of John Searle, rest assured. I have a lot more to say about Chinese Rooms, Limited Inc, and, well… a lot. We’ll get there.

Apocalypse and Revelation: the Televisualization of Movies

X-Men: Apocalypse is a hot mess of a film, with some lovely action sequences, some well done CGI, fairly good acting, good make-up and costuming, and an overstuffed plot that has a few too many twists and characters to make its nearly 2 1/2 hour run time feel worth it. Compounding the strangeness is that I haven’t seen Days of Future Past, nor First Class, and the older X-Men films are memories from a decade ago. This isn’t the fault of the film, which is explicitly billed as the third entry in the franchise, but which also makes a number of concessions to newer viewers via flashbacks and expository dialogue; it’s entirely my own.

There are some needless sequences. The one that stands out most prominently is the kidnapping by Colonel Stryker and the bit in the Weapon X facility, a half-hour detour that is eventually just a transparent excuse for a Hugh Jackman cameo. It’s a nice shout-out to Barry Windsor-Smith’s iconic run, and a naked Jackman is rarely a bad thing to have in a film, but it would have been just as easy to excise that entire part. Nicholas Hoult had already shown off his fancy new plane to Jennifer Lawrence. They could have gotten in that and flown straight to Cairo.

In fact, it was difficult to get a bead on who was supposed to be the protagonist. The only people with character arcs are Oscar Isaac (who comes back to hate the modern world, tries to destroy it, and fails), Michael Fassbender (who comes back to the public world after the death of his family, tries to destroy it, and has a change of heart), and Evan Peters (who comes out of hiding to look for his father, finds him, and then decides not to tell him the truth). Everyone else is either static, or only gets partway towards a change, without resolution. The movie itself is made with a sequel in mind.

And this is an interesting thing. It felt more like watching a few episodes of a television show spliced together, albeit one with a much, much larger budget than usual, than like a feature film. Detours like the Weapon X one would make sense if this were just an episode of a program. You wouldn’t need to keep cutting back to Oscar Isaac to remind you of why the characters need to hurry.

This makes perfect sense from a financial standpoint: movies are very expensive to make, so if one can get a franchise going, the odds of getting the next film financed are much better. A franchise brings an automatic draw at the box office, easier branding, easier promotion, etc. etc. However, it comes at the expense of an actually satisfying film experience. The questions that are posed by the film have to be interesting ones, and far too often, they simply aren’t. There’s a limited number of twists and turns that an audience will accept, and between the number of competing franchises and the internet’s obsessive theorizing about and analyzing of any given piece of media, the answers are inevitably unsatisfying.

Time was, you could leave things open-ended, and that was alright. The Maltese Falcon, for example, doesn’t delve deeply into Spade and Archer’s relationship. The film famously doesn’t even get into the real mystery of the falcon itself; such a thing is beside the point of the story. These are left to the imagination of the viewer. And yet it could easily have been developed into a franchise — in fact, there was an Adventures of Sam Spade radio serial that ran from 1946 to 1951. But constant callbacks and references to the past were not the point.

Such things make sense for the finale of a TV season. The ongoing subplots can be resolved, everything can be wrapped up, the villain who has been directing things can be defeated, and so on. One goes in nowadays knowing that it is the culmination of a build-up of 12 or 25 or however many previous episodes. But when the callbacks become the point, when the plot is an excuse to make references that the long-term fans will pick up on, then you’ve ensured that you will not be successful. You’ve turned your product into something insular and incestuous, doubly so if it is full of things that can only be found by becoming involved in the internet fandoms. Assuming that your viewers have seen the previous films in the franchise is acceptable. Assuming that they’ve seen them ten times is not.

Which winds us back to Apocalypse. I wasn’t lost at all, because I’ve read almost all the X-Men comics produced from 1963 to 1993. It was simple to say “Oh, that’s (so and so)” based on casual details, or “Oh, they’re doing (that plot)” based on things I recognized. And unlike some, I don’t mind seeing details change; if I wanted to see the same story again, I’d just fish my comics out of the longbox and reread them. I want something mixed up and served differently. Make it unfamiliar enough that I can’t guess exactly what’s going to happen next. Tell new stories with the old characters — I don’t care if it contradicts issue 213, where Wolverine missed Shadowcat’s birthday because he was held up in traffic, not because the subway was stopped due to a track malfunction. There is joy in recognition, but it’s much sweeter to not be able to predict where a story is going.

What some people want, judging from their reactions, is much closer to the original video animations (OVAs) released to tie in with popular manga in the 80s and 90s. The idea was that when a series reached a certain level of popularity, the company would commission two episodes of a cartoon, adapting two popular stories from the comic, and sell it direct to video at a terribly expensive mark-up; Japanese fan culture being what it is, they would sell well enough to make it worthwhile. Nowadays they simply adapt the entire series into a half or full season, with perhaps an OVA to serve as a capstone, but then things are different from how they were thirty years ago. I can understand the appeal of seeing the book in color and motion, with voice acting and sound effects. People enjoy different things. But is it too much to ask that some effort be put in as well?

“I am Always Late to the Party” by Donna Greenhauser

I’m not one of those people who are glued to book reviews and clamoring to read the latest thing, despite my profession. I prefer to give books a bit of time to age, to see whether one is just going to be a flash in the pan that no one will care about in a year or so (Water for Elephants, Sarah’s Key) or will actually enter the modern canon as worthy of the time it takes to really read a novel.

Because I’m not, nor have I ever been, one of those folks who can speed through page after page, skimming through the boring bits, glancing over descriptions, jumping ahead to the action. For better or for worse, I read every word.

This isn’t a moral stand or a judgement on those who can read faster than I can. I’d find it rather useful if I could hustle through a novel in a weekend, or knock off six chapters in an evening. And I’m not going to make some preposterous claim, like that reading slowly lets me enjoy novels on a deeper level or something. I just don’t read very fast. It’s something I’ve accepted.

It also means that, for me, while reading is a pleasurable and leisurely activity, it is also one that is undertaken with great care. Is whatever book I’m about to embark upon going to be more worth my time than Kant or Hegel? Woolf or Joyce? Pynchon or Wallace?

(It does seem to make reading certain philosophy easier, because I’m used to reading at a very slow pace. A companion of mine, by contrast, once flew into a rage because she couldn’t deal with the long sentences, but also couldn’t turn off her long-ingrained speed-reading. Another friend of mine used to fast-forward through parts of movies that she found boring, and then get angry at the films when she couldn’t understand what was happening (she was the quintessential person in the movie theatre: “Why did they kill that guy? I thought he was with them? He was with the bad guys? When did they say that?”), but that’s a separate problem.)

Which brings us around, the long way, to Greenhauser’s I am Always Late to the Party.

It’s a novel that came out a few years back, and one that I didn’t pay much attention to on its release, though a lot of folks seemed quite taken by it.

A quick plot summary: Esther, a woman in her late 20s, attempts to “rationalize” her life by making everything she possibly can completely optimal. She counts her steps, she notes how many times she chews each different type of food, she measures how long she needs to sleep given what activities she has performed each day, etc. etc. She figures that by doing this, she will save herself enough time and effort that she’ll have time to be happy; that the main source of her unhappiness comes from how busy she is, and that if she had the time to relax, she wouldn’t hate herself or the world around her. She has encounters with various folks, and there are minor plot lines running throughout the book, about her landlord trying to get with her sister, her boss dissolving the company because of mismanaged funds, her ex-boyfriend who lives down the street from her job trying to get back on his feet after their recent break-up, but the main thrust is Esther herself trying to solve the condition of her life by making more time.

She fails, as you might imagine.

But what struck me as brilliant about the book was not the set-up, nor the rather predictable ending where her best friend lets her know The Secret that life isn’t just a series of tasks to be performed, but something to be relished and enjoyed, and that if you spend all your time trying to make yourself happy, rather than finding happiness, you’ll never succeed, and all that… No, it’s that this section comes while there’s still a good third of the book left, and Esther’s reply is “Yeah, no shit. But I don’t have any money. My job is falling apart. I can’t just fuck off to India for two months, Siobhan,” which, needless to say, isn’t the reaction Siobhan was expecting.

Now, they don’t have a big breakdown shouting match or anything, which is another point I liked, because far too often female friendships are depicted as fragile or petty, and this honestly felt like a realistic relationship. Siobhan takes it in stride, and lets Esther complain some more. She’s a good friend. They go out drinking, and through the strange vicissitudes of fate, end up crashing a very fancy party hosted by Simon, who is an amalgamation of a number of business person stereotypes. Esther looks like Simon’s ex-girlfriend from behind, and he ends up shouting a ton of nasty things at her, which Esther initially takes as criticism of her crashing the party. But then he gets more personal, going on about parts of their “relationship”, and finishes by calling her Grace. Only after he’s made a fool of himself does she turn around and say “I think you meant to say that to someone else.”

Esther and Siobhan get back to Esther’s apartment, and Siobhan passes out in Esther’s bed. Esther tries sleeping on the couch, but finds that this has thrown her attempts to control her life completely off course. It will take weeks for her to get back on track. But when she controlled everything, she wasn’t happy. What was she doing with all that spare time? Trying to figure out ways to arrange for more spare time? And when she let herself go and didn’t care, at the end of the night, regardless of how good a time she had, she was still back in the same place. What was the point? Why bother with any of it? She goes to the top of her building, intending to jump, but finds that the entire roof has been encircled with fencing and safety nets to prevent this very thing. She laughs, “a sad, private laugh, the sort you’d imagine coming from a clown’s tent as he takes off his make-up after the night is over”, and goes back downstairs.

There’s some plot wrapping-up after that (Siobhan punches Judd, Esther’s ex, when he shows up the next morning; her landlord and her sister finally go on a date; her boss sells the company and Esther doesn’t lose her job; she meets Grace and learns what an ass Simon was during their relationship; etc.), but this is really where the novel ends in terms of character development and significant action.

It isn’t that she chooses not to commit suicide, it’s that suicide is made just inconvenient enough that she doesn’t bother making the effort. Her life isn’t good, per se, but she is forbidden from ending it easily. She must go on living, happy or not, unless she really doesn’t want to. And (this is why I’m glad the novel doesn’t end on the rooftop) the world doesn’t care whether she likes it or hates it. Life still moves on for other people.

One would hope that it will keep its place in the literary consciousness, but sadly, I’ve not seen a copy in bookstores since I bought mine.

So it goes, I guess.