Authors, Artists and the Internet

Because it’s easy to become overwhelmed by tech minutiae, particularly if you hail from the arts, I thought it might be useful to step back from the discussion of SEO in the previous post and consider the internet in a broader context. If you’re not into technology, most tech-speak probably sounds like gibberish, but you probably also have faith that it all makes sense to someone somewhere. If the internet is a mystery to you as an artist or author, you trust that the smart, wonderful, benevolent people who created it in order to help you reach both your intended audience and your creative potential really do understand what it’s all about.

The internet is an amazing creation, and has come to dominate our lives in an amazingly short amount of time. Backed by hundreds of billions of dollars in investment, infrastructure and advertising, the internet is clearly the place to be, at least according to the internet. Beyond making a lot of people rich, however, the internet as a method of communication has democratized conversations that were previously controlled by self-interested if not bigoted gatekeepers, meaning voices that were perpetually overlooked or muted can now be heard on issues of critical importance. In every way the internet imitates life, and at times even imitates art.

The problem with that feel-good appraisal is that it ignores another fundamental truth about the internet, which is that it is completely insane. And in saying that I do not mean the internet is exasperating or wildly avant-garde, nor am I being hyperbolic or pejorative. Rather, I mean it as a cold, clinical appraisal. If you are an author or artist the maze of technologies driving the internet may make it hard to perceive the systemic dysfunction emanating from your screen (though the phrase virtual reality is itself a shrill clue), but you are in fact better positioned than most to understand it. All you need to do is recast your conception of the internet in familiar terms.

If you’re a writer, think of the internet as having been authored by Joseph Heller or Kurt Vonnegut. If you’re an artist, think of the internet as a work by Salvador Dalí or René Magritte. Which is to say that the internet is not simply the sum of its technologies and techniques, but a construct, space, and experience informed and distorted by human perception and imagination. [ Read more ]


Authors, Artists and SEO

Over the past month or so I’ve been catching up on a lot of old to-do items, including a year’s worth of site maintenance that I kept putting off until I switched ISPs, which, oddly enough, I kept putting off until a couple of months ago. Anyway, while researching something or other I ran across a truly useful article on the subject of search engine optimization (SEO), which had the refreshing candor to acknowledge that SEO advocates speak gibberish:

As you sit down with your new SEO consultant it starts out well, but soon he says “We’ll need to implement a good 301 redirect plan so that you don’t lose organic rankings and traffic.” Then he says something about title tags, which you’ve heard of although you’re not quite sure exactly what they are or what they do, or why it’s important to update them as your consultant is recommending, although it all sounds good. Then he starts using other jargon like “indexing,” “link equity,” and “canonicalization,” and with every word you feel your grasp on reality slipping and the need to take a nap.
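For what it’s worth, a couple of those terms are less mysterious than they sound. Below is a minimal sketch of a 301 redirect and a title tag, written in Python with the Flask framework purely as an illustration — the framework and the URLs are my own hypothetical choices, not anything the article prescribes, and not something most artists or authors will ever need to set up themselves.

```python
# Hypothetical sketch only: Flask and the URLs below are illustrative
# choices, not a recommendation from the article.
from flask import Flask, redirect

app = Flask(__name__)

# A "301 redirect" tells browsers and search engines that a page has
# moved permanently, so existing links and rankings follow it to the
# new address instead of dead-ending.
@app.route("/old-gallery")
def old_gallery():
    return redirect("/gallery", code=301)

# A "title tag" is simply the <title> element in a page's HTML -- the
# text that shows up in the browser tab and in search results.
@app.route("/gallery")
def gallery():
    return "<html><head><title>Paintings by Jane Doe</title></head><body>...</body></html>"
```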

The full piece contains an excellent glossary of terms that come up again and again in SEO articles. Unfortunately, the very fact that SEO is such an enigma confuses the question of how sole proprietors — particularly independent artists and authors — might best make use of SEO without themselves becoming confused or lapsing into gibberish.

So here’s the truth about SEO if you’re an artist. For the most part SEO is not something you need to be concerned about. Whatever time you might put into SEO, or whatever time you might convert into money in order to pay someone else to worry about SEO, can usually be more profitably spent creating whatever it is that you create. [ Read more ]


Google is the New Microsoft

Two days ago I went to log into Gmail and found that the login screen I had been using for, what — an entire decade? — was suddenly behaving differently. Now, as a longtime web user I’ve been taught that any time something seems phishy I should make sure that what I’m seeing is actually what it purports to be. That is in fact the lesson all large web companies preach — be vigilant!

The problem, of course, is that the sophistication of the criminals perpetrating such deceptions keeps growing, to the point that almost anything seems possible. How do I know that someone hasn’t figured out a way to show me the appropriate URL, then redirect my traffic or keystrokes to a hostile server? I mean, I’m a reasonably sophisticated web user, but that only means I’m that much more aware of what I don’t know.

As it turns out, the change to my Gmail login ritual was not only initiated by Google, it was rolled out on the sly without, ironically, so much as an email that a change was coming. Meaning I had to get on the internet to find out that other users around the country and around the world were being confronted by that same autocratic change before I knew it was safe to log into my Gmail account.

Somewhere in the high-tech bowels of Google a group of very highly paid people got together and decided that they would roll out a new login scheme which requires twice as many clicks as the old scheme, that they would do so without giving notice to anyone who used that scheme, and that they would give no reason for doing so. That is exactly how the world ended up with Windows 8, and a whole host of other Microsoft initiatives to win market share and own technology spaces in complete disregard for its customers.

I suspect that the Gmail change has something to do with Google’s recognition that the world is going mobile, but the real story here is the contempt with which Google views its users. That is in fact the signature moment in any tech company’s life cycle — the one where current users are considered to be, at best, nothing more than a population to be exploited, and at worst, a hindrance to corporate goals that have completely diverged from the products and services being offered and utilized.

In terms of righteous indignation this barely qualifies as a 2, so I’m not suggesting anyone leave Gmail, but simply that you take a step back and get your mind around the contempt that any company would have to have in order to suddenly change the portal to your email account. Because those are the same people who have said they are not reading your hosted emails, or personally identifying your web traffic, or doing anything else you wouldn’t want them to do because they promised they wouldn’t be evil.

Update: It occurred to me last night that the new Gmail requirement that users click through two separate screens in order to log in, instead of only one as before, may have been initiated as a means of encouraging people to stay logged in all the time. While it presents as an initial annoyance, once users gave in and complied it would strengthen Google’s brand association with email products and the user’s reliance on same, preventing people from migrating to other platforms for chat and video, etc. The downside, obviously, is that it would actually make Gmail accounts significantly less secure if an always-logged-in device fell into the wrong hands.

— Mark Barrett


The Best Monitor / Display for Text

A few weeks ago, just before my keyboard died, my monitor momentarily flickered ever so subtly between displaying white as full white and white as soft pink. It happened so quickly, and the change was so faint, that at first I thought my eyes were playing tricks on me. Fortunately, a day or two later the same thing happened, allowing me to determine that the monitor itself was hinky.

While I have no qualms about opening up a keyboard to see if I can rectify a problem, or just about any other gadget you could name, I draw the line at messing around inside devices that contain potentially lethal capacitors. Combine that reluctance with the flickering I had seen, and the low, staticky hum that had been building in my monitor for the past year or two, and it suddenly seemed prudent to once again peruse the state-of-the-art display offerings on the market before the very device I would need to rely on to do so failed completely.

(There are all kinds of things that can go wrong with a computer, and the most maddening aspect of all of them is that those issues immediately make it impossible to access the internet, which is where all the solutions are. If your operating system locks up you need to access another computer to research the problem. If your monitor dies you need to have another display on hand in order to order a replacement, which you would not need if you already had one on hand. Speaking of which, even if you use an add-on graphics card, the motherboard in your computer should always have its own graphics chip for exactly that reason. If your card dies — and graphics cards are always dying, or freaking out — you can still drive your monitor and access the web.)

As was the case with my venerable old keyboard, I was not at all surprised that my monitor might be at the end of its useful life. In fact — and you will no doubt find this amusing or absurd — I am still using a second-hand CRT that I bought in the mid-aughts for the lofty price of twenty-five dollars. While that in itself is comical, the real scream is that the monitor was manufactured in 2001, meaning it’s close to fifteen years old. Yet until a couple of weeks ago it had been working flawlessly all that time.

The monitor is a 19″ Viewsonic A90, and I can’t say I’ve ever had a single complaint about it. It replaced my beloved old Sony Trinitron G400, which borked one day without the slightest hint that something might be amiss. Scrambling to get myself back in freelance mode I scanned Craigslist and found a used monitor that would allow me to limp along until I found a better permanent solution. Eight or so years later here I am, still using the same A90. (In internet time, of course, those eight years are more like eight hundred. Not only were LCDs, and later LEDs, pricey back then and quite raw in terms of performance, but you could also reasonably expect to use Craigslist without being murdered.)

Between then and now I have kept track of changes in the price, size, functionality and technology of flat-panel monitors, and more than once researched display ratings with the thought that I might join the twenty-first century. Each time, however, three issues kept me from pulling the trigger.

First, while all that snazzy new technology was indeed snazzy and new, relative to CRT technology it was still immature, requiring compromises I was not willing to make in terms of display quality and potential effects on my eyes. Having always been sensitive to flickering monitors, I was not eager to throw money at a problem I did not have — or worse, buy myself a problem I did not want. (As a general rule, putting off any tech purchase as long as possible pays off twice, because what you end up with later is almost always better and cheaper than what you can purchase today.)

Second, at the time I was primarily freelancing in the interactive industry, which meant I was working with a lot of beta-version software that had not been fully tested with every conceivable display technology. Using lagging tech at both the graphics and display level meant I could be reasonably confident that whatever I was working on would draw to my screen, at least sufficiently to allow me to do my part.

Third — and this relates somewhat to the second point — one advantage CRTs had and still have over LCD/LED displays is that they do not have a native resolution:

The native resolution of a LCD, LCoS or other flat panel display refers to its single fixed resolution. As an LCD display consists of a fixed raster, it cannot change resolution to match the signal being displayed as a CRT monitor can, meaning that optimal display quality can be reached only when the signal input matches the native resolution.

Whether you run a CRT at 1024×768 or 1600×1200 you’re going to get pretty much the same image quality, albeit at different scales. The fact that I could switch my A90 to any resolution was a boon while working in the games biz, because I could adjust my monitor to fit whatever was best for any game while still preserving the detail and clarity of whatever documents I was working on.

While imagery is and always has been the lusty focus of monitor reviews, there is almost nothing more difficult to clearly render using pixels of light than the sharply delineated, high-contrast symbols we call text. Because LCD/LED monitors have a native resolution, attempting to scale text (or anything else) introduces another problem:

While CRT monitors can usually display images at various resolutions, an LCD monitor has to rely on interpolation (scaling of the image), which causes a loss of image quality. An LCD has to scale up a smaller image to fit into the area of the native resolution. This is the same principle as taking a smaller image in an image editing program and enlarging it; the smaller image loses its sharpness when it is expanded.

The key word there is interpolation. If you run your LCD/LED at anything other than its native resolution what you see on your screen will almost inevitably be less sharp. While that may not matter when you’re watching a DVD or playing a game, interpolating text is one of the more difficult things to do well. Particularly in early flat panels the degradation from interpolation was considerable, making anything other than the native resolution ill-suited for word processing. [ Read more ]
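To make the tradeoff concrete, here is a minimal sketch of what a flat panel’s scaler is up against, using Python and a recent version of the Pillow imaging library. The file names and sizes are hypothetical — the point is only the comparison between resampling filters, not the internals of any particular monitor.

```python
# Hypothetical sketch: "text.png" stands in for a screenshot of text
# rendered at a non-native resolution, which the panel must stretch to
# fit its fixed pixel grid.
from PIL import Image

img = Image.open("text.png")          # e.g. a 1024x768 capture of text
native = (1600, 1200)                 # the panel's fixed (native) grid

# Nearest-neighbor scaling keeps hard edges but makes glyphs blocky and
# unevenly weighted.
img.resize(native, Image.Resampling.NEAREST).save("text_nearest.png")

# Smoother filters -- closer to what real scalers do -- blur the sharp,
# high-contrast edges that make text legible.
img.resize(native, Image.Resampling.BILINEAR).save("text_bilinear.png")

# Either way, the interpolated result is softer than text drawn directly
# at the native resolution, which is why the mismatch matters more for
# word processing than for movies or games.
```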


Requiem for a Keyboard

A week or so ago I was bashing away at my keyboard when I suddenly began producing g’s that looked like this — ‘g — and h’s that looked like this: -h. Having spent a fair amount of time working with computers I knew there were various reasons why my keyboard might suddenly become possessed, none of them even remotely exciting. Because those multi-character glitches were crippling my ability to type, however, I quickly set about diagnosing the cause.

Occasionally, when my mind is firing faster than my fingers can move, I’ll accidentally enter a key combination that performs some automated feat I did not intend to perform. Worse, because I don’t know which keys I hit in which order, I won’t know if some preexisting configuration was altered that will trip me up later. If I’m lucky the result will be something obvious, as happens from time to time with Google Mail, which seems to delight in auto-sending messages before I’m through with them. And of course there’s the perpetually irritating StickyKeys feature, which, as far as I know, exists only to remind me that my left pinky is loitering on the shift key while I’m thinking about what I want to type.

Because there are so many default key combinations that come with any word processing software, to say nothing of the keyboard software itself, and because it’s possible to invoke such macros by accident — including changing the output of specific keys — my first diagnostic act was to uninstall both IntelliType and the keyboard driver to see if that solved the problem. Which it did not.

My next concern — which grew rapidly — was that the errant behavior I was seeing was the result of a virus or malware. After running scans for both, however, my machine was deemed clean, which meant, almost certainly, that I was having a hardware problem.

My keyboard of choice is the Microsoft Natural Ergonomic Keyboard 4000 (MNEK4K), which also has, I believe, the longest name in the history of keyboards. The model flaking out on my desk was the most recent incarnation of the same split-board device I had been using for close to twenty years, since just after the MNEK4K debuted. It was at least six or seven years old, had seen regular use almost daily over that time, and given how loose and clicky the keys had become I wasn’t particularly surprised that it might have reached the end of its useful life. (Conservatively, total key presses on that board easily topped ten million, with the bulk taking place on the most-used letter keys. Speaking of which, years ago the A, S, D and arrow keys lost their labels due to repeated use, and the M key only displayed the upper-left corner of that letter.)

As with any keyboard, from time to time a key had gotten stuck, so that was my first thought in terms of mechanical failures. Close inspection turned up nothing obvious, however, so my next idea was that six or seven years worth of wayward hairs, lint and food debris might be causing trouble I could not see. And that meant I would have to open up the board.

Now, if you’re not used to taking things apart, the idea of opening up a keyboard may seem fraught with risk. Fortunately, I had two things going for me. First, I’ve taken a lot of things apart over the years, from computer tech to automobile engines, so I have some familiarity with the procedures and practices involved. Second, my keyboard had been out of warranty for years, meaning I could hardly make things worse. If I didn’t open it up I would have to buy a new one, and if I broke it by opening it I would have to buy a new one, so there was literally nothing to lose other than time and a little DIY dignity. [ Read more ]


Graphics and Interactive Storytelling

In the mid-nineties I became fascinated by the storytelling potential of interactive entertainment. My interest peaked in the early aughts, during what I now think of as the second great wave of interactive storytelling mania. While the potential of interactive storytelling seems obvious to everyone, the mechanisms — the actual techniques — by which interactive stories might be told are complex and at times counterintuitive.

After finding my way into the interactive industry and meeting with some professional success, I was asked in 2000 to write an article for SIGGRAPH’s Computer Graphics magazine about the future of interactive storytelling. While great effort was being put into replicating techniques from passive mediums, including, particularly, film, it seemed to me that such an imitative approach had everything exactly backwards.

Recently, while conducting periodic maintenance on my computer and sprucing up Ditchwalk, I ran across that article, which for some reason I had never gotten around to adding to the Docs page on this site. That omission now stands corrected.

The title of the article is Graphics — the Language of Interactive Storytelling. Coming from someone who primarily made a living with words, that may seem odd, but the title and the accompanying text go to the heart of the interactive storytelling problem, and why so little progress has been made. In fact, the only thing that’s changed is that we no longer worry about having enough processing power to do what we want — yet today’s enviably high hardware ceiling is still rarely used to facilitate aspects of interaction that might truly drive emotional involvement.

Fifteen years on, during the fourth great wave of interactive storytelling mania now taking place in the industry, little has changed. Another generation of eager developers is grappling with the same questions, reaching the same inherently limiting conclusions, attempting to once again adapt non-interactive techniques from passive mediums, and confusing the revelation of pre-designed outcomes with choices that determine outcomes.

— Mark Barrett

Site Seeing: Daniel Menaker

One additional nugget I managed to recover while fixing broken links was a post on the Barnes & Noble site, written by Daniel Menaker. Who is Daniel Menaker? Well, at the time I knew almost nothing about him, to the point that I described him — hilariously in retrospect — as “another dirt-dishing voice” in the publishing industry. (Saving me somewhat, I also noted that he was a former Editor-in-Chief at Random House and Fiction Editor at The New Yorker.)

Re-reading the B&N post after five years, however, I found myself more curious about Mr. Menaker than about publishing. A quick search led me to a memoir he’d written, titled My Mistake, which was published in 2013. Interestingly, in reading that book I found that the context of Mr. Menaker’s life gave more weight to the views he expressed in the B&N post, as well as those in that book and in other writings I discovered.

Now, it may be that confirmation bias played a part in my reaction because much of what Mr. Menaker had to say jibed with my own conclusions, but I don’t think that’s the case. Not only do I think he would disagree with some of my grousing here on Ditchwalk, but my interest in understanding the publishing industry has decreased so much in the past five years that I now consider such questions moot at best. (For example, five years ago I would have deemed this story important. Today it seems meaningless.)

Still, corroboration is useful when you’re assessing any human endeavor as an outsider, to say nothing of doing so from the relative orbit of, say, Neptune. In reading My Mistake I found a fair bit of corroboration for conclusions I’d previously reached, yet after I finished the book I also decided to see what others had to say, as a hedge against my own potential bias. That impetus quickly led to this review in The New York Times, which caused me to stare agape at my screen as I read what seemed to be a bizarro-world take on the same text I’d just digested:

Make no mistake, this is an angry book. Menaker is angry at himself for his character flaws (a flippant one-upmanship that alienates others), and he is thin-skinned, remembering every slight. As a former executive editor in chief of Random House, he is proud to have nurtured writers who went on to win literary acclaim (the Pulitzer Prize winner Elizabeth Strout, the National Book Award winner Colum McCann). Menaker is understandably upset over being ousted from that job in 2007, but what seems to truly infuriate him is being shunned by the publisher, Gina Centrello, during a transition period.

I honestly don’t know what that reviewer is talking about. My Mistake is not an angry book, unless your definition of anger includes expressing an opinion. And no, Mr. Menaker is not infuriated about being shunned by anyone — or at least not anyone in the publishing biz. If anything, he’s infuriated by his own serial incapacity to connect with other human beings in his life, though over time — and particularly in the writing and structure of My Mistake — I think he belatedly squares things with his departed father.

Then again, that’s the publishing industry in a nutshell. You can spend a year or two writing a book, yet when it’s reviewed — in this case, by no less than the self-anointed consensus cultural steward of commercial literary criticism — you can still end up being cleaved by a reviewer with an axe to grind, or mischaracterized because of a reviewer’s blind spots or personal acidity. (If you also worked in publishing for a time you might even be the recipient of some score settling.)

[ Read more ]


Site Seeing: Laura Resnick

Speaking of reclaiming busted links, one benefit I didn’t anticipate was that chasing down lost pages put me back in touch with information and sources I previously found valuable. For example, while I was ultimately thwarted in my ability to recover an excellent post by Laura Resnick concerning cover design, digging around on the web for that missing content led to two informative discoveries.

First, I eventually found what I think is a more recent discussion of the same subject here. (The first link at the bottom of that interview is the same busted link I was trying to track down.) Second, when I went to Laura’s new site I found a great resource page that every independent author should bookmark and peruse.

Sure, the fact that I don’t have a resources page suddenly makes me look very bad in comparison, but that’s all the more reason to visit Laura’s site and check it out.

— Mark Barrett

2015 Iowa Poetry Writing MOOC

Several of the busted links I recently splinted together involved last year’s poetry and fiction MOOCs from the University of Iowa. In tracking down those errant U of I pages — or at least the most recent placeholders — I ran across mention of this year’s free offerings.

Fortuitously, the 2015 version of How Writers Write Poetry just opened for registration on April 13th, after an initial delay. Registration will close on June 1st.

Having weathered an avalanche of entrepreneurial hyperbole about MOOCs coming from Silicon Valley and its academic proxies, I think it is a good sign that the University of Iowa is continuing to make these courses available. The bottom line with any MOOC — as with anything else in life — is that you’re only going to get out of it what you put into it. For people in far-flung locations around the globe, however, having access to such experiences could be life changing.

— Mark Barrett


Link Rot Postmortem

Ugh. So here’s what I learned while banishing broken links…

* Broken Link Checker isn’t completely intuitive, but as of this date it’s current and supported. If you have a question you may not get the answer you’re looking for, but you’ll get an answer, and you’ll profit from it.

* Over the past year the BLC plugin reported (via email) in fits and starts for reasons I did not understand. In reading up and poking around, however, I discovered a ‘server load’ setting which seems to act like a throttle. If you set it very low — meaning lower than the reported server load — you effectively idle BLC until the server load drops. Or at least I think that’s what happens. In any case, when I raised the number above the reported load, BLC sprang into action, so if you’re not getting activity when you expect it I would check that setting. (Also, if you’re on shared hosting, consider changing that setting at night when the server load is low. BLC may run much faster.)

* When you’re working on each individual broken link, going slowly and searching for missing pages on the web can be surprisingly fruitful. I fixed quite a few dead links where the missing page’s URL had been altered without a redirect. Once located, copying and pasting the current, active link in place of the broken link solved the problem. (If you’re curious what checking a link actually involves, there’s a rough sketch after this list.)

* I initially decided to deal with a minimum of twenty-five links each day, but the first day was tough. As it turned out, however, much of the struggle was due to the fact that I had no process or workflow to follow, so what I was really fighting was the learning curve, not the task. On the second day I probably fixed or killed fifty or so links, then the following day I finished off the remaining sixty or so, meaning it took me three days to get through my backlog. (If you’re in the weeds like I was, or worse, just make link-fixing a chore that you come back to again and again until it’s done.) [ Read more ]
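For anyone wondering what a plugin like BLC is actually doing under the hood, here is a rough sketch of the idea in Python — not BLC’s own code, just an illustration with made-up URLs (in practice the plugin harvests the links from your posts for you).

```python
# Illustrative only: these URLs are placeholders, and this is not how
# Broken Link Checker itself is implemented.
import requests

urls = [
    "https://example.com/old-post",
    "https://example.com/moved-page",
]

for url in urls:
    try:
        # Follow redirects so a page that merely moved reports its new
        # address. (Some servers reject HEAD requests; swap in
        # requests.get if that happens.)
        resp = requests.head(url, allow_redirects=True, timeout=10)
        if resp.status_code >= 400:
            print(f"BROKEN ({resp.status_code}): {url}")
        elif resp.url.rstrip("/") != url.rstrip("/"):
            # The content still exists but the URL changed -- the case
            # fixed by pasting the current link over the broken one.
            print(f"MOVED: {url} -> {resp.url}")
        else:
            print(f"OK: {url}")
    except requests.RequestException as exc:
        print(f"NO RESPONSE: {url} ({exc})")
```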