The headline says it all! I finished my entry, polished it, had it beta-read, and sent it two weeks before the deadline. Last year, I barely finished a draft under the word limit by the deadline, let alone performed quality assurance. Further, I managed to do all this while only cutting two scenes—maybe 500 words. Last year, I cut 4,000 words—the length of my entire entry.
I may, at last, be getting the hang of this fiction-writing thing.
I’m surprised that this year, Daylight Saving Time (DST) is only a minor divot in my productivity. Most years it lands me on my butt for two weeks or more. Here’s what I think is different:
DST may be bad, but more daylight in general is good. I’m getting out in it, too.
I’m solidly on my keto diet, more solidly than I’ve been for three months.
I’m really focusing on using the tools I’ve found helpful and ditching those that aren’t.
I’ve found that revising a novel outline (or even creating one! but that’s another post…) is difficult, so I spent some time setting up a system to be sure that I can use my best tools.
Limiting how far I can wander when distracted. It’s way too easy for me to go crazy writing in the index card/synopsis area of Scrivener. I’ve abandoned projects before because I’d put Too. Much. Information. into the outline. Now that I’m using a variant of the outlining from Save the Cat! Writes a Novel (Jessica Brody), physical index cards are the suggested tool. The nice thing about physical index cards is that there’s a limit to how much I can put on one; the crummy thing is that getting them back into Scrivener is a pain. So I artificially limit the size of my Scrivener synopses. I feel better if each synopsis (beat) only contains the amount of info I can jam onto a 5″×3″ index card (11 lines of 43 old-fashioned typewritten characters each). Yes, I counted. Yes, I’m obsessive. Sue me.
Handwriting. Misery, thy name is “dictation,” but I find that typing is not ideal for my fiction, either. It works for my nonfiction, but my fiction works better if I handwrite. Everything. (Not on paper! I use a handwriting keyboard.) When I’m stuck, I get antsy and want to have a pencil in my hand. Today I spent some time setting up both my iPad and my iPhone to be effective handwriting editors for those 473 characters. I now have a smooth outlining workflow from Scrivener Mac to the iThings and back.
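That 11-line, 43-character card ceiling is easy to enforce mechanically. Here’s a minimal sketch (my own illustration, not a Scrivener feature) that checks whether a synopsis would still fit on one physical card:

```python
import textwrap

CARD_COLS = 43   # characters per typewritten line on a 5"x3" card
CARD_ROWS = 11   # lines per card

def fits_on_card(synopsis: str) -> bool:
    """Return True if the synopsis fits on one 5x3 index card."""
    lines = []
    for paragraph in synopsis.splitlines() or [""]:
        # Wrap each paragraph at the card's column width
        lines.extend(textwrap.wrap(paragraph, CARD_COLS) or [""])
    return len(lines) <= CARD_ROWS

print(fits_on_card("Hero meets mentor; refuses the call."))  # True
```

Anything that fails the check gets trimmed before it goes into the Scrivener synopsis.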
The result? I managed to get seven Save the Cat!-style beats revised today. The appropriate sections affected by changed beats are marked for intensive revision. I feel like I can rip right along now and get all the stuff I dictated in November docketed and slated for revision in a week.
Damn! That is amazing! Here’s hoping it’s not a flash in the pan.
While waiting for my beta readers to get back to me on my NaNo Los Angeles submission, I picked up the novel that I’ve been working on for two. Freaking. Years. Now.
But I’m not frustrated… much.
The problem is that I started the novel about three outlining methods ago. My most recent notes are Story Genius (Lisa Cron) character background scenes. (I’m still using them—very useful and I’ll never start a novel again without them. Don’t need them for short stories, though.) Less recent notes are Save the Cat (Blake Snyder) (40 chapter “scene” cards), all of which need updating at the least. I even have some very old notes that date back to Rock Your Plot (Cathy Yardley).
What I’m finding, to my sorrow, is that I never used the logline template from Save the Cat! Strikes Back (Blake Snyder). Since I started the work, I’ve learned that if I can’t fill out the logline template, I don’t have a story—yet. When I filled it out Wednesday, I realised that one of my favourite characters has to die.
In fact, she needs to die about 20% of the way through the book. Dammit. But keeping the poor woman alive was twisting my story. I’d begun to dread writing her scenes. I couldn’t figure out how to have her interact with anyone else.
That’s because she was supposed to be dead already.
When I showed my new logline to my son, his reaction was, “Of course.”
The good news is that I doubt I’ll need to throw out more than about 3K words out of my 90K target. She has that little impact on the story.
That’s how much she needs to die.
Excuse me while I go make my murder mystery more murderous.
Wow! This is amazing. There’s no comparison to writing my entry for last year’s NaNo Los Angeles anthology. Last year I scrapped a 4,000 word draft at about this point, and was scrambling to make the March 21 deadline from scratch. This year, I’ve only scrapped about 900 words. I will have time to adjust and polish before the March 31 deadline.
What’s the difference? There are two parts. First, last year was when I finally created my own method of, for lack of a better term, “outlining.” I learned how to identify what I think of as “islands” and Save the Cat! (Blake Snyder) describes as “beats”. (I hate that term, by the way, but I’ll use it so as to communicate.)
I know that Mr. Snyder insists that all 15 of his beats be nailed down before starting to write, but I… well, I just couldn’t. Not the way he does it. What I did was write a logline using the “enhanced logline template” from Save the Cat! Strikes Back (also Blake Snyder), Chapter 1. This rocks for me.
The enhanced logline hits the high points of the beats (“islands”) without forcing me to think of a precise beginning or ending in advance. It skips several of Mr. Snyder’s fifteen basic beats—but the logline structure enables me to just write in my usual “seat of the pants” manner. When I start writing, I think, “Okay, I have a 4000 word hard limit, so that means everything up to and including ‘breaks into the second act’ has to happen in the first 900 words or so…” This helps keep me focused. Usually no more than a paragraph into a diversion, I’ll be able to ask myself, “Will this get me to my next island in 900 words?” If not, it goes. Often I don’t even bother writing the paragraph. “Yeah, that’s interesting, and if I had words to spare I’d go there, but…”
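For what it’s worth, that “900 words or so” falls out of simple proportion: Snyder lays his beat sheet out against a 110-page screenplay, with “Break into Two” on page 25, and 25/110 of a 4,000-word limit is about 909 words. Here’s a throwaway sketch of that scaling (the beat-to-page mapping is my reading of Snyder’s 15-beat sheet, not something from this post):

```python
# Scale Blake Snyder's beat-sheet page numbers (keyed to a 110-page
# screenplay) down to word budgets for a short story.
BEAT_PAGES = {
    "Opening Image": 1,
    "Catalyst": 12,
    "Break into Two": 25,
    "Midpoint": 55,
    "All Is Lost": 75,
    "Break into Three": 85,
}

def beat_word_targets(word_limit: int, total_pages: int = 110) -> dict:
    """Word count by which each beat should have happened."""
    return {beat: round(word_limit * page / total_pages)
            for beat, page in BEAT_PAGES.items()}

print(beat_word_targets(4000)["Break into Two"])  # 909
```

The exact numbers matter less than having a running answer to “will this get me to my next island in time?”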
(By the way, if you want the lessons of Blake Snyder’s books condensed to one volume and re-phrased for narrative fiction rather than screenplays, Save the Cat! Writes a Novel (Jessica Brody) will save you the tedious translation of screenplay jargon into novel jargon.)
The other difference is that last year, I hadn’t had the experience of working with professional editors on fiction. I hadn’t had five subplots ruthlessly cut because—Dun! Dun! Dun!—they had nothing to do with the main story. In a 4000 word (max) science fiction or fantasy story, I can’t mess around. I need to get my universe established without wasting words, and get the story moving—fast. Maybe I can wander a bit in a novel—but if a subplot has no effect on the finale, I can and should chop it no matter how personally interesting I find it. Not helping me get to an island (beat)? It’s got to go.
In case you think that this eliminates all the art in a story—it doesn’t. I composed my best sentences in last year’s story because I absolutely had to cut something and yet convey its concept better. My story was more compelling for cutting extraneous events to the bone.
So onward! This year’s NaNo Los Angeles deadline is coming!
Once again I’ll be submitting an entry for the sixth annual NaNo Los Angeles anthology! This year’s theme is “exploration and the unknown”, with an added element of “something left behind.” I’m already two-thirds through it, despite medical appointments. I might even finish it before the deadline this time!
I’m a visual and hands-on learner. I learn from reading, from diagrams, and by doing. Lectures or videos (i.e., listening)… well, just send me the notes, Professor.
Why should I be surprised that my storytelling is just as non-auditory? I don’t mentally “hear” words I’m writing. When I imagine scenes, I imagine mostly action and images—dialogue comes third. And when I need to overcome some problem when writing, I pick up a whiteboard marker and start drawing diagrams.
So, the best way for me to end-run writer’s block is to pick up a stylus and use handwriting recognition. It’s closer to drawing diagrams than is typing. Typing’s faster, but that’s ok. If I’m blocked, I need time to connect words to the diagram of the story in my head.
Why a Handwriting Keyboard
Many other Scrivener authors prefer handwriting. Many of those use note-taking apps to convert their handwritten copy to text. They then paste that text into iOS (or even Mac) Scrivener. I find this process cumbersome.
You see, it’s not over when I convert the text and paste it into Scrivener. I proofread for missed recognitions. I change dumb punctuation to smart punctuation. I fix fouled-up line and paragraph breaks. Finally, I add any needed rich formatting.
I grant you that Scrivener (especially desktop Scrivener) automates some of this, but still… that’s a lot of cleanup. It’s not as painful as cleaning up Siri-transcribed dictation, but it’s not fun, either.
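To give a feel for that cleanup, here’s a rough sketch of the straight-to-smart punctuation and line-break repairs involved (my own illustration; Scrivener’s actual substitutions are more sophisticated):

```python
import re

def clean_converted_text(text: str) -> str:
    """Sketch of post-conversion cleanup: smart punctuation and
    paragraph-break repair for text pasted from a note-taking app."""
    # Straight apostrophe between letters -> curly apostrophe
    text = re.sub(r"(\w)'(\w)", "\\1\u2019\\2", text)
    # Opening and closing straight double quotes -> curly quotes
    text = re.sub(r'"(\S)', "\u201c\\1", text)
    text = re.sub(r'(\S)"', "\\1\u201d", text)
    # Stray single newlines inside a paragraph -> spaces;
    # blank-line paragraph breaks are preserved
    paragraphs = re.split(r"\n\s*\n", text)
    return "\n\n".join(" ".join(p.split()) for p in paragraphs)

print(clean_converted_text('He said "don\'t\nstop" and kept\nwriting.'))
# He said “don’t stop” and kept writing.
```

Multiply that by a day’s output and the appeal of getting it right at input time is obvious.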
That’s why I prefer to use an iOS third-party soft keyboard that has handwriting recognition. These add handwriting input to any iOS app that accepts text—even Scrivener. With such a keyboard, I correct or prevent missed recognitions as I go along. I add smart punctuation from Scrivener’s extended keyboard row. I ensure that line and paragraph breaks are right to begin with. I add rich formatting as I go along, just as if I were typing.
When I’ve finished writing for the day, there’s no cleanup to be done. It’s all already in Scrivener. In short, there’s much less friction between my handwritten output and Scrivener’s input.
Why Not a Handwriting Keyboard
Some folks have trouble getting decent recognition from a keyboard, no matter how much they tweak settings. For them, a note-taking app may work better.
Then there’s the iOS “full access” issue. iOS gives third-party keyboard processes only a small amount of memory and storage, and strictly prohibits them from accessing the network. But to recognize characters and reach their dictionaries, these keyboards need to communicate with their standalone companion app—which means they need “full access”. Once that’s granted, the app, as well as the keyboard process, has access to your keystrokes. And apps, unlike keyboard processes, are free to use network resources. An unscrupulous app developer could conceivably send your keystrokes via the internet to, well, anyone.
To be fair, 99% of iOS third-party keyboards ask for full access. The only one I’ve tried that didn’t, crashed. A lot. And of course, any ordinary iOS app could transmit your information without your knowledge.
I’ve used these keyboards since 2014 and never had a security issue. But if this bothers you, by all means avoid third-party soft keyboards.
Tips For Using Handwriting Recognition
Explore settings. If an app has settings such as length of pause before conversion, telling it the shape of the characters you write, and so forth—experiment with them! I’ve never had a handwriting keyboard app that I was happy with out of the box. A little time spent in customization can pay big dividends in accuracy of recognition.
Avoid slanting your letters. Even in cursive, you’re better off writing your letters vertically. Arrange your device so that your letters come out straight up-and-down.
Exaggerate word spacing. Word separation that’s perfectly fine for human reading can confuse a handwriting recognition app. You may need to increase the spacing between words if more than one word can be recognized at a time. Conversely, if you’re trying to write hyphenated or compound words that aren’t in the dictionary, you may need to crowd the letters a bit, or use single letter input.
Apps That Provide a Handwriting Keyboard Usable From Any iOS App
WritePad I has a handwriting note-taking app integrated into its main app, and that’s where you set options for the keyboard process. My review addresses only the keyboard process, not the note-taking app.
The WritePad I (WPI) keyboard has a lot to love.
WPI offers 15 different possible languages/dictionaries.
WPI offers continuous cursive input.
It will accept cursive input after a (selectable) recognition delay (as MyScript Stylus did). If the primary recognition isn’t correct, you’ll need to select an alternate recognition before the delay expires.
Otherwise, you can set up what WPI calls “continuous writing”:
You have as long as you like to look at alternative recognitions.
If none suits, you can back up in your line of writing and redo some words.
If you’re satisfied with the first alternative, you can keep writing by overwriting the line you’ve just written. WPI will enter your overwritten line and start recognizing the new line. Otherwise, choose an alternate recognition, and that will be entered and you can start writing again. This is my own preferred mode; I glance at the alternatives and if the first is OK, I keep on writing with hardly a pause.
WPI follows the color scheme of the app you’re using it in, light or dark.
You can customize each character—for example, for the letter “A” you get several choices as to how you draw a capital “A” and several for lower-case “a”. You get to mark these choices as “frequent”, “rare”, or “never use”. Do take the time to set these up.
It automatically adds new words to a user dictionary, which you can edit via the main app.
You can set up shortcuts, which you can then access with a pop-over menu while using the keyboards.
It offers AI training for your handwriting.
Once you set it up, its recognition is very good.
On the other hand, there are a lot of fiddly settings and it’s not always clear which apply to note-taking and which to the keyboard process. Some apply to both. Best to plan on an hour or three experimenting to find what suits you best.
Penquills is the iPhone version of WritePad I, but it’s frankly not as good. It, too, has a handwriting note-taking app integrated into its main app, which is where you set options for the keyboard process. Again, I’m reviewing only the keyboard.
First, it’s no longer being actively developed, so while it’s still available on the USA App Store, I suspect it will go bye-bye at the first incompatible iOS update.
Aside from that sad news, in portrait mode it only recognizes single characters. (If you remember Palm Graffiti, it’s like that.) Landscape mode, though, has the continuous cursive capability of WritePad I.
It’s easier to describe what Penquills doesn’t have, compared to WritePad I:
It offers 8 languages/dictionaries instead of WritePad I’s 15.
It has no AI handwriting training.
Other than that, it’s identical to WritePad I’s keyboard. I do use it on my iPhone, but only in Scrivener. In any app that’s forced to portrait, its one-letter-at-a-time pace is too painfully slow. (NOTE: Scrivener will only work in landscape mode on an iPhone with an iPhone 6 size screen or larger. iPhone 4/4s/5/5c/5s/SE screens are too small.)
I admit I don’t like Mazec EN. It has the superb recognition that MyScript Stylus boasted, but that’s as far as it goes.
It’s comparatively expensive. It’s always in light mode—it doesn’t match dark background apps. Most annoyingly, I always have to tap “enter” to enter text—it doesn’t offer recognition after a delay (as MyScript Stylus did and WritePad I does) or semi-automatic entry (as WritePad I does).
And as the name implies, it only recognizes English, and that with only one dictionary. If you can’t get decent recognition with WritePad I or Penquills, and you always write in American English, it’s worth a try, I suppose. But personally, I’d rather use the Phatware products.
Seasonal affective disorder has gotten me again. Illness (including looking forward to two surgeries, oh boy) hasn’t helped. I’m sort of keeping on writing, but I haven’t made the progress I’d like. (A big surprise. Again.)
I’ve been using handwriting input to make what progress I’ve made. Handwritten input jumpstarts my writing, although it doesn’t help me proceed with speed. Next post, for sure, I’ll put up an article about the current state of iOS handwriting input, both keyboards and standalone apps.
In November, I wrote about giving your older Mac laptop a “Retina” screen by enabling it with SwitchResX. Well, I just upgraded my MacBook Air 11 to Mojave (macOS 10.14.2) via a “clean install”, and it took away my beautiful high-resolution, pseudo-Retina screen! As an obsessive nerd, I could not possibly let that one stand.
It seems that years and years ago, I installed Apple’s dev tools, which enabled HiDPI (pseudo-Retina), unbeknownst to me. When I did a clean install of Mojave, I wiped out both the tools and the HiDPI capability. Oops.
Start from an account for which you have admin privileges.
Open the Terminal app (you can find it in Applications/Utilities).
Copy and paste the following command: sudo defaults write /Library/Preferences/com.apple.windowserver.plist DisplayResolutionEnabled -bool true
Press return. Terminal will ask for your admin password. Provide it and press return.
Restart your Mac.
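Condensed, the steps above amount to one command. (The `read` and `delete` lines are standard `defaults` usage for verifying and undoing the change later; they’re not part of the original how-to.)

```shell
# Enable HiDPI ("Retina") scaled modes system-wide; asks for your admin password
sudo defaults write /Library/Preferences/com.apple.windowserver.plist \
    DisplayResolutionEnabled -bool true

# Verify the key took (should print 1)
defaults read /Library/Preferences/com.apple.windowserver.plist DisplayResolutionEnabled

# To undo later, delete the key and restart:
# sudo defaults delete /Library/Preferences/com.apple.windowserver.plist DisplayResolutionEnabled
```

Either way, the change only takes effect after a restart.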
Now HiDPI (AKA “Retina”) resolutions should be available in the SwitchResX menu, assuming your laptop screen is capable. Enjoy!
N.B.: This works because of two effects. First, on many monitors (including the MacBook Air 11’s built-in display) SwitchResX lets you select a scaled resolution larger than the largest native resolution (AKA a “stretched” resolution). Second, enabling HiDPI lets you run at half that stretched resolution, so text is rendered with four times as many pixels and is correspondingly sharper. With both effects in place, I can get a 1280×720 “Retina” resolution on my old MacBook Air 11. If a monitor can’t display a stretched resolution, the best it can do for HiDPI is half its maximum native resolution; this does next to nothing for my LG Ultrawide, for example, which can’t display a stretched resolution. So results are entirely dependent on what SwitchResX can do with your display hardware.
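The pixel arithmetic, as a sketch (the 2560×1440 stretched mode is my assumption about what SwitchResX drives on this panel; your hardware may differ):

```python
# A HiDPI mode draws at double resolution in each dimension, then presents
# the result at the logical ("Retina") size: 2 x 2 = 4x the pixels per glyph.
def hidpi_logical(backing_w: int, backing_h: int) -> tuple:
    """Logical (HiDPI) resolution for a given backing resolution."""
    return backing_w // 2, backing_h // 2

stretched = (2560, 1440)              # assumed stretched mode on the Air 11
print(hidpi_logical(*stretched))      # (1280, 720)
```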
I swear before anything sacred you care to mention: I will never attempt to dictate fiction again. If my hands become too arthritic to write, I’ll just have to give up writing.
There are writers who find that dictation frees their inner creativity better than any form of writing. I’m happy for them. On the other hand, I look at the words I dictated during November—
and I feel no connection to them. Nothing. It’s as if someone else had written them.
It’s not that, as raw words, they suck much worse than any other writing I’ve ever done. They’re just not my words. I don’t connect to them.
My own theory is that it’s because I use a different area of my brain to speak rather than to handwrite or type. A part that isn’t as fluent in English as my fingers. Certainly I can’t speak extemporaneously, and any attempt to engage in spirited intellectual debate comes to a dead stop when my brain refuses to produce a word that my mouth can form. My family are accustomed to thirty-to-sixty second pauses while my brain—which knows darn well what concept it wants to express—struggles to come up with English words in spoken form to express it.
And there those pauses are, on tape.
I hear thirty to sixty seconds of dead air in the middle of a phrase—not at the beginning of a paragraph or the start of a sentence, when I might be planning the writing to come, but in the middle of a freaking phrase I’ve already started—during which my brain must have been desperately scrabbling for verbal sounds to go with an idea I was trying to express. When the words come out, they’re… feeble. I listen and I know they’re wide of the mark. Not only that, but they’re also not what I would have written. The idea was processed by a different system.
I like the words I write much more than the words I dictate. I feel connected to the words I write. Speech, if it has a place in my workflow, will come after I write, pointing out awkward phrasing. It’s a QA tool, not a manufacturing tool.
And yes, I’m still processing the bloody dictation. Dammit.