Old way to qualify for typing jobs

I’ve mentioned this before but never got much information about it, and I can’t find anything searching the web.

Everything I find says that, for most situations, 97% accuracy is considered fine for a typing job. A few special cases require more, but 97% is apparently pretty much the norm.

I still don’t get it, since that means you make an error once every 33 characters. I can see that being okay in a digital world where corrections are so easy - I can often feel the mistake as it happens and correct it automatically. But even if I don’t catch it until I proofread, it is easy to fix.

But I have a hard time seeing this as acceptable in the analog world I grew up in. And while I’m old enough, I never took a typing class, so I really don’t know what was considered good back then.

After all, even if I catch an error as I type, I’d have to stop, use white-out or one of those little tape things to cover up the error, and type over it. If you find the error after removing the paper from the typewriter, you also have to hope you can line everything up the way it was originally. All this would be many times more time-consuming than typing digitally, which is why one error in 33 characters seems bad to me. Personally, I’m trying to get up to 99%.

But there must be people here who learned to type in a class back in the analog days and could tell me what they were taught was considered “good” for a job. Or was it the same as now, and we were just wasting a lot of time? Or allowing errors to remain?

Because we don’t type the same way now as we did then. Back then we often wrote things out by hand and then typed up the final version. And don’t you remember Tipp-Ex (what Americans call whiteout)?

But no matter what your source is, the typing is essentially the same, which brings us back to the acceptable error rate for a typical job.

I remember whiteout - I referred to it in my post. And I remember the correction tape where if you accidentally typed an “e”, you would move back to it, insert the tape and hit “e” again, which would deposit white covering on the “e” and then you could go back again and type the correct letter.

But this all goes to my question. These are nowhere near as quick as hitting something wrong and correcting it digitally. A person typing digitally at an effective 60 wpm, with errors corrected as they go, would be much faster than another person on a typewriter typing just as fast with the same number of errors, because of the time it takes to make corrections.

In my typing tests, even a tiny delay (hit wrong key, hit delete, hit correct key) shows an obvious reduction in typing speed on a one-minute test. If I am doing 60 wpm, that’s 300 characters, and at 97% accuracy that’s 9 errors in that one minute.

Now, doing it on a manual typewriter, I make a mistake and have two options:

Whiteout: apply the stuff, wait for it to dry, go back and type over it.

Tape: Go back to the error, slide the tape in, type the same error to cover it up (sometimes you had to do it more than once to really cover it), then type the correct letter.

All this takes many times longer. Think of how many letters I could type, at 60 wpm, in the time it takes just to fix one error either of those ways.

Not sure, but I think eventually they made built-in correction ribbons, which would have been much faster, but they didn’t always have those.

So, it still seems to me there must have been some differences in how typists were evaluated, which MIGHT include:

They had to be more accurate than what we accept today.

The speeds were calculated differently - for example, not considering how long it would take to fix an error. So if you typed 60 wpm and made 9 errors, they simply counted it as 51 wpm with no consideration for how long it would actually take to correct (thus on the test you would never stop to correct an error, either as you go or at the end).

A certain number of errors in documents was just accepted.

They used whiteout at the end and manually wrote in the correction even though it wouldn’t look very good.

Whether they were typing from written text, a book, their own thoughts, or anything else, I don’t think that matters much.

It’s been just over 50 years since I took my typing class, but it was just a timed test and calculated your WPM. It wasn’t about an “acceptable error rate”. If you made a typo it wasn’t counted in your WPM (you didn’t go back and correct your typos). Everything was about WPM. Theoretically, one could have a 100% accuracy rate, but only type 3 WPM. It really was (and probably still is) all about WPM.

But I do appreciate that I learned how to be a touch typist (didn’t realize at the time how valuable that would be for me – most helpful in graduate school). :grin:

And thank goodness we are well past typewriters…

In Gestetner days no errors were acceptable. Then we got white-out. Of course, the more white-out you apply, the slower your words per minute. Now we have predictive text and automatic spell check. One error in 33 characters means roughly 1 in every 6.5 words is wrong - as in possible gibberish. Sobering?

Let me make sure I understand this. I figured you couldn’t actually go back to make a correction, but when you say a typo wasn’t counted, what does that actually mean?

You see, I could imagine several possibilities:

You type 300 characters (60 wpm) but make 9 typos that are not fixed. Today that would usually subtract one word for each error, and thus be calculated as 51 words a minute.

But you might mean they just count it as 60, no matter how many errors, as long as you have 300 characters.

Or it might mean you take the 300, subtract the 9 typos, giving you 291 characters, which would be about 58 wpm.
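Just to make those three possibilities concrete, here’s a rough sketch in Python of how each would score that same one-minute test (300 characters typed, 9 uncorrected typos), assuming the usual five-characters-per-word convention; the function names are just mine for illustration:

```python
CHARS_PER_WORD = 5  # the usual convention: one "word" = five characters

def net_wpm_modern(chars_typed, errors, minutes=1):
    """Today's common style: gross WPM minus one full word per uncorrected error."""
    return chars_typed / CHARS_PER_WORD / minutes - errors / minutes

def gross_wpm_ignoring_errors(chars_typed, errors, minutes=1):
    """Count everything typed and ignore the errors entirely."""
    return chars_typed / CHARS_PER_WORD / minutes

def net_wpm_deduct_chars(chars_typed, errors, minutes=1):
    """Deduct only the wrong characters themselves, then convert to words."""
    return (chars_typed - errors) / CHARS_PER_WORD / minutes

# The one-minute example above: 300 characters, 9 uncorrected typos.
print(net_wpm_modern(300, 9))            # 51.0 -> one word lost per error
print(gross_wpm_ignoring_errors(300, 9)) # 60.0 -> errors ignored
print(net_wpm_deduct_chars(300, 9))      # 58.2 -> only the bad characters deducted
```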

Does that mean if they found a typo (before whiteout or correction tape), they simply had to type the whole page over? That would pretty much require perfection over the course of hundreds of words.

Anyway, this is why I really question today’s standards and whether the same ones applied back then. As you say, the rate of errors is really high, but (to me) only acceptable because of the ease of making corrections.

I remember trying to explain the value of a computer to a co-worker and it was really hard to get him to understand. The example I used was a telephone list which he had to periodically update.

Each time, he had to type the whole thing over to insert names, change numbers sometimes, etc. And he also found himself starting over because of typos.

He felt it was no different with a computer. That is, you still type, you still make errors, and you still have to update sometimes. He couldn’t seem to grasp the concept that without a computer, every effort to retype because of typos means you might introduce a new typo. But the computer would maintain everything that was already correct. You just correct the item that needs it and hit the print button. I couldn’t get him to understand.

“Or it might mean you take the 300, subtract 9 typos, giving you 291 characters which would be 58 wpm.”

That was how it worked in my high school class way back when. You didn’t fix your errors, and any errors were deducted from your total WPM.

Thanks. That’s quite different from what I see now, where one wrong character takes away a whole word per minute rather than just that individual character. Were you ever taught what kind of speed or accuracy was expected in typing jobs?

Gosh, not sure I can remember the specifics, but it seems like you needed 80 WPM with 97% accuracy to get an “A” - could be wrong though.

Still really glad I took that class in hindsight!

So the same percentage I see now, but a higher speed (usually I see 70 wpm). However, given the difference in how it was calculated, and assuming 97% accuracy: to get 80 wpm you would need 400 correct characters. That’s with no errors. Add the roughly twelve characters that were wrong and you get about 412 characters typed in total. Calculated the modern way, that would be 82 gross minus 12, which comes out to about 70 today.
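If it helps, here’s that arithmetic spelled out, with the same assumptions as my earlier sketch (five characters per word, and the 3% applied to the correct characters the way I did it above):

```python
CHARS_PER_WORD = 5

# Old-style score: 80 net WPM after deducting only the wrong characters,
# i.e. 400 correct characters in one minute.
correct_chars = 80 * CHARS_PER_WORD       # 400
errors = round(correct_chars * 0.03)      # 12 wrong characters at roughly 97% accuracy
total_chars = correct_chars + errors      # 412 keystrokes actually made

gross_wpm = total_chars / CHARS_PER_WORD  # 82.4
modern_net_wpm = gross_wpm - errors       # 70.4 -> about 70 by today's counting

print(gross_wpm, modern_net_wpm)
```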

I don’t think typing back then was the same as it is now – that’s what I’m trying to say. Whereas now we are inclined to think directly via typing (which I would argue creates more chance for typing errors, which is fine), typing back in the typewriter age was less about using the typewriter to think on paper (of course, some novelists and certain writers actually did so, but that was less common then) and more about using handwriting to think on paper, then transferring the “final” thoughts into typed format.

That makes typewriting different back then than now.

Oh, I know my speed will be slower when I’m composing something original than when I’m typing the same kind of material while reading it off the screen - the process of thinking of the words to use causes small delays.

But today’s typing tests normally involve reading off a screen, and I think the typing tests in the old days were also done from text you were reading. So, when it comes to tests, the process seems like it would be the same - other than where the words you are reading from are located, of course.

I took typing in 1956 as a summer school class between 9th and 10th grades. Speed was computed based on 100% accuracy in the class I took. If your accuracy was not 100% then your speed test didn’t count. :slight_smile:

How long was the test? Or was it simply “finish the text no matter how long it takes”?

I can get 100% on 1-minute tests, and I may have managed it once or so on a 5-minute test, but neither of those is normal for me. So I mostly hope to get 99% or better. I don’t always get that either!

As I recall we were tested regularly with texts of varying lengths. All tests were timed to determine how long it took each student to complete the typing. That was 60 years ago so my memory of the class sessions is somewhat less than crystal clear. :slight_smile:

Thanks. This typing stuff can get complicated. I’ve done some online tests where the material is a set length, and some of those take well under a minute. Some are exactly a minute. So the length can be set by the user.

I personally always felt that a 5 minute test was probably best, but it just takes too long when doing lots of tests. Depending on what test I’m doing, I’m usually either doing a 1 minute test or a 30 second one if I want a quick result.