I’ve asked about this before but never got much of an answer, and I can’t find anything searching the web.
Everything I find says that, for most situations, 97% accuracy is considered fine for a typing job. A few special cases require higher, but 97% is apparently pretty much the norm.
I still don’t get it, since that means you make an error once every 33 characters. I can see that being okay in a digital world where corrections are so easy - I can often feel the mistake as it happens and correct it automatically. And even if I don’t catch it until I proofread, it’s easy to fix.
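(To spell out where I’m getting 33 from: at 97% accuracy you have a 3% error rate, so roughly one error per 1 / 0.03 ≈ 33 characters. By the same math, 99% accuracy would be about one error per 100 characters.)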
But I have a hard time seeing that as acceptable in the analog world I grew up in. And while I’m old enough to have lived through it, I never took a typing class, so I really don’t know what was considered good back then.
After all, even if I caught an error as I typed, I’d have to stop, use white-out or one of those little correction tapes to cover it up, and type over it. And if I found the error after removing the paper from the typewriter, I’d also have to hope I could line everything back up the way it was originally. All of that would be many times more time-consuming than typing digitally, so one error every 33 characters seems bad to me. Personally, I’m trying to get up to 99%.
But there must be people here who learned to type in a class back in the analog days and could tell me what they were taught was considered “good” for a job. Or was it the same as now, and we were just wasting a lot of time? Or allowing errors to remain?