Last edited by John K Jordan; 03-28-2019 at 10:37 AM.
Sigh away. This is like pulling teeth.
Here, maybe this will help explain why, to get a valid measurement, the dial indicator's probe must be perpendicular (90°) in two planes:
- fore and aft horizontally, and
- up and down vertically.
Even the slightest deviation away from perfectly perpendicular, in either the horizontal or vertical planes, even by just a couple of degrees, will throw the measurement off, and by more than the .003" that is supposedly "a lot" of misalignment.
In this video, the tip of the probe's point of contact with the tooth does not move appreciably. (I can tell when its point of contact on the tooth moves, because it makes noise when it drags across the carbide, and it didn't when I did this.) The only thing that changes when I rock the blade back and forth (by maybe 3/16") is the angle of the indicator probe – away from 90°, back through 90° and beyond, and back again – in two planes. Watch what the needle on the dial indicator does.
And as I and Andrew have said repeatedly, this is just one of several places where errors – or machine tolerances – of more than .003" can be introduced.
This is why I say that beyond a certain point, it is "silly" to pursue it further, because you're chasing your tail. You simply don't know, and can't know, without a lot more sophisticated instruments and procedures, where the discrepancy lies. It gets lost in the "noise." You might as well debate how many angels can dance on the head of a pin.
Furthermore, it's "good enough" (gasp) for woodworking, because other sources of error – how well you hold the wood to the fence and down to the table, etc., etc., ad nauseam – will be orders of magnitude larger than any remaining misalignment.
And with that, I declare this horse officially beaten into dogmeat.
Have fun!
Last edited by Jacob Reverb; 03-28-2019 at 12:28 PM.
Jacob, what you are demonstrating in the video is not an error of machine tolerance; it is the use of the wrong tool and a flaw in your method of measuring. It has no relation to the accuracy of a cut on a piece of wood. Clamp a piece of wood to the mitre fence/sliding table, cut it, and look at it – that will tell you a whole lot more relevant information.
Whoa. I thought it said table saw "turning" and not tuning. I was curious to know who was so brave. Lol.
Exactly what I was attempting to show:
"And as I and Andrew have said repeatedly, this is just one of several places where errors – or machine tolerances – of more than .003" can be introduced."
I should also add that I agree with you: To align my TS, I don't use the much-vaunted dial indicator. I just clamp a rule to the miter fence and align the trunnion such that a particular tooth just drags against the end edge of the rule at the front of the table and at the back of the table. In my experience, using a dial indicator for this is a good way to start drinking and tearing your hair out in big wads. Does this method get me within .003" or .0003" or .0000000000000000000003"? I don't know, and care even less.
Finally, anyone who doubts my claim – that "After a certain point, the whole exercise becomes silly" – needs only to look at this thread.
Last edited by Jacob Reverb; 03-28-2019 at 1:03 PM.
Jacob, if the probe is moving, then the fixing point is not strong enough.
Also I’d probably use a shorter indicator and get it closer to the blade with a magnet stand (I have a good old Mitutoyo which locks firmly).
Last edited by Brian Holcombe; 03-28-2019 at 1:07 PM.
Bumbling forward into the unknown.
The video is kind of fuzzy – is the tip flexing, or is the whole indicator rotating in its mount? Either way, that is a different source of error – movement or flexure of the measuring tool – not angular deviation of the indicator while taking a proper reading, which is what you seemed to be arguing.
If you zero the indicator out against an object, then rotate the whole indicator two degrees in its mount, then the measurement will change based on a triangle that extends from the pivot point to the tip, which in your case might be 4 or 5 inches long. So you would see quite a lot of error and the reading would be garbage.
But if you lock your indicator down rigidly, like you should, prior to zeroing, and the indicator happens to be some small angle out of perpendicular, then the measurement error will be described by a triangle whose long side is the length of the discrepancy you are measuring (a few thousandths). So the error is extremely small.
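The two triangles can be put into numbers. Here is a minimal sketch (the 4.5" arm length, 2° angles, and .003" displacement are hypothetical values, not measurements from anyone's saw) comparing the two error modes:

```python
import math

# Case 1: the whole indicator pivots in its mount while reading.
# The tip swings on an arm from the pivot point, so even a tiny
# rotation moves the tip a long way.
arm_length = 4.5   # inches from pivot to tip (hypothetical)
pivot_deg = 2.0    # degrees of unwanted rotation
pivot_error = arm_length * math.sin(math.radians(pivot_deg))
print(f"pivot error:  {pivot_error:.4f} in")    # roughly 0.157" -- garbage

# Case 2: the indicator is locked down rigid but sits a couple of
# degrees off perpendicular. The classic "cosine error": the
# indicator over-reads the true displacement by a factor of
# 1/cos(theta), so the error scales with the tiny displacement
# being measured, not with the length of the indicator.
true_disp = 0.003  # inches of actual misalignment (hypothetical)
tilt_deg = 2.0
indicated = true_disp / math.cos(math.radians(tilt_deg))
cosine_error = indicated - true_disp
print(f"cosine error: {cosine_error:.7f} in")   # millionths -- negligible
```

Same 2° of angular sloppiness in both cases, but the pivoting indicator is off by more than a tenth of an inch while the rigid-but-tilted one is off by a couple of millionths.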
I am trying to address the specific claim that you can't rely on dial indicators because you can't ensure they are 100% perpendicular to what you are measuring. This is not correct. Yes, it's critical that the indicator mount be rigid, and the accuracy of the reference surface (in this case the miter slot) limits the reliability of the measurement. But the indicator being a few degrees out of perpendicular is not a problem. Dial indicators are extremely useful, inexpensive, and can be used in myriad ways if you are clever. If your original claims were true, they would be entirely useless. And despite all of the potential sources of error, can you seriously claim that you can get a better setup on a tablesaw with a ruler than with an indicator and a good mount?
I am sympathetic to the general point that there are many sources of error, and that the smaller you try to measure the more significant they all become. I get that it annoys you that some people insist on 0.001 accuracy. For me, if I am doing a setup and it is not that much extra work to get it to, say, 0.002 according to my indicator instead of 0.010, then why would I not do that?
Last edited by Robert Hazelwood; 03-28-2019 at 2:09 PM.
That's not a bad method. If you are picky about how much drag you accept, then you can get it just as accurate as with an indicator. You can set jointer knives by seeing how far they move a ruler or stick when you rotate the cutterhead – also a good method, and very sensitive. But I prefer a dial indicator for the latter, not because of accuracy so much as that I can get real-time feedback as I raise and lower the knives, instead of having to tighten everything up before I can see what I've accomplished with an adjustment.
Thanks, Robert.
Now I understand what you guys are saying – that the dial indicator should be mounted such that it can't get out of square or parallel with the table top – and I apologize for the confusion.
It's been years since I messed with that thing (I think it's called a "Superbar" jig or something like that, supposedly designed for TS alignment). I now remember that the last time I used it, I realized there was no way to use it accurately if it could get out of square or parallel to the table. I cut a block of wood to put under the probe to keep it parallel to the table, but I never figured out a really rock-solid way to keep the probe square to the miter slot, so it was an exercise in frustration more than anything, at least for me. Or maybe I wasn't using it right (not sure whether I re-read the instructions that last time).
Anyway, guys, please excuse my boneheadedness on what you were trying to explain regarding the idea that "you can't get the dial indicator out of square." I see what you're saying now.
But I still contend that .003" of (apparent) misalignment between fence and blade (I think this was what the OP was talking about) is more than "good enough." Or at least it would be for me ... but I use fire axes for my "woodworking" as much as anything else.
Cheers-
Jacob
Last edited by Jacob Reverb; 03-28-2019 at 4:59 PM.
Jacob said:
Now I understand what you guys are saying – that the dial indicator should be mounted such that it can't get out of square or parallel with the table top – and I apologize for the confusion.
But... and thanks to Brian I came to understand this... if the dial indicator is not perfectly square, the measurement will actually be larger than the true measurement. So when you're done, your result will be BETTER than you think!
Before Brian's post (above) I thought an angular error in the dial indicator would fool me into thinking I was better than I was... but the angular error will actually fool me into believing what I am measuring is worse than it is.
Too much to do...Not enough time...life is too short!
Well, if I'm not mistaken again the result could appear either better, or worse, than it is. It just depends on where you got the error.
As an example, assume that the trunnion was cocked clockwise, viewed overhead, in relation to the table, meaning that the true, accurately-measured distance between miter slot and blade was greater at the back of the blade than at the front of the blade.
If you measured accurately (perpendicular to miter slot and parallel to table) at the back of the blade, but got the indicator probe cocked at the front of the blade (either not perpendicular to the miter slot and/or not parallel to the table), this would make the misalignment appear less (and thus better) than it actually was.
But if you did the reverse – measured accurately at the front of the blade, but then got the indicator probe non-parallel to the table and/or non-square to the miter slot at the back of the blade – then the misalignment would appear to be greater (and thus worse) than it actually was.
As a third, though exceedingly unlikely, possibility, let's say you got the exact same degree of measurement error – from having the indicator probe non-parallel or non-square – at both positions, front and back. In that case, since the true, accurately-measured distance between miter slot and blade is greater at the back of the blade than at the front, your trunnion misalignment would appear (very slightly) worse than it is. Leg b of the right triangle (the true distance between miter slot and blade) is larger to begin with, by definition from the premise, at the back of the blade than at the front, so the hypotenuse of that triangle (the erroneously measured distance) is inflated by a slightly larger amount at the back than at the front.
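The three cases can be checked numerically. A minimal sketch, assuming hypothetical readings of .010"/.013" at the front and back teeth (i.e., .003" of true misalignment) and a hypothetical 3° probe tilt, using the standard cosine-error model in which a tilted indicator over-reads by 1/cos(theta):

```python
import math

def read(true_in, tilt_deg):
    """Reading shown by an indicator tilted tilt_deg degrees off
    perpendicular (cosine error: it over-reads by 1/cos(theta))."""
    return true_in / math.cos(math.radians(tilt_deg))

r_front = 0.010   # true deflection at the front tooth (hypothetical)
r_back  = 0.013   # true deflection at the back tooth: .003" misalignment
true_misalign = r_back - r_front

# Case 1: square at the back, tilted at the front -> front reading
# is inflated, so the front-to-back difference looks SMALLER (better).
app1 = read(r_back, 0) - read(r_front, 3)

# Case 2: square at the front, tilted at the back -> back reading
# is inflated, so the difference looks LARGER (worse).
app2 = read(r_back, 3) - read(r_front, 0)

# Case 3: identical tilt at both ends -> the larger back reading is
# inflated by a slightly larger absolute amount, so the difference
# still looks very slightly worse than it is.
app3 = read(r_back, 3) - read(r_front, 3)

print(f"true {true_misalign:.6f}  better {app1:.6f}  "
      f"worse {app2:.6f}  slightly worse {app3:.6f}")
```

With these numbers the three apparent values straddle the true .003" exactly as described: case 1 comes out under, case 2 over, and case 3 over by only a few hundred-thousandths.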
Or that's my story, and I'm sticking to it (for now)...
I feel like an idiot now, because in cleaning up the shop I found the missing piece for using the dial indicator – a machined aluminum plate used instead of the blade. Either I've never used the dial indicator with the jig and the plate, or I forgot about it. Anyway, I put in the plate today and measured with the SuperGage gizmo, and as best I could read it, the arbor was maybe 1.5 mils out of parallel to the miter gage slot in 10". So I guess the clamp-rule-to-the-miter-gage method works, too.
So maybe .003" isn't out of the question to measure. But it's getting down into the weeds...
Thanks for the patience, and please excuse my ignorance, in trying to explain it to me, fellas.
Not sure how my father aligned his saw, but I'm sure he never used a dial indicator though as an engineer he would have known how. He spent a day under his Craftsman contractor's saw loosening/tightening/checking the trunnion and by gosh and by gollying it, and eventually called it good and probably didn't touch it again. And he made custom furniture for years with some nicely machined joints (mostly done on TS) and it was good work. It'd be interesting to know how well aligned his saw was, but that was back in the '70s.
No worries, it was nice of you to reply with your findings. That aluminum plate is a really good idea for this procedure; it eliminates the possibility of the blade being the culprit.
One thing I did to dial in my chopsaw was to lightly hone my arbor flanges to ensure there were no high spots causing the blade to wobble.
Last edited by Brian Holcombe; 03-31-2019 at 12:33 PM.
Oh forget it.
Last edited by Patrick Walsh; 03-31-2019 at 6:28 PM.