User talk:Rich257/Archive/Archive-2008Mar

As I wrote before, I am new to Wikipedia's incremental updating and messaging system. I apologize if you are the wrong person to receive this; if so, I would appreciate it if you forwarded it to the right person.

Now to the matter. You did not seem to agree to including integrals over the tensor products in the two places concerned.

However, I insist on the integrals. They could be omitted only if you state somewhere that this averaging is assumed implicitly. Otherwise, readers will have to deal with the following inconsistency.

You say right after S (for 2D): "Eigen-decomposition is then applied to the structure tensor matrix S to form the eigenvalues and eigenvectors (λ1,λ2) and (e_1, e_2)". This "structure tensor", without averaging, would have exactly one non-zero eigenvalue, and its eigenvector is the gradient direction. That contradicts the cited statement and the rest of the article, because such an S is always a line-tensor (also known as a linear-symmetry tensor), no matter what the neighborhood looks like. You can never obtain a ball-tensor (balanced direction tensor) with such a definition: wherever the gradient is non-nil it is well defined, and your S would then automatically be rank-deficient, making it a line-tensor. An analogous argument shows that the 3D definition in this article is erroneous as well.

The pure tensor product (outer product) of the gradient would be the infinitesimal structure tensor, not the structure tensor.
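The point is easy to check numerically. The following is a minimal NumPy sketch (the test image, pixel coordinates, and window are my own illustrative choices, not taken from the article): the pointwise outer product g gᵀ always has a zero second eigenvalue (a line-tensor), while the neighborhood-averaged tensor can have two non-zero eigenvalues (a ball-tensor) whenever gradient directions vary inside the window.

```python
import numpy as np

# Illustrative 2D image: a quadratic bowl, so the gradient direction
# rotates around the center (8, 8).
y, x = np.mgrid[0:16, 0:16].astype(float)
img = (x - 8.0) ** 2 + (y - 8.0) ** 2

# Gradients via central differences (axis 0 is y, axis 1 is x).
gy, gx = np.gradient(img)

# Pointwise outer product g g^T at one pixel: rank <= 1 by construction.
px, py = 4, 10
g = np.array([gx[py, px], gy[py, px]])
S_point = np.outer(g, g)
eig_point = np.sort(np.linalg.eigvalsh(S_point))[::-1]
# Second eigenvalue is (numerically) zero: always a line-tensor.

# Neighborhood-averaged (integrated) structure tensor over a 5x5 window
# around the center, where gradient directions vary.
win = (slice(6, 11), slice(6, 11))
S_avg = np.zeros((2, 2))
for gxx, gyy in zip(gx[win].ravel(), gy[win].ravel()):
    S_avg += np.outer([gxx, gyy], [gxx, gyy])
S_avg /= gx[win].size
eig_avg = np.sort(np.linalg.eigvalsh(S_avg))[::-1]
# Both eigenvalues are non-zero here: a ball-tensor, which the
# unaveraged definition can never produce.

print("pointwise eigenvalues:", eig_point)
print("averaged eigenvalues: ", eig_avg)
```

Only the averaged form can distinguish a linearly symmetric neighborhood from a balanced one; the pointwise product cannot, whatever the neighborhood.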