Recently I was reminded of what one of the classic books on the subject had to say. Kodak's publication N-12, [i]Close-up Photography & Photomacrography[/i], wrote:

[quote]Fundamentally, a photomacrographic subject is one that would be visually examined with a loupe or with a hand lens.[/quote]

Of course, the reason one uses a loupe or hand lens is to see more than the unaided eye can see. This led me to propose an alternate definition:

[quote]Shooting "macro" means taking pictures that show more detail than unaided eyes could see in the real subject.[/quote]

Reasoning forward from that simple concept, I worked through some math that ended up saying:

[quote]Now we've got a numeric criterion: a digital image is "macro" if it has at least 12-24 useful pixels per mm of subject size. Using our forum's limit of 800 pixels, anything under 33 mm field width is definitely macro, anything over 66 mm could only be close-up, and stuff in the middle is ambiguous depending on how sharply it's rendered.[/quote]
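If it helps to see the arithmetic spelled out, here's a minimal sketch of that criterion in Python. The 800-pixel limit and the 12-24 px/mm thresholds are from the math above; the function name and the sample field widths are just my illustration:

[code]
def classify(field_width_mm, image_width_px=800,
             min_px_per_mm=12, max_px_per_mm=24):
    """Classify a digital image by useful pixels per mm of subject."""
    px_per_mm = image_width_px / field_width_mm
    if px_per_mm >= max_px_per_mm:   # field under ~33 mm (800/24)
        return "macro"
    if px_per_mm < min_px_per_mm:    # field over ~66 mm (800/12)
        return "close-up"
    return "ambiguous (depends on sharpness)"

for width_mm in (22.7, 27, 50, 101):
    print(f"{width_mm} mm field: {classify(width_mm)}")
[/code]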
Tonight I took some time to test experimentally whether the concept and calculations seemed to be on the right track. The experiment seemed simple: find a suitable resolution target, look at it with my unaided eyes, shoot it with a few different camera & lens setups, and see how the results track the predictions.
Of course, the experiment ended up being more complicated than I had hoped. The problem was finding a suitable target. I needed something with fine detail spanning a good range of sizes. A tiny eye chart would have been ideal, but I didn't have any such thing lying around. For a while I contemplated firing up the old darkroom and printing one, but that seemed like way too much trouble. Finally it occurred to me that I didn't really need a printed tiny eye chart -- one projected onto a matte screen or even just focused in the air would work fine.
I fired up Word, typed in a series of text lines sized from 40-point down to 4-point, printed them on a high-quality inkjet, then set up an old camera lens to project a 1/10th-size image to use as a test target. So the test target was essentially tiny text, ranging from 4 points down to 0.4 point.
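As a quick back-of-envelope check on those sizes (my own sketch, not part of the original write-up): one typographic point is 1/72 inch, about 0.353 mm, so the projected lines work out as follows. Point size is the nominal character cell, so the actual stroke detail is finer still, but this gives the scale:

[code]
PT_TO_MM = 25.4 / 72      # one point is 1/72 inch, ~0.353 mm
PROJECTION_SCALE = 0.1    # the lens projected a 1/10th-size image

for nominal_pt in (40, 20, 17, 4):
    projected_pt = nominal_pt * PROJECTION_SCALE
    print(f"{nominal_pt}-pt line -> {projected_pt:.1f} pt "
          f"(~{projected_pt * PT_TO_MM:.2f} mm)")
[/code]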
Here's what the setup looked like:

On direct inspection, it turned out that I could just barely read the line labeled "20 point" (actually 2 point!) using my normal reading glasses. With a different set of glasses that I use for hobby work, it was more like "17 point" (actually 1.7 point).
Then I shot the target with three camera & lens setups:
- Canon 300D with a Sigma 105 macro lens at 1:1, producing a field width of 22.7 mm,
- Canon SD700 IS at approximately closest focus, with a field width somewhat over 27 mm (I'm not sure exactly what the autofocus did), and
- Canon 300D with a Sigma 18-125mm "macro focusing" zoom, at its minimum field width of 101 mm.
The results are shown below, and I'm happy to see that they agree pretty closely with predictions. At 800 pixels, the 22.7 mm field of the macro lens at 1:1 reveals something like 3 times finer detail than I could see with just my eyes. The SD700 point-and-shoot also reveals considerably more detail than just my eyes, though not as much as the 1:1 macro on the 300D. And the not-really-a-macro zoom lens falls short of showing even as much as my unaided eyes can see.
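To put rough numbers on that comparison, here's the criterion applied to the three setups (again my own sketch; the field widths are from the list above, and treating the 12 px/mm lower bound as the eye-equivalent baseline is an assumption, so the ratios are only approximate):

[code]
EYE_BASELINE_PX_PER_MM = 12   # assumed eye-equivalent detail level
IMAGE_WIDTH_PX = 800          # forum posting limit

setups = {
    "300D + Sigma 105 macro at 1:1":    22.7,
    "SD700 IS near closest focus":      27.0,
    "300D + Sigma 18-125 'macro' zoom": 101.0,
}

for name, field_mm in setups.items():
    px_per_mm = IMAGE_WIDTH_PX / field_mm
    ratio = px_per_mm / EYE_BASELINE_PX_PER_MM
    print(f"{name}: {px_per_mm:.1f} px/mm, ~{ratio:.1f}x eye detail")
[/code]

That prints roughly 2.9x, 2.5x, and 0.7x for the three setups, which lines up with the visual impressions above.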

This approach feels right to me -- characterize a photo as "macro" or not based on the amount of detail that it shows.
I'd be interested in other people's thoughts. Comments?
Thanks,
--Rik