Blue light for high resolution - post processing questions


rjlittlefield
Site Admin
Posts: 23561
Joined: Tue Aug 01, 2006 8:34 am
Location: Richland, Washington State, USA

Post by rjlittlefield »

Regarding the limitations of sRAW, https://photographylife.com/sraw-format-explained says this:
An uncompressed RAW / NEF file contains 14-bits of data per filtered pixel, so color and luminance information is demosaiced by software to form RGB pixel data. When you open a RAW file in Camera RAW or Lightroom, the software reconstructs the image in color by using a demosaicing algorithm on the bayer pattern. An sRAW file is already demosaiced and reconstructed by manufacturer’s camera firmware, so it does not contain most of the information from the RAW file.
In particular, the sRAW file does not contain information from the blue photosites in isolation, without interpolation or other contribution from the non-blue sites.

That particular article is talking about Nikon sRAW, but it seems that the description also applies to Canon's sRAW. See section 3.2, starting on page 10, of http://dougkerr.net/Pumpkin/articles/sRaw.pdf.
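
To make that concrete, here is a minimal sketch of what "the blue photosites in isolation" means in practice when you do have a full RAW file. This is not from either article; it assumes an RGGB Bayer layout and the rawpy package, and the file name is made up:

[code]
import numpy as np
import rawpy  # LibRaw-based reader; gives access to the undemosaiced Bayer mosaic

raw = rawpy.imread("frame.CR2")           # hypothetical file name
mosaic = raw.raw_image_visible            # the raw CFA values, no demosaicing at all

# Assuming an RGGB layout (check raw.raw_pattern for a given camera),
# the blue photosites are every second pixel, offset by one row and one column:
blue_sites = mosaic[1::2, 1::2]           # quarter-resolution, pure blue plane
[/code]

An sRAW file never exposes that mosaic, which is why the unmixed blue data can't be recovered from it.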

--Rik

Beatsy
Posts: 2105
Joined: Fri Jul 05, 2013 3:10 am
Location: Malvern, UK

Post by Beatsy »

Rik - after a few quick tests with full RAW files last night, I was beginning to suspect something like that was happening. Thanks for the confirmation.

However, I'm happy to report that I've cracked the problem of getting unaltered blue pixels out of the RAW files. Deep Sky Stacker's 'super pixels' were the answer - I just needed to find the obscure checkbox that saves the individual files as processing progresses. Briefly...

1. Capture a series of RAW files for the stack. sRAW(2) works fine but may be subject to the limitations discussed above. It's best to use full RAW, although the large files seriously slow subsequent batch processes.

2. Run the RAW files through Deep Sky Stacker in super pixel mode (with the option enabled to save a half-size 'super pixel' TIFF for each RAW file while processing). Super pixels create a single RGB pixel from each group of 4 pixels in the Bayer grid (RGGB), with the two greens averaged together.

3. In Photoshop, use Channel Mixer to create a monochrome TIFF from the blue channel of each 'super pixel' TIFF. I assigned this to an action so it can be run as a batch process on all the files in a folder. I also added a 50% resampled size reduction to reduce file size and oversampling in the images to be stacked.

4. Use the resultant 16-bit monochrome TIFFs for stacking in Zerene.
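
For anyone who would rather script steps 2-4 than go through Deep Sky Stacker and Photoshop, here is a rough Python sketch of the same idea. It is only a sketch under stated assumptions - an RGGB Bayer layout, 14-bit sensor data, and the rawpy and tifffile packages - not a reproduction of exactly what DSS does, and the folder and file names are made up:

[code]
import glob
import numpy as np
import rawpy       # reads the undemosaiced Bayer mosaic from CR2/NEF files
import tifffile    # writes 16-bit TIFFs

def blue_plane(path):
    """Return the half-size blue plane of one RAW frame, with no interpolation."""
    with rawpy.imread(path) as raw:
        mosaic = raw.raw_image_visible.astype(np.float32)
        black = float(np.mean(raw.black_level_per_channel))  # rough black-point subtraction
    mosaic = np.clip(mosaic - black, 0, None)

    # 'Super pixel' idea: each 2x2 RGGB cell becomes one output pixel.
    # Keeping only the blue site of each cell (assumed at row+1, col+1) gives
    # roughly what the Channel Mixer step pulls out of the DSS super pixel TIFF.
    blue = mosaic[1::2, 1::2]

    # Scale 14-bit data up to the 16-bit range with a fixed factor,
    # so brightness stays consistent from frame to frame.
    return np.clip(blue * (65535.0 / 16383.0), 0, 65535).astype(np.uint16)

# Batch-convert a folder of RAW frames to 16-bit monochrome TIFFs for stacking.
for path in sorted(glob.glob("stack/*.CR2")):        # hypothetical folder
    tifffile.imwrite(path + ".blue.tif", blue_plane(path))
[/code]

The resulting monochrome TIFFs can then go straight into Zerene as in step 4.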

So - later today, I can move on to the investigation I was *really* aiming at. Namely: does photographing in white light and then extracting the blue pixels produce the same image quality and resolution as photographing in blue light and extracting the blue pixels? I'll post results and pics later.


Post by Beatsy »

Some results. Images shown are 100% crops of PMax stacks of Auliscus sp. (103 images in each stack). I used a Plan Neofluar 40/0.95 water immersion objective and an N.A. 1.4 condenser used dry (so effectively N.A. 1.0).

Original images were full-size RAW but reduced 50% by converting to "super pixel" images using Deep Sky Stacker. This makes a single RGB pixel from each 2x2 group of RGGB pixels in the RAW Bayer matrix. The two greens are averaged together, while red and blue are used directly.

Image 1 was stacked "as is" using the RGB "super pixel" TIFFs, but the TIFFs for images 2 and 3 were further processed before stacking to extract the blue channel only and convert them to 16-bit monochrome. Images 1 and 2 were illuminated with white light during capture, image 3 with blue light. All 3 images were composited together before adjusting levels etc., to ensure consistency for comparison.

The results show a clear improvement in resolution using blue light, as expected. The interesting part is that it appears possible to illuminate with white light, shoot RAW, then extract the blue pixels from the Bayer matrix for improved resolution (and contrast) instead of using a blue filter. The difference between pics 2 and 3 is not as great as I thought it would be (though 3 is a little better). So in some cases, it may be good enough to capture full-colour RAW images and extract the blue channel (without debayering, of course) rather than end up with *only* blue images by using blue-filtered illumination. Note: images 1 and 2 are a good demonstration of this - they were both made from the same set of 103 RAW images.
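
As a rough check on how big a gain to expect, here is a back-of-envelope Abbe calculation. The numbers are illustrative assumptions (about 550 nm for the green-weighted white-light image, about 450 nm for the blue channel, and a dry condenser contributing roughly N.A. 0.95), not measured values:

[code]
% Abbe resolution limit with condenser illumination:
%   d = \lambda / (NA_{obj} + NA_{cond})
d_{\mathrm{green}} \approx \frac{550\ \mathrm{nm}}{0.95 + 0.95} \approx 290\ \mathrm{nm}
\qquad
d_{\mathrm{blue}} \approx \frac{450\ \mathrm{nm}}{0.95 + 0.95} \approx 237\ \mathrm{nm}
[/code]

That is roughly an 18% finer resolution limit just from the shorter wavelength, which is in line with the visible difference here.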

[Comparison images 1-3 attached in the original post]

Pau
Site Admin
Posts: 6052
Joined: Wed Jan 20, 2010 8:57 am
Location: Valencia, Spain

Post by Pau »

Very interesting results!
I'd like to see one more image: an RGB stack illuminated with blue light. Could you do it for comparison?
Pau


Post by Beatsy »

I also forgot to do an example of extracting blue from normally debayered RAWs. I know it will look similar to the first pic, but I may as well do it for completeness.

I'll do 'em tomorrow. Going out now...

Lou Jost
Posts: 5944
Joined: Fri Sep 04, 2015 7:03 am
Location: Ecuador

Post by Lou Jost »

Wow! Great result.

I had been experimenting with blue light a few months ago, but without worrying about the demosaicing. Looks like I should have done that!

But theoretically, you would get even better results if you used a blue LED as your light source instead of a blue filter. An LED is nearly monochromatic, so there will be essentially no chromatic aberration in the image; blue-filtered white light won't be so narrowband. There are cheap, bright blue LED flashlights available, and that is what I have been using.


Post by Lou Jost »

If you intend to do this often, you might consider using one of the Pentax (full frame) or Olympus (micro 4/3) cameras that use sensor-shifting to upsample an image. They produce a "super-resolution" image by putting the blue-sensitive element in all four possible positions (and more, in Olympus) so there is no interpolation needed for blue light. You can just use the blue channel of the final composite "super-resolution" image directly (though I think that because of channel cross-talk, you'd still want to use a monochromatic blue light source if possible).

dennisua
Posts: 98
Joined: Tue May 31, 2011 9:35 am
Location: Kiev, Ukraine

Post by dennisua »

You can use the Iris software that is popular among amateur astrophotographers.
It's free and supports your camera. It will give you the pure raw B channel.

You can get it here:
http://www.astrosurf.com/buil/us/iris/iris.htm

What you need is the SPLIT_CFA command (SPLIT_CFA2 for batches):
http://www.astrosurf.com/buil/iris/tuto ... c17_us.htm


It may look intimidating at first, but it works perfectly. And if you need help, I'll be glad to help. Quite a long time ago I did some experiments with split-channel stacking, and the results were noticeably better than the full-colour ones.
Dennis
