Having run into some issues with data formats, I revisited this recently.
My camera (an Atik 460EX) delivers 16-bit output, presumably rendered as 0 to 65535.
However, I see that the integer format in Prism is 16-bit signed.
Does this mean I can only represent pixel values from 0 to 32767? Do I lose 1 bit of accuracy?
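For what it's worth, the FITS standard itself handles this without losing a bit: BITPIX=16 stores signed integers, and unsigned 16-bit camera data is represented by adding a BZERO=32768 offset in the header, so the full 0 to 65535 range round-trips exactly. A minimal sketch of that convention (generic FITS behaviour, not anything Prism-specific):

```python
# FITS unsigned-16-bit convention: BITPIX=16 is signed (-32768..32767),
# but with BZERO=32768 the physical value is stored + BZERO, which
# covers 0..65535 with no loss of precision.

BZERO = 32768

def to_stored(physical: int) -> int:
    """Map an unsigned 16-bit ADU value to its signed stored form."""
    assert 0 <= physical <= 65535
    return physical - BZERO

def to_physical(stored: int) -> int:
    """Recover the unsigned ADU value from the signed stored form."""
    assert -32768 <= stored <= 32767
    return stored + BZERO

print(to_stored(65535))                # 32767, fits in signed 16-bit
print(to_physical(to_stored(65535)))   # 65535, round-trips exactly
```

So if the software writes BZERO=32768 into the header (most capture software does), a saturated pixel at 65535 is stored as 32767 and read back as 65535.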
We have reviewed what you have described. I'm using an ATIK 383L+, also a 16-bit CCD camera. First, I checked one of my images with the usual FITS tests: the Prism FITS header reports 16 bits, and the maximum ADU pixel value is 65535. Under Linux, Siril reports the test image as 16-bit colour depth, and SAOimage DS9 shows 16 bits with values up to 65535 in the pixel-value table at the galaxy core. Did you mean the 16-bit setting under the Prism Settings / Software Setup / Images (JPEG/TIFF) tab?