Saturday, January 28, 2017

How I made a better Celestron AVX

I'm in no position to complain; I got my AVX for jellybeans because a surplus shop thought it was broken. But if I had paid full price, I would not think well of the mount's performance right out of the box.



It isn't difficult to achieve subs of a minute or two, but:
  • Glitches happen. I've reloaded my firmware a couple times for the occasional "bootloader error."
  • It's basically impossible to balance, as the mount is so "sticky." Other people call this stiff, but having worked on industrial machinery I know that's bollocks.
  • The motors are clearly struggling. Some of this may be my fault, having tightened everything down to minimize backlash.
  • There's some roughness in RA. I found out why.
So I disassembled the thing completely, did some things, and then I was able to take 15 minute subs. Here's what I found/did...

Step one, remove the dovetail clamp:


First discovery. Think these are assembled in a precision lab? Noooot so much. Yes, that's a rock. Just randomly dropped into this thing that's supposed to be accurate to a few arc-seconds. Is it anywhere that would be a problem? No, but it's not a good start.


I unbolted the DEC assembly from the RA axis; it's just two bolts:


and disassembled the worm gear system. This all looked reasonably good, other than just having way too much grease on it. The grease is what makes the AVX so sticky. You want grease, especially where the worm meets the DEC drive, but they've overdone it quite a bit.


To disassemble the DEC rotating assembly you'll need to loosen two set screws through a small access hole on the back:


And then unscrew the tensioning...thing...from the assembly. I happened to have this tool for taking watches apart that worked well for this. I had to get creative later on though.


Assembly apart: as you can see, there are no bearings in this portion of the mount. It would be nice if they were there, but it's actually not too bad here. The mating surfaces have been turned nicely on a lathe, and the tensioning ring tightens against a pair of large nylon washers. Would bearings be better? Of course. But it's probably not as bad as many make it out to be.


 Pulling the RA apart was a bit more trouble. I had a bicycle tool that worked well for taking the fancy orange covers off as well as the two tensioning rings:


To remove the assembly you need to loosen these three set screws; they hold a thin metal plate and nylon washer against the altitude adjustment. Once those are loose you can tap the threaded pipe out and free the RA:


Unfortunately my fancy tool wouldn't work on the RA tensioning ring, so I had to improvise. Don't tell anyone where you learned this trick, but a couple of allen keys poked into the holes and a pair of pliers will either save the day or ruin something. Thankfully my day worked out well:

Note: there are THREE set screws on the RA axis to loosen
That's everything apart. And, well, oh dear...


Ok, one side uses a thrust bearing that gets tensioned against another machined surface. There are better options for this, sure, but this is probably fine too...


 This, on the other hand, feels like amateur hour. The side closest to the DEC is a regular radial bearing, which of course you shouldn't tension. So their solution was to mash it against another nylon washer. This is a terrible idea. Both the inner and outer rings of that bearing are pressed against the same surface.


This is a textbook use case for a tapered roller bearing, which looks like this:


See? That is meant to be loaded both radially and axially and still remain stable and solid. At this point I couldn't help but notice the example bearing that I literally grabbed from a box in the lab happened to look like it might just fit...


Ok, crazy lucky: the center bore was a matched fit for the rotating portion, and the outer ring was just a little bit small. You can see above that I turned a little spacer on the lathe (which was a pain; it's way too thin to reasonably be clamped in the lathe, but I made it work). Ultimately I was able to get a press fit on both mating surfaces of the spacer:

By "press fit," I mean it needed some light bumps from a hammer

If you'd like to duplicate what I did here, the bearing is an L68110. Unfortunately I can't recommend a source, because mine was "that box with random things like bearings in it." But they seem to be a pretty common bearing and go for $5-15 on the internet. 

Other than that, I wiped down every greased-up surface with a paper towel so that only a thin layer remained on everything. Once reassembled, the mount moved much more freely, and I was able to do a much better job of tensioning each axis to prevent any free play in the system.

I still balance in DEC by setting the telescope assembly (with camera, all bolted to its dovetail) on a round object and teetering it until I find the balance point. I just mark that and slide it back onto the mount, centering the balance point in the dovetail clamp.

But RA can't be balanced like that, and now the RA axis moves very freely. I even went back later and re-tensioned after everything had settled, and was able to get a very tight, but free-spinning, axis.

How much difference has this made? I don't have numbers yet as I'm still finding the right PHD settings, but so far things are more stable, it's easier to balance, and there's less play despite everything being much more aggressively tightened.

If you're like me, you really enjoy seeing all the bits of something that was taken apart. Enjoy!
Will this void your warranty? I would assume so. Could you undo it? Probably. So save the original bearing and nylon washer, just in case...

Sunday, January 22, 2017

Astrophotography: High ISO with short exposures or low ISO with long exposures?

It is a very, very rainy night here and I've had a question nagging at me for a while. There are lots of suggestions on the internet for what sorts of settings to use while taking astro-photos. Lots of statements. Lots of advice. Very, very little data*. So here's the question I've seen asked and answered, but never proven. Boiled down to the point:

For every hour's worth of imaging time, is it better to take long exposures at a low ISO or shorter exposures at a high ISO?

And I ask this, wanting to find a real reason, because I keep seeing complete nonsense regarding how cameras and ISO work. So here's what I did:



  1. Set up the camera in a dark closet, with only the glow of a single amber LED, pointing at SpaceDuck.
  2. Take a crap-ton of pictures: ~30 minutes of ISO 6400, 10 second exposures and ~30 minutes of ISO 400, 160 second exposures. Those should be equivalent exposures, though of course they won't match exactly (see the quick check after this list).
  3. Take 5 minutes of dark frames (same duration and ISO as their set) and 10 bias frames (1/32,000th of a second, matching the ISO of their set).
  4. Process them just like I would any astrophoto using DeepSkyStacker
  5. Present Results.
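Before the results, a quick sanity check that the two sets really are comparable: exposure scales roughly linearly with both ISO (gain) and shutter time, so the product of the two makes a decent equivalence test. A minimal sketch in Python, using the frame counts from the table below:

    # Are 10s @ ISO 6400 and 160s @ ISO 400 equivalent exposures?
    sets = {
        "High/Short": {"iso": 6400, "shutter_s": 10,  "frames": 180},
        "Low/Long":   {"iso": 400,  "shutter_s": 160, "frames": 12},
    }

    for name, s in sets.items():
        per_frame = s["iso"] * s["shutter_s"]   # 64,000 for both sets
        total_s = s["frames"] * s["shutter_s"]  # total integration time
        print(f"{name}: ISO*t = {per_frame}, total = {total_s}s")

    # High/Short: ISO*t = 64000, total = 1800s
    # Low/Long:   ISO*t = 64000, total = 1920s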
I legitimately went into this having no idea what I'd get on the other end. I knew some of the information being shared was definitely wrong, but that there were almost certainly factors that I hadn't considered. I also legitimately don't know which way I'd like to see it go. I've worked pretty hard to make my Celestron AVX reliably track for 15+ minute exposures, but of course I don't think anyone really enjoys losing a frame that long to a glitch/cloud/airplane/etc.

Unfortunately I failed to match the two exactly, but here are the stats (they're close, slightly in favor of the longer exposures, but probably not enough to matter):


                          High/Short    Low/Long
ISO                       6400          400
Shutter time (seconds)    10            160
F-stop                    2.8           2.8
Light frames              180           12
Light frame time (s)      1800          1920
Dark frames               30            3
Dark frame time (s)       300           480
Bias frames               10            10

I basically have an extra dark in there for the Low/Long set because 2 frames just seemed unfair as far as averaging goes.

These are all crops of the images because the originals are huge. But I'll share any originals if you want:

Single Frame, 10 seconds @ ISO 6400

Single Frame, 160 seconds @ ISO 400
That's pretty predictable. I chose this little section because it has a range of lightness from dark blue to white in the flag, plus it shows a little ducky texture. Ever wonder why you have to take your dark frames at the same ISO and exposure length? Here are a couple of singles, stretched using "auto tone" in Lightroom:
Dark, 10 seconds @ 6400 stretched

Dark, 160 seconds @ 400 stretched
and...
1/32000th @ 6400 stretched

1/32000th @ 400 stretched

Be careful recycling dark and bias frames!

The next thing to know is that when you stack a lot of information together, even tiny differences multiply. That's my excuse for the next set of images not matching exactly in overall brightness; ignore that and pay attention to how much information and detail is in them. First, the final results:

180 frames stacked together from 10 seconds each @ ISO 6400

12 frames stacked together from 160 seconds each @ ISO 400
and then with the gamma adjusted to 1.45 for the ISO 6400 final to get closer to the look of the ISO 400:

180 frames stacked together from 10 seconds each @ ISO 6400, 1.45 gamma correction
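For the curious, that gamma adjustment is just a power curve applied to normalized pixel values. A minimal sketch, which may not be exactly what your raw editor does internally:

    import numpy as np

    def adjust_gamma(img, gamma=1.45):
        """Brighten the midtones of a normalized (0.0-1.0) image."""
        return np.clip(img, 0.0, 1.0) ** (1.0 / gamma)

    # e.g. a midtone of 0.25 comes out around 0.38 with gamma = 1.45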

The plot thickens. If you ask me, those two are not all that far apart. Maybe there's a hint more ducky texture in the ISO 400 stack? Maybe that's just how the luminosity stretched? Let's step up the exposure on them in a couple of increments to see what data is hiding, in case you were trying to pull faint nebulosity out of SpaceDuck. Note that these are stretched without the gamma correction, so the ISO 400 is going to appear a little lighter. Focus on the details:

+2 stops

180 x 10s @ 6400 +2 stops exposure

12 x 160s @ 400 +2 stops exposure

+5 stops

180 x 10s @ 6400 +5 stops exposure

12 x 160s @ 400 +5 stops exposure

+10 stops

180 x 10s @ 6400 +10 stops exposure

12 x 160s @ 400 +10 stops exposure

So what do I think? I think the two are remarkably close, with a slight edge to the long exposures/low ISO. But it's much closer than I thought it would be. I think in the end I'll end up setting the camera to a high ISO, maybe even 6400, and taking more frames. Why? Because I think at the end of the night I'll end up with more total useful integration time. Too many things are out of my control when everything has to stay just right for 15 minute blocks.


Other data/thoughts about the test in case you're wondering:
  • The camera is a Fuji X-T1. I don't know if Nikons, Canons, Panasonics, etc. work the same way as far as gain/ISO is concerned. Fuji is known for being wonderfully weird, so do your homework. Better yet, repeat the test and post it, because that's how science works :)
  • All files were taken as Fuji RAF raw files, which DSS doesn't like for me, so they were converted to DNG raw files. This probably doesn't make any difference, but it might, especially given Fuji's similarly weird take on the Bayer filter pattern.
  • More dark frames would likely be advisable for this amount of integration time.
  • I discovered Fuji reports exposure length incorrectly in the EXIF data. Things kept coming back saying they were 9 seconds and 170 seconds. I double-checked manually with a stopwatch; those very much were 10 and 160 second exposures. Not sure what the story is.
  • Yes, 1/32,000th of a second. The X-T1 can do a purely electronic shutter at ultra high speed, which is helpful for capturing read noise without any other signal.
  • When I auto-toned the two fullsize final images there is definitely a shade more detail in the absolute blackest background sections of the ISO 400 image. In a real image I think I would be powerless to distinguish that signal from the noise, and would opt for a longer exposure in either case if that was detail I was actually trying to capture.

*I'm not calling anyone out here. It happens. I can't even begin to explain how many times I've stopped after I said something in a barroom conversation and followed up with "now that I think about it I have no idea where I heard that or if it's true..."

Friday, January 13, 2017

Astrophotography resolution: what should "good" look like?

One of the interesting challenges of astrophotography is that there are so very many factors that go into the result; basically, it can be difficult to determine what went wrong. Pull down your face shields, we're going to do something similar to science...

I'm going to use my rig as an example because, well, I did this math already. But that's ok, because you can use it to get some idea of things. I'll give you the math if you're really inspired to scribble on a big chalkboard and draw diagrams to impress your ladyfriend (or gentleman friend... like other species, geeks and nerds come in a variety of standards and even non-discrete units; gender/sexuality can't be determined by Millikan oil drop).

The non-variables:

  • I have a 6" f/4 telescope with optics of unknown quality
  • Through these optics, each pixel represents 1.46 arc-seconds of sky/star
  • My tracking scope (ST80) and its little camera reproduce 1.93 arc-seconds per pixel
  • I have some messy data from my tracking scope in operation (used below)
  • The angular diameter of Sirius is 0.006 arc-seconds
I obtained the arc-seconds/pixel numbers for the two scopes by plate solving images taken from them using the wondrous tools at nova.astrometry.net.
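As an aside, those scales tie back to the hardware through the standard image-scale formula. A quick sketch: the ~4.8 µm pixel pitch is my own estimate for the X-T1, and the focal length just works backwards from the plate solve (the coma corrector stretches the AT6IN's native 610mm a bit):

    def image_scale(pixel_um, focal_length_mm):
        """Arc-seconds of sky per pixel."""
        return 206.265 * pixel_um / focal_length_mm

    # ~4.8 um pixels at roughly 680mm effective focal length:
    print(image_scale(4.8, 678))   # ~1.46 arc-seconds per pixel

Plate solving is still the more trustworthy route, because it measures the real effective focal length, corrector and all.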


So very simply, if the angular diameter of Sirius is less than one pixel of the sensor, and everything were amazingly perfect, it should look like this when magnified:

Not real life


If you claim to have seen such a thing you're a liar and a scoundrel.

Any telescope's resolution is limited by the physics of light itself: even a perfectly figured optic can't focus a point source back to a perfect point. No matter how good the surface is relative to the light's wavelength, a pinpoint of light will reproduce as a central disk with rings of interference around it, roughly 84% of the light in the middle and 16% in the rings. How long you expose the image determines how bright the rings are, up to the point that they no longer appear as rings on the sensor and instead read as one larger disk. This is why images of bright stars fill more pixels than dim ones. The size of this disk is set by the diffraction limit, which is:

1.22 × wavelength(cm) / diameter(cm), in radians.

So for light somewhere in the middle of the spectrum (0.00005cm) and a diameter of 15.2cm, we get 0.83 arc-seconds. Astro-Tech lists the scope's resolution as 0.76 arc-seconds. Isn't that interesting? At any rate, that's the size of the central disk, so in theory a short enough exposure would still render a single pixel. The diffraction rings, which look a bit like this:

Not to scale with the other fake pixels


would render as pixels something like this with sufficient exposure:

Real life if you're in space

That's looking more like a star in a telescope like we're used to. This takes care of your Dawes limit, Rayleigh criterion, or whatever else you subscribe to. Don't get too picky on the differences between those; we're taking pictures from the bottom of a deep pool.
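If you'd like to check that arithmetic yourself, the diffraction limit works out in a couple of lines:

    import math

    wavelength_cm = 0.00005   # ~500nm, middle of the visible spectrum
    diameter_cm = 15.2        # 6 inches of aperture

    theta_rad = 1.22 * wavelength_cm / diameter_cm
    theta_arcsec = math.degrees(theta_rad) * 3600
    print(f"{theta_arcsec:.2f} arc-seconds")   # ~0.83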

Being in Florida, I'm looking through the equivalent of about 30 feet of water in a best case scenario. It's a swampy, swampy state with a lot of dense, wet air starting at sea level. This produces "seeing" quality issues. The dense, wet air refracts light the same way that a glass (refractor) telescope does, except that it is constantly shifting with air currents, hundreds of times per second. If I were on Mauna Kea again, that would distort the location of a given star (and its diffraction rings) by about 0.4 arc-seconds on a good night. Here? It's probably 2 arc-seconds on a good night, and likely 3 most of the time. Let's go with 2.5, or 1.7 pixels. Yes, I'm skipping over the concept of FWHM here because it's a calculus problem and you don't really need it for this sort of back-of-the-envelope look at things. Maybe another time. New image:

Scuba/Swamp Vision

That's basically my expected detail if I get everything else completely right with a significant exposure. Shorter exposures could of course render a smaller image. In fact, if the exposure was shorter than the timescale of the atmospheric eddies causing the seeing (and I got lucky with a moment of very little distortion), and short enough not to expose the diffraction rings, it could take up a single pixel. Instead, this is what I've got:

This should really only take up 4 or so pixels

The shift of colors from red to blue tells me there's chromatic aberration, and because I can see it on other, more significant things in that image, comatic aberration as well. How do I know the stretch isn't from tracking?

PHD2 unfortunately doesn't like my camera and won't let me put in a pixel size value for it, so it only reports deviation in pixels. Since I was able to plate solve an actual image from the camera though, it's easy math (arc-seconds = deviation in pixels × arc-seconds per pixel; there's a quick sketch of the conversion after these numbers):
  • Average deviation of 0.45 pixels = .87 arc-seconds
  • peak deviation of 1.5 pixels = 2.95 arc-seconds
Those are for the ST-80, so with arc-seconds as the common unit, going the other way for the AT6IN/Fuji combo:
  • average deviation of .87 arc-seconds = ~0.6 pixels
  • peak deviation of 2.95 arc-seconds = ~2 pixels.
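Here's that conversion spelled out, along with the seeing estimate from earlier (small differences from the numbers above are just rounding):

    GUIDER_SCALE = 1.93   # arc-seconds per pixel (ST80 + guide camera)
    IMAGER_SCALE = 1.46   # arc-seconds per pixel (AT6IN + X-T1)

    for label, dev_px in [("average", 0.45), ("peak", 1.5)]:
        arcsec = dev_px * GUIDER_SCALE       # guider pixels -> sky
        imager_px = arcsec / IMAGER_SCALE    # sky -> imaging camera pixels
        print(f'{label}: {dev_px} px = {arcsec:.2f}" = {imager_px:.1f} imager px')

    seeing_arcsec = 2.5                      # a good night here
    print(f"seeing blur: {seeing_arcsec / IMAGER_SCALE:.1f} imager px")   # ~1.7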
The image is stretched over at least 3 pixels, so the only other candidate beyond my optical issues is focus. I'm focusing with a Bahtinov mask, which gets me a level of focus that I can't improve by eyeballing pixels alone.

So...collimation, optics, aberration.

For reference, here's a bright star with a longer exposure. It's a little harder to tell what's going on there, but you get the idea:

Astro-probs.




Thursday, January 12, 2017

Astrophotography updates, buying problems for myself

New optics!


The main scope (for imaging) is an Astro-Tech AT6IN, and the new tracking scope is an Orion ShortTube-80, or ST80. There are a dozen versions of the ST80 under different names, all made by Synta for the various retailers.

I wanted something with a wider field of view; in this case the focal length is 610mm. Where the previous scope (a Celestron C6 SCT) rendered about 1.1 arc-seconds/pixel on my Fuji X-T1 camera, this renders closer to 1.5. In other words, each pixel sees more sky.

This offers a few advantages:
  • Tracking does not need to be as precise (we'll get to how much...)
  • I can image larger objects, such as the Pleiades, Rosette, and Horsehead/Flame nebula
  • Being the same aperture (6") but a shorter focal length, it's also putting much more light on each pixel every second the shutter is open.
How much more? If the previous scope was more or less F6.3, and this is F4, that's about 2.5x more light. So if I needed a 60 second exposure before, this would need a 24 second exposure. 
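The arithmetic, in case you want to plug in your own scopes:

    # Light per unit of sensor area scales with the inverse square of the f-ratio.
    old_f, new_f = 6.3, 4.0
    speedup = (old_f / new_f) ** 2
    print(f"{speedup:.1f}x more light")              # ~2.5x
    print(f"60s before -> {60 / speedup:.0f}s now")  # ~24s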

And now I'm going to tell you why this was a terrible decision.

A short focal length Newtonian is a mess. It naturally has a ridiculous amount of comatic aberration, which is inherent in all large optics but is exaggerated the shorter the focal length. Without a corrector this scope is basically worthless.

The coma corrector (made by GSO) mostly fixes this... but your collimation (having all the optics at perfect angles to each other so that the light path is focused evenly/flatly on the image sensor) has to be really, really perfect. I've seen estimates that at f/4 the image starts breaking down when the light path deviates by as little as 0.45mm from true.

0.45mm. Let that sink in. You know how wide the bullseye is on a typical laser collimator? About 4mm. Part of that is because the output optic for a typical laser diode is 3mm.

So what you're doing is taking a really nice, wide-angle image that should be able to get beautifully sharp and subjecting it to something that begins breaking down at a tolerance 8-9x tighter than the equipment you're going to calibrate it with.

Now suppose you're like me, the type that will stretch a thin film over your collimator so that you can see when the return light path, which is focused much smaller than the exit light path, makes a nice bullseye in the exit path. Assuming you also loaded your collimator in a lathe at some point and centered its beam to within a few mm at 50ft, you're probably able to get within the margin of error.

If.

If the rest of your optics are aligned correctly, which... mine were not. Worse, they couldn't be: if you have a closer look at the image above you'll spot some extra holes where the secondary is mounted. My secondary mirror was too far down the tube to align correctly. Yay.

And then it still won't be good enough.

You'll hang a heavy imaging train off the side of the scope, which will cause the focuser to flex off-center on its mount. Your imaging train has to be especially awkward because there's a heavy corrector optic in it, which then needs about 75-80mm of spacing (mine does best at 78mm) before it reaches the imaging plane, which is probably a mirrorless camera or DSLR. All of this shifts things out of alignment by a couple mm, which is enough to notice.

I think Newtonians might just be a bad idea anyway.

Once you've done all of this, you'll have a system which is very out of balance for the mount. The camera will sit at a different axis from the finder and tracking scopes (otherwise it will be in their way). You could add weights opposite of the focusing assembly, but of course this stresses the mount even more.

So...now what?

I don't know. I'm going to keep playing with it for the moment, and try not to get any farther down the rabbit hole unless I think I can make it truly work out. I have managed to take a couple ok-ish images with it, but far short of what I think my setup could otherwise do:

Pleiades, stack of several 180s exposures from the Astro-Tech AT6IN and Fuji X-T1. Of course from my fully light polluted Central Florida skies.

I like the wider field of view very much. Note: this was taken while I was still dialing in the coma corrector spacing, so it looks worse here than in some of my tests.

Wednesday, July 6, 2016

360 degree microphotography


Summary: we put a microscope objective on my camera, built a focus stacking rig, and then combined it all with a miniature lazy susan / turntable to create a 360 degree rotation animation of a microscopic subject (in this case a common green long-legged fly, which has some pretty spectacular colors).

Quick details:

  • Each frame is a stack of 70 images
  • 160 frames (about 2.25 degrees of rotation per frame)
  • The final animation is the product of 11,200 exposures (though we took well over 20,000 while testing/developing); the sequencing logic is sketched below the gear list
Gear setup:
  • Fuji X-T1
  • nameless eBay macro photography bellows
  • nameless eBay RMS adapter
  • AmScope PA4X microscope objective
  • Arduino Nano
  • Misc stepper motors and motor drivers
  • Custom motion control rig for focus stack and rotation
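The motion control logic itself is just two nested loops. The real rig runs on the Arduino, but here's the shape of it as a Python sketch, with hypothetical fire_shutter/step_focus/step_turntable helpers standing in for the actual motor and shutter drivers:

    FRAMES = 160        # rotation steps: 360 / 160 = 2.25 degrees each
    STACK_DEPTH = 70    # focus slices per frame

    def fire_shutter(): ...        # trigger the camera (stub)
    def step_focus(steps): ...     # move the focus rail (stub)
    def step_turntable(): ...      # rotate the subject 2.25 degrees (stub)

    def capture_all():
        for frame in range(FRAMES):
            for _ in range(STACK_DEPTH):
                fire_shutter()           # one image per focus slice
                step_focus(+1)           # advance one slice
            step_focus(-STACK_DEPTH)     # rewind the rail for the next frame
            step_turntable()             # move to the next rotation position
        # total images: FRAMES * STACK_DEPTH = 11,200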

Saturday, November 14, 2015

Astrophotography: Solving Some Problems, Finding New Ones

I made something I like!





These are some of my changes:

  • Better polar alignment; I'm not seeing field rotation. This is my longest exposure yet, at 480 seconds, which I'd expect to show that sort of thing if the alignment were off by much at all.
  • Neodymium filter for light pollution, it blocked quite a bit!
  • More spacers between the reducer/corrector and the camera to reduce the vignette problems to a minimum.
You can see I haven't fixed the primary tube reflection yet; I'm still waiting on materials to be delivered.

I plate solved the image using Astrometry and came up with 1.1 arc-seconds per pixel. The theoretical limit of my 6" scope is around 0.9 arc-seconds. Seeing conditions were much worse than that, and for where I am will probably never be better than about 2 arc-seconds. So I think we are in a good place.

I also calculated out an effective focal length of 926mm at f/6.2, not far from the advertised 945mm f/6.3 the reducer/corrector is supposed to give. I didn't try plate solving previous images, but I can tell you with some certainty that the Celestron recommendation of 105mm between the corrector and focal plane, as well as the internet's prediction of 85mm, are wrong. Total distance from the back of the corrector to the surface of my sensor is 155mm. If you have a Celestron C6 and the standard f/6.3 reducer this is probably about where you want to be, at least if you are using an APS-C sized sensor.
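The effective focal length falls out of the plate solve with the standard image-scale formula. A sketch, assuming the X-T1's ~4.8 µm pixel pitch (the gap between this and the 926mm figure above is just the rounding of the 1.1"/px scale):

    def effective_focal_length(pixel_um, arcsec_per_px):
        """Invert the image-scale formula to get focal length in mm."""
        return 206.265 * pixel_um / arcsec_per_px

    print(effective_focal_length(4.8, 1.1))   # ~900mm with the rounded scale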





For pure resolution, however, our tracking is not quite perfect. Here's a single color channel from the image above, cropped to the center of the nebula where the trapezium stars are very close together:


And the same spot, but with a quick 1 second exposure:


You can clearly see all four stars as separate in the 1 second shot. This means that my guiding either isn't reacting fast enough, isn't predictable enough, or doesn't have enough resolution to keep things perfectly centered. I suspect the latter is my issue, and I will probably need a small scope with a longer focal length to keep up with the main telescope's resolution. Seeing conditions are at play here too, but unless I was particularly lucky with that 1 second picture they should be affecting both images similarly.

For reference, the two stars closer together there are 8.7 arc-seconds apart:


Also, now that the field rotation is gone/minimized I can see we have some comatic aberration on stars at the outside edges of the frame:


Note that's coma, not chroma. This is the nature of the telescope: stars in the center of the field focus to the same point no matter where on the optics the light was gathered, but at the edges of the field stars focus to slightly different places depending on how close to the center of the optics the light came in. The reducer/corrector may be helping this or hurting it, depending on how much you believe I have moved the camera away from the "optimum" focal distance. There are additional correctors to help this, but it's so subtle I'm going to leave it alone for now.


Friday, November 13, 2015

Cheap light pollution filter

As soon as you say "filter," the public perception is that the images are photoshopped, or at least a little bit fake. I'm not one for heavily processing my images in other kinds of photography, and I'm more interested in showing what's up there in the night sky on terms the general public is familiar with. There will always be limitations to this: "Is this what I would see?" doesn't exactly work in the world of nighttime photography in general.
  • Your eyes are more sensitive, but they can't accumulate light over time like a camera. So things can be much brighter in pictures.
  • Your eyes see vivid color in the daytime, but the darker it gets, the more your eyes rely on the "rods" of your retina, which are monochromatic; things appear a bit more blue than grey, but sensing reds, greens, and vivid blues is out.
  • You're looking through a telescope, which is a form of filtering on its own. Your viewing angle is cut down from maybe 170 degrees to 1-2 degrees, and your effective pupil is expanded from a few millimeters to the diameter of the telescope to gather more light (a quick sense of scale below).
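For a sense of scale on that last point, light grasp goes with aperture area. A quick sketch, assuming a ~7mm dark-adapted pupil against a 6" (152mm) aperture:

    # Light gathering scales with the square of the aperture diameter.
    pupil_mm = 7.0     # typical dark-adapted pupil (my assumption)
    scope_mm = 152.0   # 6 inch aperture

    print(f"{(scope_mm / pupil_mm) ** 2:.0f}x the light of the naked eye")   # ~470x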
But none of those things involve actual filters, which is what I'm adding to the system now. I'm going to try to keep the color "real" as much as possible, at least for now.

Astronomy filters can be very specific, and the cost of even the simpler ones is very high. I'm only a few miles outside of the city and live where the air is quite thick, which means street lamps add an orange glow to the sky and really get in the way of seeing what's up there. The goal is to take pictures of things outside of our atmosphere, not the atmosphere itself, right?

The orange comes from sodium, and the good news is that there's a cheap solution. "Red enhancing" or didymium filters are made with neodymium, which happens to block that range of light without blocking much else. Amazon had the 52mm version for $23, which happens to be exactly the size I need. I just had to remove the glass from its threaded lens mount so I could put it into the telescope:



You can see by looking at the white cloth under the filter that it doesn't change the color very much.


I wedged it into the T2-FX adapter and used some cardboard as a spacer. I will replace the cardboard with something better and less reflective soon, but this was more than enough to test with.

Thursday, November 12, 2015

Image artifacts: Astrophotography is touchy

Once I tied the camera to the back of my telescope and set up guiding I found plenty of new problems to solve:


I've boosted that image a bit to make the flaws really obvious; this is a narrow view of the Pleiades. Surprisingly, you can see some of the blue wisps of nebula around the stars (ignore the horseshoe shapes; those are artifacts)!

My scope setup is:
  • Celestron C6
  • Celestron f6.3 focal reducer/corrector
  • Standard Celestron SCT-T2 adapter
  • T2-FujiFX adapter
  • Fuji X-T1
Tracking is:
  • Really cheap Orion 9x50mm
  • Even cheaper Microsoft LifeCam with the filters and lenses removed
  • PHD2 guiding software
  • GPUSB-ST4 box
Everything that's wrong:
  • The orange-brown glow is my light polluted skies. We get nights that are clearer than that, but I think a filter would go a long way.
  • Stars in the center are nice and round, but the farther from center they are, the more they're smeared into arcs. I believe this is actually a testament to how well autoguiding works: the system stayed locked on the guide star, but my polar alignment was off, so over a long exposure (300 seconds) the scope was not rotating exactly in tune with the sky. This makes the field appear to spin around the guide star.
  • This thing that's going on:

    is vignetting: light is being blocked by the sides of the scope's central baffle tube. This might mean the camera is the wrong distance from the focal reducer (I don't get this without the focal reducer, but then my field of view is crazy narrow). The outlet of the scope is also narrow, which will force me to put the camera at a not-so-optimal focal distance.
  • This one is light reflecting off of the inside of the telescope:

    looks like a mix of the main body (very pale, I might not bother) and the primary mirror passthrough baffle. It's black but a bit glossy, and this is apparently a common complaint for the C6. There's even a very faint reflection inside the reflection for the brightest off-axis star; that's probably the T2 adapter tube.
The solutions are going to be a mix of things:
  • Better polar alignment when imaging
  • Light pollution filter of some kind. I'm guessing 90% of sources are sodium based.
  • Move the camera back from the reducer until the vignetting goes away; this turns out to be pretty far back. I've heard the "correct" number quoted at either 85mm or 105mm; the first place I found it vignette-free was close to 140mm between the corrector and the image sensor plane.
  • Black out the primary tube with ProtoStar stick-on flocking. This might never go away completely; apparently SCT-style scopes often have this issue. Sadly I don't have access to that crazy science-grade paint that blocks 99.99xx% of light. If you do, give me a holler.
Overall I think this is going pretty well, but I'm sure I'll find new problems once I fix some of these...