This week a Python script filled with shiny new functions descended from the heavens into my lap. Like a divine tablet, I lugged “shift_methods.py” down from Mt. Moodle and into the desert of my working repository. I started the work of tearing my shoddily built shift functions from my module, sparks and wires flying, until the dust had settled. My new cross-correlation function, pieced together like a cyborg, was ready for action.
It worked spectacularly. The new functions our instructors passed along use cross-correlation to find matching features between two images, then use a centroid fit on the correlation peak to measure the offset needed to bring the images into alignment. I used header information from our files to build lists of frames from each cycle to stack, so that we can sample our data more frequently. That stacking timescale can also be lengthened from a single cycle to a whole night, or even a full observing run, to create deeper images.
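The actual shift_methods.py isn't reproduced here, but a minimal sketch of the same idea, using FFT-based cross-correlation to recover an integer-pixel offset and stack the aligned frames (function names and details are my own, not the instructors'), might look like:

```python
import numpy as np

def find_offset(ref, img):
    """Estimate the (dy, dx) shift of img relative to ref via FFT cross-correlation."""
    # Correlation theorem: cross-correlation = IFFT(FFT(ref) * conj(FFT(img)))
    corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(img))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap peaks past the array midpoint back around to negative offsets
    dy = peak[0] if peak[0] <= ref.shape[0] // 2 else peak[0] - ref.shape[0]
    dx = peak[1] if peak[1] <= ref.shape[1] // 2 else peak[1] - ref.shape[1]
    return dy, dx

def align_and_stack(ref, frames):
    """Shift each frame onto the reference (integer pixels, wrap-around) and average."""
    stack = [ref]
    for img in frames:
        dy, dx = find_offset(ref, img)
        stack.append(np.roll(img, (dy, dx), axis=(0, 1)))
    return np.mean(stack, axis=0)
```

A real pipeline would refine the peak to sub-pixel precision (that's where a centroid fit comes in) and use interpolation instead of `np.roll`, but the core logic is the same.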
Our next step in data analysis will be to reduce each night of data and begin to extract photometry from our stacked images.
Our proposal draft was submitted last week; once comments come back, the plan is to implement whatever changes are needed on that front and to elaborate on our photometry methods. Because we are using H-alpha emission as an accretion and activity tracer, differential photometry (which the rest of the class will be performing) becomes somewhat complicated.
Normally, differential photometry works by comparing each star in the frame (which may or may not be variable) against a star of constant luminosity (i.e. a non-variable comparison star), in practice by dividing fluxes or subtracting magnitudes. This way, you get the relative brightness of the objects in your frame without having to calibrate your photometry against a standard star.
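As a quick sketch of why this works (my own toy function, not our pipeline): frame-to-frame losses from clouds or airmass hit the target and the comparison star equally, so they cancel in the flux ratio.

```python
import numpy as np

def differential_lightcurve(target_flux, comp_flux):
    """Differential magnitudes of a target relative to a constant comparison star.

    Transparency and airmass variations multiply both stars' fluxes by the
    same factor, so the factor cancels in the ratio; no standard-star
    calibration is needed.
    """
    return -2.5 * np.log10(np.asarray(target_flux) / np.asarray(comp_flux))
```

If the target is intrinsically constant too, the differential magnitude stays flat even as the raw fluxes swing around.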
The problem for Lena and me is that, because we want to scale and subtract our narrowband images from each other to isolate the H-alpha emission, any non-accreting stars will disappear in that subtraction, taking our would-be comparison stars with them. Personally, I’m way too tired this Tuesday to figure out a solution to that problem, but I have no doubt that between us and our lovely instructors, we’ll be able to figure out a plan by the end of the week.
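To make the problem concrete, here is a toy version of the subtraction (all names and the scale-estimation choice are my own assumptions, not our actual method): the scale factor is chosen so that pure-continuum stars cancel, which is exactly why they vanish from the subtracted image.

```python
import numpy as np

def estimate_scale(on_fluxes, off_fluxes):
    """Estimate the on-band/off-band scale factor from stars assumed to have
    no line emission, here as the median of their flux ratios."""
    return float(np.median(np.asarray(on_fluxes) / np.asarray(off_fluxes)))

def continuum_subtract(on_band, off_band, scale):
    """Subtract the scaled off-band (continuum) image from the on-band image.

    Sources with only continuum flux cancel by construction; what survives
    is the H-alpha excess from emitters such as accreting stars.
    """
    return on_band - scale * off_band
```

In this toy picture, a star with 30 counts off-band and a pure-continuum ratio of 1.5 contributes 45 counts on-band and subtracts to zero; an accretor with 5 extra counts of line emission survives with exactly those 5 counts.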
A bit of a short post this week, but I’m busy at work in the mines of data reduction. Until next week!