i helped capture the first image of an exoplanet taken with JWST!

in which i walk on the shoulders of giants and see some cool photons

my PhD advisor Laurent messaged me a few months ago letting me know that some new JWST data had been taken by an Early Release Science team. we had talked previously about getting me involved in this project – an effort by the direct imaging community to take the first images of an exoplanet with JWST – but had decided against it, because i was spending most of my time on my work in optical interferometry. so when the data first came down, our conversation went something like:

L: “MIRI ers data is online”

W: “woah. are ers ppl already on it?”

L: “Yes. But do it”

L: “Like if you think that’s fun”

[30 minutes later]

W: “GOT IT” (image attached)

my first pass at what would become the first images of the exoplanet HIP 65426 b from JWST. i sent this to Laurent the morning after the data was sent to Earth from the telescope.

My first pass at the data resulted in a fairly poor signal-to-noise detection because, as it turns out, some post-processing tricks work better for some optics on JWST than others – that’s what the ERS program was all about: determining what tricks we could pull to get the highest quality images. What we learned through the course of commissioning and the ERS program is that different permutations of observing strategy, post-processing, and starlight-subtraction technique can be optimized for different planet separations. I used angular differential imaging (ADI) to remove the starlight in my first images, but for this particular four-quadrant phase mask coronagraph, reference differential imaging (RDI) makes all the difference. Check out the image comparing ADI to RDI from our paper.

Figure 2 from Carter et al. 2022, showing our images of the planet captured using two different cameras (NIRCam on top, and MIRI on bottom), two different coronagraphs (a round Lyot coronagraph for NIRCam and a four-quadrant phase mask for MIRI), and different post-processing/observing strategies (ADI, RDI, or both at once). My first glimpse at this planet was with MIRI/ADI, which turned out to be the least optimal permutation!
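To give a flavor of what these starlight-subtraction strategies actually do, here’s a minimal, hypothetical sketch of the classical versions of both – not our actual pipeline (that’s spaceKLIP, which uses far more sophisticated algorithms like KLIP), just toy numpy implementations of the underlying ideas, assuming image stacks are plain arrays:

```python
import numpy as np
from scipy.ndimage import rotate

def classical_adi(sci_frames, roll_angles):
    """Classical ADI: the star's PSF stays fixed on the detector while the
    sky (and planet) rotates with telescope roll, so the median over frames
    estimates the starlight. Subtract it, derotate, and stack.

    sci_frames: (n, ny, nx) science images; roll_angles: degrees per frame.
    """
    psf_model = np.median(sci_frames, axis=0)   # quasi-static starlight
    residuals = sci_frames - psf_model          # planet signal survives
    derotated = [rotate(r, -a, reshape=False)
                 for r, a in zip(residuals, roll_angles)]
    return np.mean(derotated, axis=0)

def classical_rdi(sci_frame, ref_frame):
    """Classical RDI: scale an image of a reference star (no planet) to
    best match the science frame in a least-squares sense, then subtract."""
    scale = np.sum(sci_frame * ref_frame) / np.sum(ref_frame ** 2)
    return sci_frame - scale * ref_frame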

It was pretty astounding to be one of the first people to see JWST’s first directly imaged exoplanet, and since I had gotten a handle on the data reduction so quickly, I pivoted my summer work to joining and helping out with the ERS project.

The experience really was standing on the shoulders of giants: joining a project at the last minute and mashing together code written by many different collaborators to test and investigate this newfangled tech. I worked primarily on reproducing Aarynn’s data reduction and on developing spaceKLIP, our code repository, in order to fit models to our images.

In the end, I had a lot of fun, because the project brought me back to my roots: doing science very similar to my first ever research project, on imaging data from the latest and greatest observatory out there. I’m really grateful to my advisor for encouraging me to explore this stuff, and to the ERS team for letting me crash the party and annoy them.

My absolute favorite thing was getting to turn this data that I had worked fairly hard on into art, generating these false-color, photometrically balanced RGB images of our NIRCam data. Since JWST observes in the infrared, there’s no way to map the images we take directly onto colors our eyes can see. Instead, I assigned three monochromatic images taken in three different filters to red, green, and blue – the shortest wavelength as blue and the longest as red – and balanced the colors to the actual brightness of the planet that we measured from the data. So the image is “false color,” but if we had infrared eyes, we might see approximately the same colors as in my picture.

and a labeled version, so you know what you’re staring at
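The color-mapping recipe above can be sketched in a few lines of numpy. This is a simplified, hypothetical stand-in for what I actually did (the real photometric balancing involves our measured fluxes and calibration details), just to show the idea of weighting each channel by a measured brightness:

```python
import numpy as np

def false_color_rgb(im_short, im_mid, im_long, fluxes):
    """Map three monochromatic filter images onto an RGB cube, with the
    shortest wavelength as blue and the longest as red.

    fluxes: measured brightnesses in each filter, ordered short -> long,
    used to balance the channels so relative color reflects the data.
    """
    channels = []
    for im, flux in zip((im_long, im_mid, im_short), fluxes[::-1]):
        norm = im / im.max()          # normalize each image to [0, 1]
        channels.append(norm * flux)  # then weight by measured brightness
    rgb = np.stack(channels, axis=-1)  # (ny, nx, 3): R, G, B
    return np.clip(rgb / rgb.max(), 0.0, 1.0)
```

Feeding in three filter images ordered by wavelength gives an array you can hand straight to an image viewer; the channel with the brightest measured flux dominates the final hue.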

There are some funky things about the image. In particular, the flower-shaped pattern of the planet’s light is generated by the combination of the hexagonal mirror segments on JWST, the coronagraph we put in the optical path to block the light from the host star, and the other optical elements that guide the light onto our camera. That flower pattern also gets bigger at longer wavelengths of light, so the petals of the flower look like little rainbows, because the petals don’t completely overlap in each color channel.

The big takeaway for me, though, is that the planet appears fairly white, while the background stars appear very blue. This is what we’d expect: very hot things (like stars) glow blue (think of the center of a blowtorch flame), while cooler things glow with white or reddish hues. How white or red the planet is can tell us about the carbon monoxide and water in the planet’s atmosphere – different amounts of these molecules change how much light the planet emits at these wavelengths.
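That hot-things-glow-blue intuition is just Wien’s displacement law, and a back-of-the-envelope check is easy. This isn’t a calculation from our paper – just standard blackbody physics, with a planet temperature in the rough ballpark of a young giant like HIP 65426 b:

```python
# Wien's displacement law: lambda_peak = b / T, with b ~ 2.898e-3 m*K.
WIEN_B = 2.898e-3  # Wien's displacement constant, meters * kelvin

def peak_wavelength_um(temp_k):
    """Wavelength (in microns) at which a blackbody at temp_k glows brightest."""
    return WIEN_B / temp_k * 1e6

# A Sun-like star (~5800 K) peaks around 0.5 microns (visible light),
# while a ~1400 K young giant planet peaks near 2 microns, in the infrared --
# which is why JWST's infrared cameras are so well suited to imaging it.
```

Hotter stars push that peak even further blueward, which is why they pop out as blue dots in the false-color frame while the much cooler planet looks comparatively white.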

I mostly wrote this post so I’d have somewhere to post my pretty picture, but thanks for reading! Stay on the lookout for new direct imaging results from JWST (maybe even some from yours truly in a year or two). Until then, clear skies.
