One thing they mentioned that kind of drives it home for me is that the telescope is aimed at the dark, away from the sun, and the other side is especially reflective[0]. In other words, one side is too dark to see and the other is too bright to see.
It seems to me that that isn't quite right. "Dark" just means no visible light, but an infrared or thermal camera could see stuff. The article above mentions this possibility and the problems, which are harnessing and operating at cryogenic temperatures.
The point of the telescope is to capture IR light from space; if the telescope were generating IR light itself, it would disrupt the sensor and the data it collects. Part of why it's going to take so long to begin operating is that it needs to cool down to ultra-cold temperatures first.
Who said anything about generating infrared light? And I don't know what your last sentence is in reference to. Yes, it needs to cool down. Why is that relevant to this discussion other than what's already mentioned about that being a downside to operating cameras?
From the article so people can just debate NASA's own reporting:
> Although infrared or thermal-imaging cameras on the cold side could obviate the need for illumination, they would still present the same harnessing disadvantages. Furthermore, cameras on the cold side would have to work at very cold cryogenic temperatures.
The JWST has an extremely sensitive IR camera, and it is going to cool down to near absolute zero so that the IR sensors don't get flooded with radiation from the rest of the telescope. Because it will get so cold, it will not be emitting much IR (I think this is what the GP meant by "generating"). Your typical deployment-observation IR cameras won't be of much help in that case.
That makes sense. I got thrown off (by perhaps misreading the comment) into thinking they meant the camera or something else would generate additional IR to illuminate the telescope, which confused me. It makes sense now that the comment meant the cooldown is there to reduce telescope-generated IR noise as much as possible.
I think that article is leaving out some of the other problems (which is understandable, it already covers plenty of downsides). The whole point of keeping the cold side cold is so that it doesn't emit any IR radiation (otherwise the JWST couldn't work). So by design it's "dark" to an IR or thermal camera in the same way it's dark to a visible light camera. Certainly you could build a camera to see it anyway (without extra lights), but the question then becomes how big that camera has to be, and I would assume it's way too big to be realistic.
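To get a feel for why a cold telescope is "dark" to an IR camera, here's a back-of-envelope Python sketch using Wien's displacement law. The temperatures are rounded illustrative values I'm supplying, not figures from the thread:

```python
# Wien's displacement law: the wavelength where a blackbody's thermal
# emission peaks. Temperatures below are rough, illustrative values.
WIEN_B = 2898.0  # Wien displacement constant, in micrometre-kelvins

def peak_wavelength_um(temp_kelvin):
    """Wavelength (micrometres) at which a blackbody at temp_kelvin emits most strongly."""
    return WIEN_B / temp_kelvin

# A room-temperature object peaks right in the mid-IR band a telescope like
# JWST observes in; a ~40 K surface peaks far out in the far-IR, and a ~7 K
# detector barely glows at all in that band.
for label, temp in [("room temperature", 300), ("cold-side surface", 40), ("cold detector", 7)]:
    print(f"{label:>18} ({temp:>3} K): emission peaks near {peak_wavelength_um(temp):.1f} um")
```

The point: warm optics glow exactly where a mid-IR instrument looks, which is why everything must be chilled, and why the chilled side then emits too little for an ordinary thermal camera to pick up.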
First, it's possible to send up a camera sensitive enough to use starlight to capture images of the dark side. However, those things aren't light, and they would come at the cost of less propellant, for minimal gain. Adding a tiny camera and a tiny light at the same time would also be possible, but there really isn't anything worth looking at via a single camera.
As to using an infrared camera: they simply don't work on objects at the same temperature as the camera. NASA could have sent one up with a cooling system to look at the cold side of the sunshield, but again, weight and pointlessness mean it's just wasteful.
It's probably not practical for JWST, but starlight is actually fairly bright. If you travel to somewhere with essentially zero light pollution, like a desert, if it's a clear sky you can see with starlight. At the South Pole on a clear night we would walk out to the dark sector (where the telescopes are) with our headlamps off because it just wasn't necessary. The only illumination sources outside were red safety lights[1] and we'd normally turn those off for long exposure photography. You do need a fast lens though, and that inevitably means lots of glass.
Whether the telescope is reflective enough to get a good photo is another matter - I would guess not, it's designed to minimise stray light.
As for IR, the telescope is probably bright compared to the background at least for now. The main issue though is whether you'd need to actively cool the engineering camera. Cooling stuff in space is difficult because you can only really dump heat via radiation. It's probably not worth the weight to carry a separate cryocooler just for that. The inability to conduct heat is partly why it takes so long to cool down, aside from minimising thermal stress on the infrastructure.
[1] The main reason they exist is so that you have a point of reference when walking about outside, particularly on foggy or cloudy days. The meteo folks also use them as visibility markers in winter - e.g. IceCube is something like 1km from the station by line-of-sight. Turning them off for a brief period is usually fine and we'd notify the station before we did it. On a clear night though you don't need them at all.
> If you travel to somewhere with essentially zero light pollution, like a desert, if it's a clear sky you can see with starlight
This reminds me... I need to experience this firsthand at some point. Thank you!
Question: how much (if any) of that light is due to atmospheric scattering? ie, the sun's light hitting the daytime side of the earth, being scattered in our atmosphere, and faintly illuminating the night side of our planet? Would the same be true in the zero-atmosphere environment of space?
Plenty of places around the world offer dark enough conditions (but make sure it’s a new moon) - in the US you just need to get out to Arizona or Joshua Tree (or similar empty places). In Europe and elsewhere look for International Dark Sky Reserves. In general go visit places where telescopes are built.
Possibly some of it is from scattering, and of course you wouldn't get that in space, as there's no atmosphere to scatter from! Scattering is incredibly faint though, even compared to starlight.
See also gegenschein and the Zodiacal light - both are backscatter effects.
Slight addendum - actually, you often want satellites to be reflective. Dark means a good radiator, but also a good absorber, and a black spacecraft is about as bad as you can get for thermal management if you point it in the wrong direction. For an Earth-orbiting craft that means it'll overheat in sunlight and then dump everything in shadow - lots of thermal stress, and not good for instruments that like a constant temperature. That's why you often see sats covered in polyimide foil.
Actually, JWST orbits around L2 at an appreciable distance. The unique aspect of the orbit is that it is never in the shadow of the Earth or Moon, which guarantees consistent solar power and thermal load.
One of the fascinating bits is how much engineering has gone into keeping the detectors cold. Because JWST is (primarily?) taking images in the mid-IR region, it is crucially important to keep the detectors cool, otherwise you'd just be swamped in background noise. Keeping things cold in space is, despite it being a cold place, actually extremely difficult, because you can't use convective cooling. If something sits in the sun "all day" it gets warm, so the whole reason for the sunshield is to keep the instruments cool. On top of that, they are using active cooling for the (IIRC) first time on a satellite. I saw a presentation from one of the engineers two years ago and it's absolutely fascinating stuff.
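A quick Stefan-Boltzmann sketch in Python shows why radiation-only cooling is so slow: radiated power scales as T^4, so as a surface gets colder it sheds heat dramatically more slowly. The temperatures are round illustrative numbers of my choosing, not mission specs:

```python
# Stefan-Boltzmann law: power radiated per unit area by a (grey) body.
# Illustrative temperatures only - not actual JWST figures.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiated_power_per_m2(temp_kelvin, emissivity=1.0):
    """Watts radiated per square metre by a surface at temp_kelvin."""
    return emissivity * SIGMA * temp_kelvin**4

# A surface at 300 K sheds ~460 W/m^2; the same surface at 40 K sheds
# only ~0.15 W/m^2 - over 3000x less - which is why the final stages of
# a purely radiative cooldown take so long.
for temp in (300, 100, 40):
    print(f"{temp:>3} K: {radiated_power_per_m2(temp):8.3f} W/m^2")
```

This T^4 dependence is also why the last few kelvins need the active cryocooler the comment mentions: passive radiation alone asymptotes very slowly at cryogenic temperatures.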
My first thought is, "Why didn't they eschew solar power entirely and 'simply' park it in the shade?"
Surely they considered that, and the wins would have been massive -- no sunshields needed. So I'm sure the downsides must have been massive as well. Not enough plutonium fuel, or perhaps it just wouldn't provide enough power over the life of the mission. And of course the radioactive fuel would generate its own heat as well.
Now I need to find out more about that decision....
edit: Another commenter mentioned, "The earth's shadow never reaches L2 anyhow - it's only penumbra at that distance since the angular size of the earth is smaller than the angular size of the Sun." If that's correct, then there was never truly an option of "parking it in the shade" anyway.
The dark side of the Earth is very bright in IR. Getting far away from the Earth and moon makes them dimmer. Also, going for L2 makes sure the Earth, moon, and Sun are all always in the same direction, so the sun shield will block all significant IR sources.
Given that the orbit has no real significance in terms of being shaded from the Sun by the Earth, is there another reason it's there?
I guess it gives you a very stable position for observations, but why not just put it in a solar orbit then? Then again, this is kind of a solar orbit that happens to stay close to the Earth at all times, so that's a plus for comms.
That's exactly what L2 is. Otherwise the orbits of JWST and Earth would have different periods. It's handy to be able to have constant high speed communications and not need large antennas with blackouts.
It also is handy to keep the closest IR sources (Earth and moon) in the same direction as the sun at all times so there is never a point where you have a significant IR source above the sun shield.
JW does indeed circle that Lagrange point, but it is definitely not in Earth’s shadow.
The JW orbit's semi-major axis (about the L2 point) is on the order of 500,000 km. The radius of Earth is about 6,500 km. Thus, the shadow of the Earth is extremely small compared with the excursions of JW.
The Earth's shadow never reaches L2 anyhow - it's only penumbra at that distance, since the angular size of the Earth is smaller than the angular size of the Sun.
Besides, the Sun is so stupendously large that the L2 point isn't inside the umbra at all. In its orbit, the telescope is in full sunshine, as though the Earth weren't there.
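The angular-size claim above is easy to check with a few lines of Python. The distances are rounded textbook values I'm supplying (L2 taken as roughly 1.5 million km from Earth):

```python
import math

# Rounded values, not precise ephemeris data:
R_EARTH = 6_371        # km
R_SUN = 696_000        # km
AU = 149.6e6           # km, Earth-Sun distance
L2_DIST = 1.5e6        # km, approximate Earth-L2 distance

def angular_diameter_deg(radius_km, distance_km):
    """Apparent angular diameter, in degrees, of a sphere seen from afar."""
    return math.degrees(2 * math.atan(radius_km / distance_km))

earth_deg = angular_diameter_deg(R_EARTH, L2_DIST)
sun_deg = angular_diameter_deg(R_SUN, AU + L2_DIST)

# Earth subtends ~0.49 deg from L2; the Sun ~0.53 deg. Since the Sun's disc
# looks *bigger* than Earth's from there, Earth can never fully block it:
# L2 sits in penumbra at most, never in the umbra.
print(f"Earth from L2: {earth_deg:.3f} deg; Sun from L2: {sun_deg:.3f} deg")
```

So even ignoring the large halo orbit, geometry alone rules out parking the telescope "in the shade."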
We're talking about "see the faintest light from stars millions of lightyears away"-dark (for that reason you also can't shine a light on it), with our sun in the background. To see something on that side, you'd basically need the vision capabilities the telescope itself has and that's obviously not going to happen.
It has a selfie 'lens': NIRCam has extra optics that can be swung in, allowing the camera to focus on the primary mirror instead of out into the great infinity.
So in this sense Webb is already a camera that can photograph itself!
This capability exists for mirror alignment: they'll point Webb at an isolated star, switch to focusing on the primary mirror, swap in optics that cause small phase differences to produce diffraction patterns, and then use the resulting images to fine-tune the positioning of the mirror segments to within a small fraction of the wavelength of light they're using.
> to see something on that side, you'd basically need the vision capabilities the telescope itself
I'm not sure that's correct - there are a lot of photons hitting the telescope (the night sky), and night vision systems can work off a relatively small number of photons.
[0]: https://twitter.com/NASAWebb/status/1479161991252131841