
The BBC article "Event Horizon Telescope ready to image black hole" describes the Event Horizon Telescope, a coordinated observing technique in which several radio telescopes and arrays across the globe form a synthetic aperture with an Earth-sized baseline.

$$\frac{\lambda}{r_\text{Earth}} \sim \frac{r_\text{Sgr A*}}{D_\text{Sgr A*}} \sim 10^{-10}$$

...when plugging in 1 millimeter for $\lambda$, and with $r_\text{Sgr A*}$ and $D_\text{Sgr A*}$ the radius of, and the distance from Earth to, Sagittarius A* (the black hole at the center of the Milky Way Galaxy): 20 million km and 26,000 light years, respectively (values taken from the article).

The equation I've written shows that millimeter-wavelength interferometry with an Earth-sized baseline can, at least in principle, resolve structure on the scale of the black hole's event horizon.
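A quick numerical check of that estimate, using the values quoted from the article (a minimal sketch; I've used Earth's diameter for the baseline, so it differs from a radius-based version by a factor of two):

```python
# Back-of-the-envelope check of the resolution estimate above,
# using the values quoted from the BBC article.

wavelength = 1.0e-3             # observing wavelength: ~1 mm, in meters
baseline = 1.27e7               # Earth-sized baseline (Earth's diameter), meters
r_sgr_a = 2.0e10                # radius of Sgr A*: 20 million km, in meters
light_year = 9.46e15            # one light year, in meters
d_sgr_a = 26_000 * light_year   # distance to Sgr A*: 26,000 light years

diffraction_limit = wavelength / baseline   # smallest resolvable angle (rad)
angular_size = r_sgr_a / d_sgr_a            # angular radius of the target (rad)

print(f"diffraction limit ~ {diffraction_limit:.1e} rad")   # ~7.9e-11
print(f"angular size      ~ {angular_size:.1e} rad")        # ~8.1e-11
```

Both come out around $10^{-10}$ radians, so the two scales are comparable.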

My question is: how does the Event Horizon Telescope implement the interferometry? It would certainly be impossible to bring all the signals together at a central site and perform the interference there in real time as down-converted analog signals, and quite difficult/expensive to do it over dedicated, synchronized digital optical fiber lines. Are the massive amounts of data sent as IP packets over the internet to a central correlator (a numerical interferometer)?

The article mentions atomic clocks and lots of hard drives, and I have a hunch these have something to do with it.

above: "The eventual EHT array will have 12 widely spaced participating radio facilities". From the BBC's February 16, 2017 "Event Horizon Telescope ready to image black hole", http://www.bbc.com/news/science-environment-38937141

  • If I had to guess - since you can't synchronize these telescopes in real time - it is that each telescope is programmed to take measurements at predefined points in time on the atomic clock and then the interference is computed on a central server (or rather several servers) for all the data recorded. Would be a very complex setup that involves some pretty nasty calculations, but could actually be possible. – Adwaenyth Feb 17 '17 at 09:44

2 Answers


Assuming I understand your question:

As is actually mentioned in the article, all the data collected are stored on hard drives, timestamped against an atomic clock, and then flown to a central location where the interferometry actually takes place.
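For a sense of what that central "numerical interferometer" does with the recordings, here is a minimal sketch of the playback-correlation step. All the values here are toy assumptions (signal, noise level, and delay are made up), and real software correlators such as DiFX additionally handle fringe rotation, Doppler shifts from Earth's rotation, and clock drift:

```python
import numpy as np

# Toy illustration of "record now, correlate later": two stations
# digitize the same noise-like sky signal, offset by a geometric
# delay; the correlator later shifts one recording by a modeled
# trial delay and averages the cross power.

rng = np.random.default_rng(42)
n = 2**16            # samples per recording (toy value)
true_delay = 37      # geometric delay between stations, in samples

sky = rng.normal(size=n + true_delay)                      # common sky signal
station_a = sky[true_delay:] + 0.5 * rng.normal(size=n)    # + receiver noise
station_b = sky[:n] + 0.5 * rng.normal(size=n)             # + receiver noise

def cross_power(a, b, trial_delay):
    """Align b with a under a trial delay, then average the cross power."""
    return np.mean(a * np.roll(b, -trial_delay))

# Scanning trial delays (derived, in reality, from the timestamps and
# station geometry) shows a sharp peak at the true geometric delay.
trials = np.arange(80)
fringe = [cross_power(station_a, station_b, d) for d in trials]
print("recovered delay:", trials[np.argmax(fringe)])  # -> 37
```

The atomic clocks mentioned in the article are what make this after-the-fact alignment possible: without precise timestamps, the two recordings could not be lined up at all.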

Further reading:

  • That certainly sounds like how they will facilitate it. But to implement the interferometry there has to be something more complex going on. Are they just blindly writing GHz raw data directly to a hard drive? Is there a local oscillator? Each site has a substantially different doppler shift due to the rotation of the Earth. I don't think proper interferometry can be done with just timestamps. – uhoh Feb 17 '17 at 15:07
  • @uhoh "Blindly" probably isn't the best word to describe the methodology. We know very precisely the location from which certain data sets were taken, and at what time. We also know the relative position in the sky of the target object very precisely. Therefore, any sort of disturbance resulting from geophysical location (Doppler redshift and otherwise) should be able to be compensated for. As for a local oscillator, I cannot say for certain—although, I would certainly not be surprised if there were some local heterodyning taking place. –  Feb 17 '17 at 15:23
  • You are right - it was the wrong word. I meant to draw a contrast between just recording everything possible as raw data - the ADC outputs, ionospheric conditions, water vapor, atomic clock time, as one big data stream to hard drive, vs trying to heterodyne to an oscillator that's been first synchronized to the clocks, then frequency/phase shifted based on velocity, etc. I don't have a good word for a brute force approach to "record now, correct later", but the word I used was not what I meant to say! – uhoh Feb 17 '17 at 15:30
  • Come to think about it, if there are other objects very close by, it's possible they can be used as a radio "guide star" to compensate for atmospherics off line. – uhoh Feb 17 '17 at 15:38
  • There is more information in Computer World. – uhoh Feb 17 '17 at 15:46
  • @uhoh I think I saw a figure of 64 Gbps for the raw data rate at each telescope, which suggests that they are pretty much just writing the baseband data. – Steve Linton Dec 29 '18 at 12:35
  • @SteveLinton If you're ever able to find a source for it, that would be great! – uhoh Dec 29 '18 at 13:02
  • @uhoh https://www.datacenterdynamics.com/news/helium-filled-drives-store-black-hole-data/ – Steve Linton Dec 29 '18 at 13:27
  • @SteveLinton excellent! I found this open access paper in this list and it seems to cover an example of the processing, packaging, and writing-to-disk of the data in gory detail. If you are interested please feel free to use a little bit of it plus the helium-filled hard drives and write an answer, and I think we can put this question to rest! – uhoh Dec 29 '18 at 14:33
  • @SteveLinton just fyi; after writing the end of this answer I have just asked Will the SKA's total bandwidth still exceed the Earth's internet's bandwidth? – uhoh Jan 07 '19 at 03:32
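Using the 64 Gbps figure quoted in the comment thread above, a rough volume estimate is straightforward (the station count and track length here are my assumptions for illustration, not figures from the article):

```python
# Back-of-the-envelope data volume, from the ~64 Gbit/s per-station
# recording rate quoted in the comments. Station count and hours per
# night are assumptions for illustration.

rate_bps = 64e9           # bits per second per station (quoted above)
stations = 8              # roughly the size of the 2017 EHT campaign
hours_per_night = 10      # a single observing track (assumed)

total_bits = rate_bps * stations * hours_per_night * 3600
total_petabytes = total_bits / 8 / 1e15    # bits -> bytes -> petabytes
print(f"~{total_petabytes:.1f} PB per night")   # ~2.3 PB
```

A few nights at that rate is consistent with the multi-petabyte totals shipped on hard drives that are discussed in the links above.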

Supplementary graphical representation for the accepted answer, from a tweet:

[image from the tweet: graphical representation of the EHT recording and correlation workflow]

  • нαηαzσησ lαη∂ commented: "at the end of the data collection there were over 5 petabytes of data stored in over 100 stacks of hard drives like the ones on the picture." Also, Peter Telford: "You'd think black hole data would compress really well." – Keith McClary Jul 24 '19 at 02:48
  • @KeithMcClary hah! – uhoh Jul 24 '19 at 02:56
  • @KeithMcClary okay now I am curious, where are these quotes from exactly? – uhoh May 15 '22 at 01:54