That’s fucking scary
#AVirusWithShoes
Genuinely epic! Just the communications with this thing are breathtaking. I don’t know about the uplink but ISTR that the signal-to-noise ratio on the downlink is now substantially less than 1. I think they may also be having to worry about photon statistical noise, and these are radio photons so in any other application we can forget about their individuality as there are always a fantastically large number of them. But not here.
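For a sense of scale, here’s a crude Friis-style link-budget sketch. Every number in it is an assumed, Voyager-ish value I’ve plugged in purely for illustration, not anything from the article:

```python
# Crude deep-space downlink budget, just to show why radio photons become
# countable. All numbers are assumed, Voyager-ish values for illustration.
import math

P_TX = 20.0              # transmitter power, W (assumed)
D_TX, D_RX = 3.7, 70.0   # spacecraft dish and DSN dish diameters, m (assumed)
EFF = 0.6                # aperture efficiency for both dishes (assumed)
FREQ = 8.4e9             # X-band downlink frequency, Hz
DIST = 2.4e13            # roughly 160 AU, in metres (assumed)
H, C = 6.626e-34, 3.0e8  # Planck constant, speed of light

lam = C / FREQ
gain_tx = EFF * (math.pi * D_TX / lam) ** 2         # transmit dish gain (linear)
flux = P_TX * gain_tx / (4 * math.pi * DIST ** 2)   # W/m^2 arriving at Earth
p_rx = flux * EFF * math.pi * (D_RX / 2) ** 2       # W collected by the 70 m dish
photons_per_s = p_rx / (H * FREQ)

print(f"received power ~{p_rx:.1e} W, ~{photons_per_s:.0f} photons per second")
# With these numbers the received power is well under an attowatt - only tens
# of thousands of radio photons a second - so their arrival statistics matter.
```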
Thanks for this. It’s not often you, of all people, present us with the opportunity to be this pedantic (and I’m sorry if I’ve jumped the queue, Kev). But … it’s not a satellite. It’s a probe. Or a spacecraft.
Not at all. I seem to have lost that role and acquired a new one
So not MEG2 then?
How embarrassing! And it gets worse, because it came up in my dreams last night that I’d used the word “satellite”! WTF? A cry of existential angst from my subconscious if ever there was one…
Of course, if space was positively curved…
It depresses me to note how scientists desperate for funding misrepresent and ‘oversell’ their work - in this case, a mere refinement of knowledge (i.e. that ptychodonts were lamniform sharks rather than the last of the hybodonts, and that some species were pelagic, which has been evident for 30+ years from various Western Interior Seaway specimens).
Not much point banging on about the other details a newspaper article gets wrong…
Keep an eye open for shiny grey squares covered in ripple-like ridges, 1-2cm across, in the Chalk near where you live - Ptychodus teeth are pretty common fossils down your way.
Is it orbiting the supermassive black hole at the centre of the galaxy?
So are all those used McDonald’s cups which are blowing around the local shopping development here. They feel like junk but not really like ‘space junk’, or is that just me? Then again, I was aiming high with the pedantry …
The way everything is going the Restaurant at the End of the Universe will have golden arches.
Mind boggling.
I cannot even begin to imagine how that’s technically possible!
It relies on illuminating the object we’re ‘photographing’ with a ‘chirped pulse’ of laser light. We’ve had chirped pulses for decades now.
It’s fundamental to very short pulses of anything, including light, that they are made up of a range of colours (=wavelengths, or frequencies). The shorter the pulse, the broader the range needs to be.
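To put rough numbers on that trade-off (my own illustrative figures, not anything from the article), the time-bandwidth product of a transform-limited Gaussian pulse is about 0.44, so the shorter the pulse, the more spectrum it must span:

```python
# Bandwidth needed for a transform-limited Gaussian pulse.
# The 10 fs / 800 nm example is illustrative, not from the article.
C = 3.0e8      # speed of light, m/s
TBP = 0.441    # time-bandwidth product of a Gaussian pulse

def bandwidth_needed(duration_s, centre_wavelength_m):
    """Return the frequency bandwidth (Hz) and wavelength spread (m)."""
    dnu = TBP / duration_s                       # delta-nu ~ 0.44 / delta-t
    dlam = centre_wavelength_m ** 2 / C * dnu    # delta-lambda ~ lambda^2/c * delta-nu
    return dnu, dlam

dnu, dlam = bandwidth_needed(10e-15, 800e-9)     # a 10 fs pulse centred at 800 nm
print(f"~{dnu / 1e12:.0f} THz of bandwidth, i.e. ~{dlam * 1e9:.0f} nm of spectrum")
# -> roughly 44 THz, about 94 nm of colours around 800 nm
```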
Such pulses can be ‘chirped’ i.e. the different colours can be delayed by different amounts so, say, the blue end of the range arrives at the object first and the red end arrives last. It’s this relationship which gives us the ultrafast time resolution. If we imagine the chirped pulse illuminating a bursting balloon, the blue light, which arrives first, will image the balloon unburst. The red light, which arrives last, will image the frayed wreckage. The ‘camera’, as described, is an optical setup for separating the different coloured images thereby allowing us to watch the balloon burst as it happens. Frame-by-frame is in fact colour-by-colour.
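If it helps, here’s a toy model of that colour-to-time bookkeeping; the 50 ps stretch and the 750-850 nm span are numbers I’ve made up for the sketch, not the camera’s actual parameters:

```python
# Toy model of a linearly chirped pulse: each colour arrives at its own time,
# so separating the colours afterwards replays the event frame-by-frame.
# The pulse duration and wavelength span below are made-up illustrative values.
import numpy as np

stretch = 50e-12                                  # pulse stretched to 50 ps (assumed)
wavelengths = np.linspace(750e-9, 850e-9, 11)     # blue end ... red end (assumed)

# Linear chirp: the blue end arrives first, the red end last.
arrival_times = np.interp(wavelengths,
                          [wavelengths[0], wavelengths[-1]],
                          [0.0, stretch])

for lam, t in zip(wavelengths, arrival_times):
    print(f"{lam * 1e9:5.1f} nm snapshot -> frame at {t * 1e12:5.1f} ps")
# The balloon is unburst in the 750 nm frame and frayed wreckage by the
# 850 nm one; frame-by-frame really is colour-by-colour.
```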
How do you know everything?
I don’t.
I did, however, spend thirty-odd years of my life working with lasers, often ultrafast ones, and I’ve met both Gerard Mourou and Donna Strickland, who shared the 2018 Physics Nobel Prize for Chirped Pulse Amplification.
I was in the EPAC building on the Harwell campus literally yesterday.
Agreed.