Probably the biggest reason I chose physics was the belief that it is completely dictated by rational arguments. If you can show that from A comes B because of C, then everyone will accept it as true. Given, of course, that you didn't mess up the calculations and that the premise is realistic.
However, a few years of work in this field have proven that belief to be false. You can prove your point rigorously with widely accepted mathematical tools and leave no wiggle room. And still there will be people who do not agree with your reasoning.
It wouldn't be a problem if these doubters were some internet trolls who obviously have no idea what they're talking about, since you can just ignore them. Unfortunately, they can also be esteemed scientists in the field. Senior researchers, award-winning scientists, professors, even so-called living legends.
Sure, it's fine to make mistakes, and everyone does make them. I have been locked in several heated discussions only to find out in the end that I was wrong. But you know what? I learned something from those discussions.
Then again, some people will defend their flawed view till the end of time, no matter what. In the worst case, they will accept every step of the math as true and still reject the conclusion, which is logical nonsense!
I have had the pleasure of being involved with such people, in both academia and industry.
A few years back I briefly worked at a technology company, where I had to work with a terrible senior researcher. I mean he was absolutely horrendous. Like panic-attack-inducing bad. Don't get me wrong, he was a nice guy. He was just completely lost in physics and in absolute denial of that fact.
One time, we were talking about diffraction from a single slit (a historically well-known problem). For those of you who are not so optically inclined, here is a short summary: when light goes through a large opening, like a window, the pattern the light makes has the shape of said window.
If you pass light through some small aperture, like the eye of a needle, the shape the light takes resembles the opening, but is slightly wider. As you keep reducing the size of the aperture, the pattern gets wider and wider, until you end up with a spherical wave.
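For the more mathematically inclined, the standard textbook (Fraunhofer) result for a slit of width $a$ makes this widening explicit:

$$ I(\theta) \propto \operatorname{sinc}^2\!\left(\frac{\pi a \sin\theta}{\lambda}\right), $$

where $\theta$ is the angle measured from the slit and $\lambda$ is the wavelength. The first dark fringe sits at $\sin\theta = \lambda/a$, so as $a$ shrinks toward the wavelength, that first zero runs off to 90 degrees and the central lobe fills the entire half-space. That is the spherical-wave limit.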
So, this senior told me that "if you have an infinitesimally small slit, then the light you see at the far-field observation plane is a delta function." This statement is against everything we know about diffraction. It is actually against all common sense, since the delta function is nonzero at only one point in space and takes an infinite value at that point. It is not a very physical entity.
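To spell that out, the Dirac delta is loosely defined by

$$ \delta(x) = 0 \ \text{for}\ x \neq 0, \qquad \int_{-\infty}^{\infty} \delta(x)\,\mathrm{d}x = 1, $$

so a delta-shaped far field would mean all of the light squeezed into a single geometric point at infinite intensity. Not something you will ever measure on a screen.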
I immediately asked whether he meant that it becomes a spherical wave (which is what actually happens), to which he replied no. He really thought that if you have a small enough aperture, diffraction does not occur. He then proceeded to show, using the Rayleigh-Sommerfeld diffraction integral, why he thought so.
$$ U_\mathrm{RS}(P) = \frac{1}{i\lambda} \iint_{A} U_O(x', y')\, \frac{e^{ikr}}{r}\, \cos\theta \,\mathrm{d}x'\,\mathrm{d}y' $$

In this integral, $U_\mathrm{RS}$ is the light coming out of the aperture (observed in the far field) and $U_O$ is the light coming into it; $r$ is the distance from the aperture point $(x', y')$ to the observation point $P$, and $\theta$ is the angle between that line and the aperture normal. Usually, one assumes that $U_O$ is a plane wave, which is a good assumption when the aperture is very small. Then, the diffraction pattern can be found by simply integrating over the area $A$ of the aperture. For an infinitesimally small aperture we have

$$ U_\mathrm{RS}(P) \propto \frac{e^{ikr}}{r}, $$

which is a spherical wave. This is a well-known and non-controversial result that is backed up by a few hundred years' worth of experiments.
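If you want to convince yourself numerically, here is a quick sketch (my own, with arbitrarily chosen parameters) that sums Huygens wavelets $e^{ikr}/r$ over a one-dimensional slit, which is the essence of the Rayleigh-Sommerfeld integral, and watches the pattern collapse onto a pure spherical wave as the slit shrinks:

```python
import numpy as np

# Far-field pattern of a single slit via direct numerical summation of
# Huygens wavelets, a 1-D stand-in for the Rayleigh-Sommerfeld integral.
# All parameter values below are arbitrary choices for illustration.

wavelength = 500e-9                      # 500 nm (green light)
k = 2 * np.pi / wavelength
z = 1.0                                  # distance to the screen [m]
x_obs = np.linspace(-0.5, 0.5, 1001)     # points on the screen [m]

def slit_pattern(width, n_src=2001):
    """Normalized intensity on the screen for a slit of the given width,
    illuminated by a unit-amplitude plane wave (U_O = 1)."""
    x_src = np.linspace(-width / 2, width / 2, n_src)
    dx = x_src[1] - x_src[0]
    # Distance from every aperture point to every observation point.
    r = np.sqrt((x_obs[:, None] - x_src[None, :]) ** 2 + z ** 2)
    # Sum spherical wavelets e^{ikr}/r, weighted by the obliquity
    # factor cos(theta) = z/r, over the slit.
    U = np.sum(np.exp(1j * k * r) / r * (z / r), axis=1) * dx
    I = np.abs(U) ** 2
    return I / I.max()

# Reference: a true point source (pure spherical wave) at the slit.
r0 = np.sqrt(x_obs ** 2 + z ** 2)
I_point = np.abs(np.exp(1j * k * r0) / r0 * (z / r0)) ** 2
I_point /= I_point.max()

for width in (100e-6, 10e-6, 1e-6, 0.1e-6):
    I = slit_pattern(width)
    rms = np.sqrt(np.mean((I - I_point) ** 2))
    print(f"width = {width * 1e6:7.2f} um   "
          f"edge/center intensity = {I[0]:.3f}   "
          f"rms deviation from spherical wave = {rms:.4f}")
```

The wide slit shows the familiar fringes concentrated around the axis; by the time the slit is narrower than the wavelength, the deviation from a point-source field is negligible. No delta function anywhere in sight.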
Instead, what my senior did was place a delta function as the incoming wave and integrate over infinity, which obviously yields the value of the field at only one point. I cannot emphasize enough how utterly and completely wrong this procedure is.
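(For reference, what he invoked is, as far as I can tell, just the sifting property of the delta function,

$$ \iint f(x', y')\,\delta(x')\,\delta(y')\,\mathrm{d}x'\,\mathrm{d}y' = f(0, 0), $$

which is a perfectly fine identity on its own. It picks out the value of the integrand at a single point; it is not a diffraction calculation.)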
I pointed out the mistake he had made and told him how you are supposed to use the diffraction integral. He completely rejected what I had to say and actually got kinda mad. What ensued was about two months of arguing over something that was not even open for discussion. I couldn't change mathematical facts even if I wanted to!
And of course, this was not the only thing we disagreed on. He made several statements that were just as ridiculous as this one and he held on to them, real tight. I lost a lot of hair during that time.
At one point he told me that he had a PhD from a major university (a really famous one, actually) and therefore he didn't need any lectures on the topic. It took every last bit of my willpower to keep my mouth shut.
It is beyond me how he was even able to graduate. Knowing how high the tuition fees are in the top universities, all I can say is: money well spent.
Now I'm just glad that chapter of my life is in the past, and I also learned something. It really doesn't matter at all where you studied. You could have gone to Harvard or Yale or (insert university here) and still be completely oblivious. Which is kinda sad.
This rant went on longer than intended, and I still have more where that came from. Seems like I'll just have to continue some other time, so stay tuned for the next episode!