The paper omits DD's contact information (i.e., an email address), which I think is really bad.
The theory of computation was originally intended only as a mathematical technique for studying proof (Turing 1936), not a branch of physics. Then, as now, there was a widespread assumption – which I shall call the mathematicians’ misconception – that what the rules of logical inference are, and hence what constitutes a proof, are a priori logical issues, independent of the laws of physics. This is analogous to Kant’s (1781) misconception that he knew with certainty what the geometry of space is. In fact proof and computation are, like geometry, attributes of the physical world. Different laws of physics would in general make different functions computable and therefore different mathematical assertions provable.

This prestige-seeking reference to Kant isn't useful. Few readers will know what DD's talking about, and he doesn't even try to explain. It doesn't add anything.
It's there as a social convention, both to gain prestige and because opening by simply saying "in fact, my position is X" is frowned on. But if you say "Kant was wrong. In fact, my position is X," somehow that's seen as better, even though it's worse. The Kant reference helps disguise that DD is asserting rather than arguing.
But this supposed deficiency is shared by all scientific theories: Tests always depend on background knowledge – assumptions about other laws and about how measuring instruments work (Popper 1963, ch. 10 §4). Logically, should any theory fail a test, one always has the option of retaining it by denying one of those assumptions.

The ideas that make up background knowledge can be assumptions, but don't have to be. They are often pretty good ideas which are argued and explained, rather than being assumed.