--- /dev/null
+---
+postid: 021
+title: Re: Go-to statement considered harmful
+excerpt: Yet another rant on the difficulty of the discipline of programming.
+date: May 24, 2014
+author: Lucian Mogoșanu
+tags: tech
+---
+
+The late Edsger Dijkstra, indeed one of the greats of this rather odd field
+of Computer Science[^1], wrote at the end of the 1960s a letter (or short
+essay, if you will) called "Go-to statement considered harmful", published in
+the 11th volume of the Communications of the ACM. Now, remember that those
+were dark times, before Java, Haskell or Scala, and before even C became
+popular as a programming language. Said letter might have even had some
+influence on the design of these languages, who knows.
+
+About forty-five years later, give or take a few, the case against "goto"s
+still stands, given that C isn't dead. As a matter of fact, it's thriving as
+a systems language a-bit-above-assembly, and we don't see it going away any
+time soon. That doesn't mean programmers use "goto" statements as often as
+they did in the aforementioned dark times, when BASIC ruled the world (or did
+it?), but they still do, and when they do, chaos ensues (or does it?).
+
+I will in this essay play the devil's advocate. Having said that, if you're
+not proficient in at least three or four programming languages, then I urge
+you to go and read [SICP][1] or some other introductory Computer Science
+text. Otherwise, proceed to read Dijkstra's essay[^2] and return here
+afterwards.
+
+My first issue with Dijkstra's reasoning lies in one of his assumptions,
+namely, and I quote:
+
+> My second remark is that our intellectual powers are rather geared to master
+> static relations and that our powers to visualize processes evolving in time
+> are relatively poorly developed. For that reason we should do (as wise
+> programmers aware of our limitations) our utmost to shorten the conceptual
+> gap between the static program and the dynamic process, to make the
+> correspondence between the program (spread out in text space) and the process
+> (spread out in time) as trivial as possible.
+
+The first sentence is indeed true: humans generally find it hard to grasp
+processes, be they of a mathematical, algorithmic or some other nature. We
+reason poorly when it comes to temporal phenomena, despite the (seemingly
+paradoxical) fact that we are well suited to handle them unconsciously or
+subconsciously, or to develop some feedback-based mechanism that approximates
+a solution[^3].
+
+I don't agree with the remainder of the paragraph, though, because it implies
+(or perhaps it doesn't, and I'm getting it all wrong) that we should strive
+to make programming languages more "comfy", and that once we do, it's all
+going to be fine and dandy and "structured programming" is going to solve all
+our problems. There are countless examples invalidating this line of
+reasoning, the decades of buggy yet goto-free software written since being
+the most obvious.
+
+The fact is, we can't make programming more "comfortable", no matter how we
+put it, and Edsger seems to have [figured that out][4] way before I did,
+that's for sure.
+
+Look at procedural programming, for example: C is one of the most
+"comfortable" programming languages there are, provided that you know how
+electronic computing machines work, and yet most undergrads have great
+difficulty understanding how pointers work, not to mention dynamic memory
+allocation and other such techniques which are crucial to a programmer's
+survival. Abstracting structures as objects doesn't make things any easier,
+since making assumptions about control flow or memory allocation doesn't
+remove the dirt, it merely sweeps it under the rug; and let's not even get
+into functional or logic programming.
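+
+To make the point concrete, here is a minimal (and deliberately broken) C
+sketch of my own, not from any particular textbook: the static text below
+reads perfectly well, yet the dynamic process it describes is undefined.
+
+```c
+#include <stdio.h>
+#include <stdlib.h>
+#include <string.h>
+
+int main(void)
+{
+    char *s = malloc(6);
+    if (s == NULL)
+        return 1;
+    strcpy(s, "hello");
+    free(s);
+    /* `s` still holds the old address; using it now is undefined
+     * behaviour, although the line "looks" fine on paper. */
+    printf("%s\n", s); /* may print "hello", garbage, or crash */
+    return 0;
+}
+```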
+
+Thus, given that programming seems to be an inherently unintuitive activity
+and programming languages inherently hard-to-master tools, the only thing
+left is to do it the other way around, i.e. to struggle to adapt our
+reasoning to our programs' structure[^4]: the existing programming paradigms
+aren't enough to express what we want to achieve, so we have to combine them
+and employ new methods of thinking, serving as the right tools for the right
+tasks. This is indeed a painful problem, but one that we must not ignore,
+lest it bite us when we least expect it.
+
+Given this new context, the "goto" issue becomes but a drop in the ocean.
+Surely, the damned instruction will still cause us trouble, but that trouble
+is nothing compared to the one caused by its generalization, the
+continuation, which, by the way, provides us with a way to formally reason
+about the whole thing. Programs, or "processes", won't become any easier to
+visualise, but at least we'll have proper ways of fighting the dragon.
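+
+C has no first-class continuations, but, sticking to C for the sake of
+illustration, setjmp/longjmp offers a taste of the escaping kind: a longjmp
+is in effect a goto that crosses procedure boundaries, yet the pair comes
+with a semantics precise enough to reason about. A minimal sketch:
+
+```c
+#include <setjmp.h>
+#include <stdio.h>
+
+static jmp_buf on_error; /* saved control context to escape to */
+
+static void parse(const char *input)
+{
+    if (*input == '\0')
+        longjmp(on_error, 1); /* "invoke the continuation" */
+    printf("parsed: %s\n", input);
+}
+
+int main(void)
+{
+    if (setjmp(on_error) != 0) { /* control re-enters here on longjmp */
+        fprintf(stderr, "parse error\n");
+        return 1;
+    }
+    parse("hello");
+    parse(""); /* jumps back to the setjmp above */
+    return 0;
+}
+```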
+
+Finally, I feel the need to mention that "structured" programming, like any
+other programming paradigm in existence, deals poorly with corner cases. For
+example, have you ever written a procedure that never returns? Well, I have,
+and you can bet such procedures are running every fraction of a second in
+your operating system or in some obscure library, either because it's more
+efficient or because it *absolutely needs to* do this.
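+
+A minimal sketch of such a procedure, with getchar standing in for whatever
+event source a real system would use:
+
+```c
+#include <stdio.h>
+#include <stdlib.h>
+#include <stdnoreturn.h> /* C11 */
+
+/* The event loop at the heart of many a server, window system or
+ * scheduler: a procedure that quite deliberately never returns. */
+static noreturn void event_loop(void)
+{
+    for (;;) {
+        int ev = getchar(); /* placeholder event source */
+        if (ev == EOF)
+            _Exit(0); /* leave the process, never the procedure */
+        printf("handling event %d\n", ev);
+    }
+}
+
+int main(void)
+{
+    event_loop(); /* control never comes back here */
+}
+```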
+
+Silver bullets, holy grails, there are no such things. So I guess we'll just
+have to run like hell away from our "goto"s; unless we really *need* to use
+them.
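+
+For completeness, here is the one use of "goto" that even its detractors
+tend to bless, sketched in C: unwinding partially acquired resources through
+a single exit path, a pattern found all over the Linux kernel sources.
+
+```c
+#include <stdio.h>
+
+int copy_file(const char *src, const char *dst)
+{
+    int ret = -1, c;
+    FILE *in = fopen(src, "rb");
+    FILE *out = NULL;
+
+    if (in == NULL)
+        goto done;
+    out = fopen(dst, "wb");
+    if (out == NULL)
+        goto close_in;
+
+    while ((c = fgetc(in)) != EOF)
+        if (fputc(c, out) == EOF)
+            goto close_both;
+    ret = 0; /* success; fall through and release everything */
+
+close_both:
+    fclose(out);
+close_in:
+    fclose(in);
+done:
+    return ret;
+}
+
+int main(void)
+{
+    return copy_file("in.txt", "out.txt") == 0 ? 0 : 1;
+}
+```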
+
+[^1]: Why odd? Well, I would call it a branch of mathematics, but then many
+mathematicians would reprove me for mixing algorithms, "design" and other
+mumbo-jumbo with mathematics. I could instead call it a branch of science,
+but then physicists would tell me to take my abstract "oracle" toy-Turing
+machines and go elsewhere. Every existing taxonomy would fail me, since
+Computer Science is a distinct field of science, even though it makes use of
+or is used in all the others.
+
+[^2]: Available [online][2] or [as a pdf][3] file.
+
+[^3]: We learn to play music or tennis, to do martial arts, to make love
+etc., but we don't learn these things *consciously*. Rather, our brain learns
+the mechanism based on trial and error and positive reward. I suppose this is
+well documented in the literature, Pavlov's dog being one of the first
+examples that comes to mind.
+
+[^4]: Like any architect or artist, the programmer should in theory know
+what his goal is. His difficulty lies instead in finding the magic formula,
+the right words, the proper chant, to make the pieces fall into place so
+that their composition does whatever he intended it to. Sounds so easy,
+doesn't it?
+
+[1]: http://mitpress.mit.edu/sicp/full-text/book/book.html
+[2]: http://www.u.arizona.edu/~rubinson/copyright_violations/Go_To_Considered_Harmful.html
+[3]: https://www.cs.utexas.edu/users/EWD/ewd02xx/EWD215.PDF
+[4]: https://www.cs.utexas.edu/users/EWD/transcriptions/EWD10xx/EWD1036.html
--- /dev/null
+---
+postid: 022
+title: Bitcoin as infrastructure [ii]
+date: June 1, 2014
+author: Lucian Mogoșanu
+tags: cogitatio, tech
+---
+
+Also read: [Part I][1].
+
+## Part II: The fractal nature of computing infrastructure
+
+In the previous part of my essay I argued for, and illustrated with
+historical examples, the importance of "networks" in the structure and
+organization of natural things and, naturally, in human society. A crucial
+stepping stone towards Bitcoin-as-infrastructure was and is the computing
+machine, which is why I will dwell upon it for a moment, also partly because
+it has been one of my main areas of study for the past seven years or so.
+
+I am afraid (and I hope that I am wrong in this regard) that many of the
+people sitting in front of their computers reading this text don't consider
+the deep meaning and implications of what I call "computing machines". To be
+entirely honest, even the more enlightened scientists, engineers and
+philosophers are quite baffled by the concept: computers are machines that
+"do stuff", "on their own", and we only manage to communicate with them
+properly in the universal language of mathematics[^3]. Amusingly, computer
+scientists most probably won't hate me (too much) for providing such a vague
+definition, mostly because they aren't able to provide a (much) better one
+themselves[^4], and I know that because I call myself a computer scientist.
+
+The breakthrough in computing sciences was brought about by a few theoretical
+models, among which I will mention the Turing Machine devised by Alan Turing,
+the Lambda Calculus conceived by Alonzo Church (Turing's doctoral advisor),
+the algorithmic machine named after Andrey Markov and, last but not least,
+the First-Order Predicate Calculus. These all describe "Turing-equivalent"
+machines, upon which mathematicians and engineers, the most notable being
+John von Neumann, laid the foundation for the first electronic computers,
+about a century after Babbage's mechanical Difference Engine.
+
+In essence, electronic computers are based upon electrical circuits, which
+are nothing more than networks through which electrical signals propagate. To
+encode and process useful information, said circuits use binary voltage
+differences and amplifiers, translated conceptually into boolean logic gates.
+Of course, things are a bit more complicated, since signals propagate in time
+under the control of a clock, which paves the way for sequential circuits.
+Thus, essentially, processing units are sequential circuits which read
+instructions from a memory and, based on their encoding, generate some side
+effect in a memory -- the same one or another, that is not relevant for the
+definition.
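+
+To make the definition tangible, here is a toy machine of my own invention
+(the opcodes are made up), showing the fetch-decode-execute shape of the
+thing in a few lines of C:
+
+```c
+#include <stdio.h>
+
+/* memory holds both the program and its data; a program counter
+ * walks through it, and every step causes a side effect in memory */
+enum { HALT, LOAD, ADD, STORE };
+
+int main(void)
+{
+    int mem[16] = {
+        LOAD, 10,  /* acc = mem[10]  */
+        ADD, 11,   /* acc += mem[11] */
+        STORE, 12, /* mem[12] = acc  */
+        HALT, 0,
+        0, 0,
+        2, 3, 0    /* data cells 10..12 */
+    };
+    int pc = 0, acc = 0;
+
+    for (;;) {                               /* one step per "clock tick" */
+        int op = mem[pc], arg = mem[pc + 1]; /* fetch */
+        pc += 2;
+        switch (op) {                        /* decode & execute */
+        case LOAD:  acc = mem[arg];  break;
+        case ADD:   acc += mem[arg]; break;
+        case STORE: mem[arg] = acc;  break;
+        case HALT:  printf("mem[12] = %d\n", mem[12]); return 0;
+        }
+    }
+}
+```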
+
+From vacuum tubes to transistors and then to integrated circuits, from ENIAC
+to the smartphone in your pocket, all computers are based on the same
+principle. Not much has changed from the mid-20th century to the present day
+in terms of theoretical achievements[^5], all hardware improvements being
+purely of a technological nature. Qualitative improvements arose, however,
+firstly in terms of computing scalability and secondly in terms of software
+sophistication.
+
+"The fractal nature of computing infrastructure" might sound like a rather
+pompous formulation. I'm not going to argue, it probably is, but it's also
+true: computers, networks by nature, have evolved into networks-of-networks as
+computer networks arised, this evolution giving birth first to local networks,
+which then extended to a global network which we now know as the Internet. It's
+important to understand that the consolidation of our current networking
+infrastructure required little in terms of scientific innovation, as they were
+formed naturally as the number of computers in the world grew. And as the
+telegraph was tied to railroads in the 1800s, so the Internet was tied to
+telephone lines at the end of the 1990s, until the infrastructure was updated
+to optical fiber. And then the Internet itself, barely understood by anyone,
+gave way to chaos by becoming infrastructure for some of the software
+projects.
+
+One of these projects is the hypertext system developed by Tim Berners-Lee
+sometime around 1990 at CERN. This in turn evolved into the Web, which is
+itself a network within a network and an infrastructure for content and
+applications. Now, if we stop for a moment to reflect upon this fractal
+nature of computers, we notice that it emerges from the ability of Turing
+machines to run other Turing machines, facilitating the stacking of layers
+upon layers of complexity[^6].
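+
+A quick way to convince yourself of this property: the following C program
+(itself running on layers of hardware and operating system) interprets a
+program written in Brainfuck, a famously minimal Turing-complete language --
+one machine running another.
+
+```c
+#include <stdio.h>
+
+int main(void)
+{
+    const char *prog = /* prints "hi" */
+        "++++++++++[>++++++++++<-]>++++.>"
+        "++++++++++[>++++++++++<-]>+++++.";
+    unsigned char tape[30000] = {0}, *p = tape;
+
+    for (const char *ip = prog; *ip; ip++) {
+        switch (*ip) {
+        case '>': p++; break;
+        case '<': p--; break;
+        case '+': (*p)++; break;
+        case '-': (*p)--; break;
+        case '.': putchar(*p); break;
+        case ',': *p = (unsigned char)getchar(); break;
+        case '[': /* on zero, skip forward past the matching ']' */
+            if (!*p)
+                for (int d = 1; d; ) {
+                    ip++;
+                    if (*ip == '[') d++;
+                    else if (*ip == ']') d--;
+                }
+            break;
+        case ']': /* on non-zero, jump back to the matching '[' */
+            if (*p)
+                for (int d = 1; d; ) {
+                    ip--;
+                    if (*ip == ']') d++;
+                    else if (*ip == '[') d--;
+                }
+            break;
+        }
+    }
+    putchar('\n');
+    return 0;
+}
+```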
+
+You are probably aware of the rest of the story: search engines, blogs, social
+networks, the Cloud, all of them fascinating products of the age of the
+Internet. While this comprises no more than about fifteen years of history,
+it's way too much to fit here. Besides, some of these things will pass, some
+will live on, while some will be remembered in the future; which brings me to
+the next part of my essay.
+
+[^3]: What you call "programming languages" could just as well be considered
+a morphologically and syntactically altered subset of mathematics.
+
+[^4]: Unlike, for example, electrical engineers, who would stab me to death if
+I defined the capacitor as a "bucket of electrons with a small hole in it". And
+here we come to one of the fundamental problems of computer science, that of
+providing precise definitions to otherwise confusing concepts. To paraphrase
+Phil Karlton, "There are two hard things in computer science: cache
+invalidation, naming things and off-by-one errors".
+
+[^5]: If we rule out quantum computers, which are still a subject of research.
+
+[^6]: It is, after all, turtles all the way down.
+
+[1]: /posts/y00/01f-bitcoin-as-infrastructure-i.html