Thank you, Arnold, for this concise and direct message.
This was a fun article to write (here on Code Project). The neat thing about the solver (besides the interactive UI) is that I demonstrate three techniques for solving the puzzle:
- Recursive, using the C# yield statement
- Recursive, demonstrating a “step and continue” approach
From Code Project:
If it walks like a duck, and quacks like a duck, it’s probably gonna throw exceptions at runtime. – Mladen Janković
Granted, I borrowed that image from here, where the author views people like me as “unhappy dogmatists who can’t bring themselves to leave the kiddie pool of strong typing.” Yes, I am rather dogmatic about strong typing! But to the point…
David R. MacIver wrote an interesting post, “Static typing will not save us from broken software.”
    davidMacIver = HesGotSomeGoodPoints()
    davidMacIver = ButHesAlsoWrong()
- Make it more expensive to write broken software – write in a non-static typed language
- Make it cheaper to write correct software – write in a static typed language
From David’s post:

> But because these bugs [those that happen in production] are relatively minor

Really? The Mars Climate Orbiter was lost

> due to ground-based computer software which produced output in non-SI units of pound-seconds (lbf s) instead of the SI units of newton-seconds (N s) specified in the contract between NASA and Lockheed.

A “relatively minor” bug that, had it been caught, would not have cost the taxpayers $327.6 million.
Dealing with type errors in a non-static language is not cheaper. I spend time writing unit tests and verifying the runtime by hand to ensure type correctness that I do NOT spend with static typed languages. Running an app only to discover I’m using a duck-typed variable before I’ve assigned something to it is a time waster. The benefits of static typing far outweigh the benefits of non-static typed languages, and I’m sorry David, that’s not opinion, but measurable fact.
Can duck-typed languages save time? Sure! I can replace a class instance that interfaces with hardware with a class instance that mocks the hardware as simply as the code above. I can add fields to a class outside of the class simply by making an assignment.
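What that kind of duck-typed swap looks like can be sketched in a few lines of Python – the class and method names here are purely illustrative inventions of mine, not the code from the original article:

```python
# Hypothetical names; a sketch of duck-typed substitution, not the original code.

class RealSensor:
    def read(self):
        # Would talk to actual hardware here.
        raise NotImplementedError("no hardware attached")

class MockSensor:
    # No shared base class or interface required -- it just quacks.
    def read(self):
        return 42.0

def log_reading(sensor):
    # Works with anything that has a read() method.
    return sensor.read()

device = MockSensor()
print(log_reading(device))  # 42.0

# Adding a field to an instance from outside the class, by simple assignment:
device.calibrated = True
print(device.calibrated)    # True
```

No interface declaration, no compiler check – which is exactly both the convenience and the hazard.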
All of which are cheats and laziness, break the “code as documentation” fallback, and make duck-typed code more error prone and difficult to maintain.
Most existing static type systems also come with a build time cost that makes testing in general more expensive.
And the unit tests that duck-typed languages require often (in my experience with a small Ruby on Rails web app) take longer to run than compiling much more complex web sites.
Perbot: a tech support person that behaves like a bot (robot) by asking canned questions, reading the replies, and clicking on canned responses.
How did I even find TopTal? In a Code Project newsletter! I’d never even heard of them before.
Now, the 90-minute timed test at Codility asked me to solve three problems:
- the point in an array at which the count of X from the left != the count of X from the right
- given some bit encoding scheme, convert N to -N with the least number of bits.
- the optimum number of moves a chess knight must make to get from (0, 0) to (m, n)
Regarding the last one, someone on Code Project had posted about the idiocy of that question a few months back.
In fact, all three are extremely poor tests of skill. Each (particularly #2 and #3) probably involves a simple trick, and if you don’t know the trick (like me), you spend a lot of time just thinking about the problem. At least I did.
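For what it’s worth, the knight problem (#3) does fall to a standard trick once you know it: breadth-first search. A minimal Python sketch of that approach, under the assumption of an unbounded board and using the usual safety margin of two squares around the target:

```python
from collections import deque

def knight_moves(m, n):
    """Minimum number of knight moves from (0, 0) to (m, n)."""
    m, n = abs(m), abs(n)  # by symmetry, fold the target into the first quadrant
    moves = [(1, 2), (2, 1), (-1, 2), (-2, 1),
             (1, -2), (2, -1), (-1, -2), (-2, -1)]
    seen = {(0, 0)}
    queue = deque([(0, 0, 0)])  # (x, y, distance so far)
    while queue:
        x, y, d = queue.popleft()
        if (x, y) == (m, n):
            return d
        for dx, dy in moves:
            nx, ny = x + dx, y + dy
            # A margin of 2 below zero / beyond the target keeps optimal
            # detours (e.g. for targets like (1, 1)) reachable.
            if -2 <= nx <= m + 2 and -2 <= ny <= n + 2 and (nx, ny) not in seen:
                seen.add((nx, ny))
                queue.append((nx, ny, d + 1))
    return -1
```

Which rather proves the point: the problem tests whether you’ve seen BFS-on-a-grid before, not whether you can design software.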
How exactly does solving an arbitrary algorithm test coding skills? How does it demonstrate good OOP practices, or DB architecture skills, or an understanding of things like Linq, or really much else other than “getting the trick?”
After working hard at #1, writing assertions, commenting the code, testing edge cases, optimizing the algorithm for O(1) performance, I realized I had spent an hour on one stupid problem. That left 30 minutes for the remaining two. Riiiight.
Now, Codility says something like “don’t worry if you don’t complete all three tests, just show your best work, even if you only complete one test.”
And then the test results are amusing. The requirements don’t state what to do with incorrect inputs into the “solution” method, and they clearly can’t handle exceptions being thrown — I noticed my score in their example test dropped dramatically when I used exceptions.
Now to TopTal. I got an automated email rejecting me because my score was too low. It was pretty obvious from the email that no one had even bothered to look at my code! Relying on automated test results, not even looking at the quality of code, those are not ways to gain points with me.
Sadly, this sort of testing method is probably going to be used more and more.
Now, oddly, I don’t particularly have a problem with the concept behind Codility (except of course that it doesn’t really test skills) but I definitely have a problem with how TopTal handled it – I would have expected, at a minimum, a follow up discussion with a technical person.
LaserFiche wrote an article on How Your First Programming Language Warps Your Brain. An interesting read that got me thinking and wondering where my bias toward strongly typed languages, and my overall architectural views, come from.
Technically, my first language was the numeric keystroke encoding on the HP-25C, with its 49-instruction program memory. I learned a lot about code optimization.
I then moved up to the HP-67, which I think offered around 200 instructions of program memory, and then the HP-41C. One thing that stood out for me as a kid regarding the HP products was the documentation – high quality, with the programs that came with the calculators explained in concise, clear language. That became a guiding influence in my programming / documentation style.
Somewhere in the middle I was doing BASIC programming on a teletype connected to a PDP-11, where I finally wrapped my head around the idea that when you tell the computer `a = 5` and later say `print a + 1`, there’s some memory somewhere that contains 5, and a reference somewhere that says `a` refers to that memory cell. That actually took a while.
When I progressed to the Commodore PET, the world opened up to hardware, registers, and 6502 assembly language. Not to mention early concepts of a BIOS. Again, the quality of documentation, but this time for the hardware, was very influential.
Doing a lot of assembly programming kept teaching me skills like code optimization, DRY principles way before the acronym was invented, good documentation and variable / entry point naming skills, and most certainly, good debugging skills.
I also got introduced to parallel processing in a Pascal-like language called SAIL, which was sort of a mind-bender because line numbers and goto / jmp (in assembly) disappeared. I really had no concept of a stack yet at that point, and in a summer class I took at Stanford – taught by a guy who would pick his zit scabs and eat them (I kid you not, and his classes were being video-taped!) – I simply could not wrap my head around recursion and stack-based variables, and I ended up spending my time in front of the computer playing Star Trek instead of writing the quicksort algorithm that was our homework. Heck, I didn’t even know what question I should be asking; my programming brain was so oriented around linear-sequence programming. I’m not sure I finally grokked the concept of a stack vs. a heap until years later.
My introduction to C was one of “run away”. All those ridiculous symbols and weird behaviors, like `++i`. Really? Who invented this horribly confusing language to write and read? Pascal was my buddy by then, courtesy of Borland’s $99 Turbo Pascal.
So I basically skipped C, but really liked C++. It made sense — classes, encapsulation, polymorphism. But I got “base class” wrong. It’s a BASE class, right? So like the base of a pyramid, it’s at the bottom of the derivation hierarchy, right??? I wrote a whole app that way. Templates were amazing; they also made so much sense.
In many ways, C++ very much refined my ideas of separating code into isolated blocks. Still, I found that there was too much inter-dependency between classes. Re-use was a myth, not a reality. Tight coupling made the code monolithic. It was only by careful planning and using higher-level architectures, like a publisher-subscriber pattern, that I began to disentangle the monolithic nature of applications. This also led me down the path of loading DLLs dynamically at runtime and separating out declarative aspects of the code from imperative ones. XML didn’t really exist, so I wrote my own XAML-like syntax for rendering UIs with MFC.
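The publisher-subscriber decoupling mentioned above can be sketched in a handful of lines – Python here for brevity (the original work was C++ with MFC), and the names are purely illustrative:

```python
from collections import defaultdict

class EventBus:
    """Minimal publisher-subscriber hub: publishers and subscribers
    know only the bus and a topic name, never each other."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        # Register a callable to be invoked for every publish on this topic.
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        # Fan the payload out to every registered handler.
        for handler in self._subscribers[topic]:
            handler(payload)

bus = EventBus()
received = []
bus.subscribe("order.created", received.append)
bus.publish("order.created", {"id": 7})  # received is now [{"id": 7}]
```

The point is that neither side holds a reference to the other, which is exactly what breaks the tight class-to-class coupling described above.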
Other things became apparent too – the tight coupling of the data access layer, and the embedded SQL statements that were rampant at that time. Again, a custom scripting language that separated the SQL statements from the application had a direct impact on how quickly code could be changed to deal with changing / new requirements. Everyone’s jaw would drop when I could run a C++ app, change some declarative markup and SQL, and have new functionality without compiling a line of code (heck, without even exiting the application). I had one coworker tell me, “I don’t think I can ever go back to the way everyone else does programming!” Another coworker refused to use my framework on the grounds of “What? Am I supposed to put on my resume that I used Marc Clifton’s framework?” An educational moment for me – I was starting to walk down a lonely path. Then again, looking back, yes, he could certainly have put that on his resume!
About that time, the Design Patterns book hit the streets and everyone was yammering about DP’s, and I thought, wow, this is so old hat!
Also about that time I read Vital Dust[^], which changed how I thought about programming forever. It’s quite fascinating that a book about the roots of life changed my thinking about programming – something no programming book had (or has since) ever done. Around that time, I discovered Code Project (I was using a 64K ISDN modem at the time), and so, on May 26, 2002, I took my ideas from Vital Dust and the experiences I had gained and wrote my first article, Organic Programming Environment[^]. (Years later, the concept was revisited with HOPE[^], originally posted exactly (-1 day) 12 years later!)
And certainly, my history of compiled, strongly typed languages has bent my thinking, such that scripted, duck-typed languages are, to be frank, something of a joke (and a bad one at that) in my opinion.
So I continue on the path of further refining the ideas of decoupling code and creating modular, data-driven rather than imperative, workflow-driven applications – a path where concepts like dependency injection are, in my opinion, totally the wrong direction to go (I can’t believe people still use DI), much to the ire of the people who look at my code and the supporting framework, don’t get it, and never will.
Over the last 30 years, my exposure to a certain path of languages and a certain path of problems has led me to a particularly lonely corner of the programming universe! But thank God for Code Project, where you can at least peek under the bed to see what monsters lurk in the bedroom of my programming mind.
The downside to this particular path is that I pretty much march to the beat of my own drum. Sure, I use frameworks like .NET, EF, DevExpress, jQuery, Bootstrap, Knockout, Backbone, etc., but I rarely use them in their native form — they always get wrapped in what could loosely be called an “automation layer”, which is actually my way of decoupling dependencies, optimizing re-use, and always trying to make my work repeatable and robust.