What Skills has Working in Tech Actually Given You?

KokcharovSkillHierarchy2015.jpg

On TechCrunch, the author writes:

Do you work in software? Do you have more than a decade of experience? You do? I’m sorry to hear that.  That means there’s a strong possibility that much of what you know is already obsolete.

Certainly, much of what I have learned in the technical realm is obsolete. I haven’t coded in 6502, Z80, 8086, Cobol, Fortran, Pascal, C, or C++ in ages.

But much of what I’ve learned about writing documentation, communicating, debugging, creating flexible architectures, making accurate time estimates, designing intuitive UIs, and teaching others: those are skills that never become obsolete.

The most important skill, one that truly doesn’t get old, is the meta-skill of constantly learning new things … and that meta-skill can rust and wither away, too, if it languishes unused.

Agreed, but the skill of learning new things isn’t the only meta-skill.

Static Typing and Broken Software

static_typing_duh.jpg

Granted, I borrowed that image from here, where the author views people like me as “unhappy dogmatists who can’t bring themselves to leave the kiddie pool of strong typing.”  Yes, I am rather dogmatic about strong typing!  But to the point…

David R. MacIver wrote an interesting post, “Static typing will not save us from broken software.”

Riiight.

davidMacIver = HesGotSomeGoodPoints()
davidMacIver = ButHesAlsoWrong()

Make it more expensive to write broken software – write in a non-static typed language
Make it cheaper to write correct software – write in a static typed language

But because these bugs [those that happen in production] are relatively minor

Now that’s where I totally disagree. If a true statically typed language had been used, or had been used properly (i.e., semantically, which is the next level of strong typing), this case in point[^]

due to ground-based computer software which produced output in non-SI units of pound-seconds (lbf s) instead of the SI units of newton-seconds (N s) specified in the contract between NASA and Lockheed.

would not have cost the taxpayers $327.6 million.
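To show what I mean by semantic typing, here is a minimal C# sketch (my own illustrative types, not anything NASA or Lockheed actually wrote) in which pound-force-seconds and newton-seconds are distinct types, so the unit mismatch becomes a compile-time error instead of a lost orbiter:

// Minimal sketch of "semantic" (unit-aware) typing. Hypothetical types, for illustration only.
public readonly struct NewtonSeconds
{
    public double Value { get; }
    public NewtonSeconds(double value) { Value = value; }
}

public readonly struct PoundForceSeconds
{
    public double Value { get; }
    public PoundForceSeconds(double value) { Value = value; }

    // Getting SI units out of imperial units requires an explicit conversion: 1 lbf·s = 4.44822 N·s.
    public NewtonSeconds ToNewtonSeconds() => new NewtonSeconds(Value * 4.44822);
}

public static class Trajectory
{
    // The contract (SI units) is now encoded in the method signature.
    public static void ApplyImpulse(NewtonSeconds impulse)
    {
        // ... thruster math in SI units ...
    }
}

public static class GroundSoftware
{
    public static void Main()
    {
        var impulse = new PoundForceSeconds(142.0);

        // Trajectory.ApplyImpulse(impulse);                  // compile-time error: wrong unit type
        Trajectory.ApplyImpulse(impulse.ToNewtonSeconds());   // the conversion is forced on us
    }
}

The raw double never crosses the API boundary; the only way in is through an explicit conversion.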

Dealing with type errors in a non-statically typed language is not cheaper. I spend time writing unit tests and verifying runtime behavior by hand to ensure type correctness, time that I do NOT spend with statically typed languages. Running an app only to discover I’m using a duck-typed variable before I’ve assigned something to it is a time waster. The benefits of static typing far outweigh the benefits of non-statically typed languages, and I’m sorry David, that’s not opinion, but measurable fact.

Can duck-typed languages save time? Sure! I can replace a class instance that interfaces with hardware with a class instance that mocks the hardware as simply as the code above. I can add fields to a class outside of the class simply by making an assignment.

All of which are cheats and laziness; they break the “code as documentation” fallback and make duck-typed code more error-prone and difficult to maintain.
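For comparison, here is a minimal C# sketch (illustrative names only) of the same hardware-mock swap done with an explicit interface: the substitution is just as easy, and the contract is visible to the compiler and documents itself.

// Hypothetical hardware abstraction, purely for illustration.
public interface ITemperatureSensor
{
    double ReadCelsius();
}

// The real driver talks to the device.
public class I2CTemperatureSensor : ITemperatureSensor
{
    public double ReadCelsius()
    {
        // ... read the actual hardware here ...
        return 21.5;
    }
}

// The mock simulates the device for testing, swapped in just as easily.
public class MockTemperatureSensor : ITemperatureSensor
{
    public double ReadCelsius() => 105.0;   // simulate an over-temperature condition
}

// The consumer declares exactly what it needs; the interface IS the documentation.
public static class SafetyMonitor
{
    public static bool OverTemperature(ITemperatureSensor sensor) => sensor.ReadCelsius() > 90.0;
}

// Usage:
//   SafetyMonitor.OverTemperature(new I2CTemperatureSensor());   // production
//   SafetyMonitor.OverTemperature(new MockTemperatureSensor());  // test

The swap is a one-line change, and the compiler tells me immediately if the mock drifts out of sync with the real thing.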

Most existing static type systems also come with a build time cost that makes testing in general more expensive.

And the unit tests that duck-typed languages require often (in my experience with a small Ruby on Rails web app) take longer to run than compiling much more complex web sites.

Fifteen Emerging Technologies that can Change the World

refugees.jpg

SD Times posted about three groups of technologies that can change the world.  One has to ignore the babble-speak:

  1. IoT software and solutions that bring customer engagement potential within reach.
  2. Augmented reality overlays digital information and experiences on the physical world using combinations of cameras and displays.
  3. Hybrid wireless technology will eventually create connected everything.

Quite so.  But I wonder at the tag line “change the world.”  I would assume they are referring to that part of the world that has access to electricity, potable water, food, health care, the Internet, and isn’t worried about starvation, disease, dictators, and terrorist groups. Yeah, that world.

TopTal, Codility, and Automated Skill Testing

Just to get this out of the way, I failed the TopTal[^] application.

How did I even find TopTal? In a Code Project newsletter!  I’d never even heard of them before.

TopTal’s primary screening process is to use Codility[^] to see how good your skills are.

Now, the 90-minute timed test at Codility asked me to solve three problems:

  1. Find the point in an array at which the count of X to the left != the count of X to the right.
  2. Given some bit-encoding scheme, convert N to -N using the fewest bits.
  3. Find the optimum number of moves a chess knight must make to get from (0, 0) to (m, n).

Regarding the last one, someone on Code Project had posted about the idiocy of that question a few months back.

In fact, they are all extremely poor tests of skill. Each (particularly #2 and #3) probably involves a simple trick, and if you don’t know the trick (like me), you spend a lot of time just thinking about the problem. At least I did.

How exactly does solving an arbitrary algorithm test coding skills? How does it demonstrate good OOP practices, or DB architecture skills, or an understanding of things like LINQ, or really much else other than “getting the trick”?
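Case in point: the knight problem (#3) is nothing more than a standard breadth-first search. Here’s a minimal C# sketch (written after the fact, with no clock ticking, and assuming non-negative target coordinates); it works, and it tells you precisely nothing about whether its author can design maintainable software.

using System;
using System.Collections.Generic;

public static class KnightDistance
{
    // Minimum knight moves from (0, 0) to (m, n), found by breadth-first search.
    // Assumes m, n >= 0; the search may wander slightly outside the bounding box,
    // since optimal knight paths can briefly step past it.
    public static int MinMoves(int m, int n)
    {
        var deltas = new (int dx, int dy)[]
        {
            (1, 2), (2, 1), (-1, 2), (-2, 1), (1, -2), (2, -1), (-1, -2), (-2, -1)
        };
        int limit = Math.Max(m, n) + 2;

        var visited = new HashSet<(int, int)> { (0, 0) };
        var queue = new Queue<(int x, int y, int moves)>();
        queue.Enqueue((0, 0, 0));

        while (queue.Count > 0)
        {
            var (x, y, moves) = queue.Dequeue();
            if (x == m && y == n)
                return moves;

            foreach (var (dx, dy) in deltas)
            {
                int nx = x + dx, ny = y + dy;
                if (nx < -2 || ny < -2 || nx > limit || ny > limit)
                    continue;
                if (visited.Add((nx, ny)))
                    queue.Enqueue((nx, ny, moves + 1));
            }
        }

        return -1; // not reachable within the search bounds (shouldn't happen)
    }
}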

After working hard at #1, writing assertions, commenting the code, testing edge cases, and optimizing the algorithm for O(n) performance, I realized I had spent an hour on one stupid problem. That left 30 minutes for the remaining two. Riiiight.

Now, Codility says something like “don’t worry if you don’t complete all three tests, just show your best work, even if you only complete one test.”

And then the test results are amusing. The requirements don’t state what to do with incorrect inputs into the “solution” method, and they clearly can’t handle exceptions being thrown — I noticed my score in their example test dropped dramatically when I used exceptions.

Now to TopTal. I got an automated email rejecting me because my score was too low. It was pretty obvious from the email that no one had even bothered to look at my code!  Relying on automated test results, not even looking at the quality of code, those are not ways to gain points with me.

Sadly, this sort of testing method is probably going to be used more and more.

Now, oddly, I don’t particularly have a problem with the concept behind Codility (except, of course, that it doesn’t really test skills), but I definitely have a problem with how TopTal handled it – I would have expected, at a minimum, a follow-up discussion with a technical person.

Why TDD Fails

Recently posted on Code Project:

I’m just now trying to adopt my thinking to Test Driven Development mode and I’m finding strange, inconsistent, and bizarre things happening in my thought processes and code in some ways as well. Is this pretty normal when you first try to learn TDD?

And as usual, this got me thinking.  My response:

Yes. TDD is for the birds, but even they didn’t first write tests to figure out whether wings could support them and then develop their wings.

In other words, TDD is in many cases ridiculous because it requires that you write code to test something you haven’t written yet.

The mindset for writing tests is orthogonal to writing the code you are going to test. When you do the latter, you’re thinking about abstraction, decoupling, performance, optimal structures, and, here’s the clincher, you’re thinking (or should be thinking) about how to structure the code so it can be tested.

When you’re writing the tests, you’re thinking about edge cases, parameter validation, exception trapping, whether the business logic is correct, and, here’s the clincher, you’re thinking about how to ensure that all code paths are actually exercised.

Now I ask you, how can you ensure that all code paths are actually exercised when you haven’t written the code?!?!?!

So of course TDD is a mind f***. It’s a different process, and it’s often a BAD process. Of course there are cases where you can write the tests first, but certainly in my experience, for anything other than a very isolated piece of business logic, it doesn’t make sense. The main result of TDD ends up being that you’ve written a bunch of useless tests and, even worse, you’ve written code to fit the structure imposed by the tests.

So what should TDD be then, you might ask? Well, it should be a concept of what needs to be tested – is it algorithm correctness, is it performance, is it multithreaded support, is it worth testing at all? Then write the code with what you want to test in mind, so that the code is testable, and then write the tests.
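To make that concrete, here is a minimal C# sketch (hypothetical business rule, illustrative names) of that order of operations: decide what needs testing, write the logic so it is isolated and testable, and then write the tests.

using System;

// Step 1: decide what actually needs testing -- here, a (hypothetical) business rule
// for rounding an invoice total.
// Step 2: write the code so the rule is isolated and testable: pure logic, no UI, no database.
public static class Invoicing
{
    public static decimal RoundTotal(decimal total)
    {
        // Business rule: round half a cent away from zero.
        return Math.Round(total, 2, MidpointRounding.AwayFromZero);
    }
}

// Step 3: now write the tests, covering the edge cases you thought about while writing the code.
public static class InvoicingTests
{
    public static void Run()
    {
        Check(Invoicing.RoundTotal(10.005m) == 10.01m, "half a cent rounds up");
        Check(Invoicing.RoundTotal(10.004m) == 10.00m, "below half a cent rounds down");
        Check(Invoicing.RoundTotal(-10.005m) == -10.01m, "negative totals round away from zero");
    }

    private static void Check(bool condition, string description)
    {
        if (!condition) throw new Exception("Test failed: " + description);
    }
}

The tests come last, but the thinking about testability came first.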

Mark_Wallace on the Code Project also had an excellent response:

TDD is extremely good for two things:

0. Getting loose cannon devs on track.

We all know and have to deal with such guys, but it’s not hard to write tests with premises like “This is the name of the output module, and these are the expected outputs from it”.

— You know your inputs
— You don’t care what the processes are; that’s where you rely on other developers
— You know the outputs That Have To Result.

So write tests for it.

What TDD does is get that clear in everyone’s mind, so flying off in weird and unwanted directions is less likely to happen.

1. Forcing specs

For me, this is their primary function. Even “produce working code” is secondary.

Put it this way: It’s a way of introducing specs when the f***ing lazy management team hasn’t provided them — with the added bonus that the f***ing lazy management team has to sign off on it.

Amen.

How Programming Languages have Affected my Thinking

LaserFiche wrote an article on How Your First Programming Language Warps Your Brain.  An interesting read that got me thinking and wondering where both my bias for strongly typed languages and my overall architectural views come from.

Technically, my first language was the keystroke programming of the HP25C, with its 49-step program memory. I learned a lot about code optimization. :)

I then moved up to the HP67, which I think offered 200 instructions of program memory, and then the HP41C.  One thing that stood out for me as a kid with regard to the HP products was the documentation: high quality, with the programs that came with the calculators explained in concise, clear language.  That became a guiding influence in my programming / documentation style.

Somewhere in the middle I was doing BASIC programming on a teletype connected to a PDP-11, where I finally wrapped my head around the idea that when you tell the computer a = 5 and later say print a+1, there’s some memory somewhere that contains 5, and a reference somewhere so that a refers to that memory cell. That actually took a while.

When I progressed to the Commodore PET, the world opened up to hardware, registers, and 6502 assembly language, not to mention early concepts of a BIOS.  Again, the quality of the documentation, this time for the hardware, was very influential.

Doing a lot of assembly programming kept teaching me code optimization, DRY principles way before the acronym was invented, good documentation and variable / entry point naming, and, most certainly, good debugging skills.

I also got introduced to parallel processing in a Pascal-like language called SAIL, which was sort of a mind-bender because line numbers and goto / jmp (in assembly) disappeared. I really had no concept of a stack yet at that point, and in a summer class I took at Stanford, taught by a guy who would pick his zit scabs and eat them (I kid you not, and his classes were being video-taped!), I simply could not wrap my head around recursion and stack-based variables, and I ended up spending my time in front of the computer playing Star Trek instead of writing the quicksort algorithm that was our homework. Heck, I didn’t even know what question I should be asking, my programming brain was so oriented around linear-sequence programming. I’m not sure I finally grokked the concept of a stack vs. a heap until years later.

My introduction to C was one of “run away.” All those ridiculous symbols and weird behaviors, like i++ vs. ++i. Really? Who invented this language, so horribly confusing to write and read? Pascal was my buddy by then, courtesy of Borland’s $99 Turbo Pascal.

So I basically skipped C, but really liked C++. It made sense — classes, encapsulation, polymorphism. But I got “base class” wrong. It’s a BASE class, right? So like the base of a pyramid, it’s at the bottom of the derivation hierarchy, right??? I wrote a whole app that way. Templates were amazing; they also made so much sense.

In many ways, C++ very much refined my ideas about separating code into isolated blocks. Still, I found that there was too much inter-dependency between classes. Re-use was a myth, not a reality. Tight coupling made the code monolithic. It was only by careful planning and using higher-level architectures, like a publisher-subscriber pattern, that I began to disentangle the monolithic nature of applications. This also led me down the path of loading DLLs dynamically at runtime and separating out declarative aspects of the code from imperative ones. XML didn’t really exist yet, so I wrote my own XAML-like syntax for rendering UIs with MFC.
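To give a flavor of what I mean by a publisher-subscriber architecture, here is a trivialized C# sketch (the original work was C++, and these names are purely illustrative): components exchange messages through a hub and never reference each other directly.

using System;
using System.Collections.Generic;

// A bare-bones publisher-subscriber hub (illustrative only): publishers and
// subscribers know about topics and payloads, never about each other.
public class MessageBus
{
    private readonly Dictionary<string, List<Action<object>>> subscribers =
        new Dictionary<string, List<Action<object>>>();

    public void Subscribe(string topic, Action<object> handler)
    {
        if (!subscribers.TryGetValue(topic, out var handlers))
            subscribers[topic] = handlers = new List<Action<object>>();
        handlers.Add(handler);
    }

    public void Publish(string topic, object payload)
    {
        if (subscribers.TryGetValue(topic, out var handlers))
            foreach (var handler in handlers)
                handler(payload);
    }
}

// Usage: the order module and the audit log never hold references to each other.
//   var bus = new MessageBus();
//   bus.Subscribe("OrderPlaced", payload => Console.WriteLine("Audit: " + payload));
//   bus.Publish("OrderPlaced", "order #42");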

Other things became apparent too – the tight coupling of the data access layer (and the embedded SQL statements) that was rampant at that time. Again, a custom scripting language to separate the SQL statements from the application had a direct impact on how quickly code could be changed to deal with changing or new requirements. Everyone’s jaw would drop when I could run a C++ app, change some declarative markup and SQL, and have new functionality without compiling a line of code (heck, without even exiting the application).  I had one coworker tell me, “I don’t think I can ever go back to the way everyone else does programming!”  Another coworker refused to use my framework on the grounds of “What? Am I supposed to put on my resume that I used Marc Clifton’s framework?”  An educational moment for me: I was starting to walk down a lonely path.  Then again, looking back, yes, he could certainly have put that on his resume!

About that time, the Design Patterns book hit the streets and everyone was yammering about DP’s, and I thought, wow, this is so old hat!

Also about that time I read Vital Dust[^], which changed how I thought about programming forever. It’s quite fascinating that a book about the roots of life changed my thinking about programming – something no programming book had ever done (and none has since). Around that time I discovered Code Project (I was using a 64K ISDN modem at the time), and so, on May 26, 2002, I took my ideas from Vital Dust and the experience I had gained and wrote my first article, Organic Programming Environment[^]. (The concept was revisited years later with HOPE[^], posted exactly 12 years, minus a day, after the original!)

And then C# appeared. The rest is history. C# has so greatly influenced my programming and thinking style that I cringe whenever I have to use another language (which nowadays usually means Python and JavaScript). And it continues to do so, with functional aspects, LINQ, etc.

And certainly, my history with compiled, strongly typed languages has bent my thinking, such that scripted, duck-typed languages are, to be frank, something of a joke (and a bad one at that) in my opinion.

So I continue on the path of further refining the ideas of decoupling code and creating modular, data-driven rather than imperative, workflow-driven applications, a path where concepts like dependency injection are, in my opinion, totally the wrong direction (I can’t believe people still use DI), much to the ire of the people who look at my code and the supporting framework, don’t get it, and never will.

Over the last 30 years, my exposure to a certain path of languages and a certain path of problems has led me to a particularly lonely corner of the programming universe! But thank God for Code Project, where you can at least peek under the bed to see what monsters lurk in the bedroom of my programming mind. :)

The downside to this particular path is that I pretty much march to the beat of my own drum.  Sure, I use frameworks like .NET, EF, DevExpress, jQuery, Bootstrap, Knockout, Backbone, etc., but I rarely use them in their native form — they always get wrapped in what could loosely be called an “automation layer,” which is actually my way of decoupling dependencies, optimizing re-use, and always trying to make my work repeatable and robust.

Modern Agile: Reminds me of a Strip Club

stripclub.jpg

InfoQ recently posted an article about something called “Modern Agile” (I guess they couldn’t call it Agile Agile, or Extreme Agile) and it takes an already loosey-goosey concept to new heights of vapor-process.

Agile is modernizing. Thanks to Lean and Agile pioneers and practitioners, we now have simpler, safer, speedier ways to achieve awesome results.

While that sounds like the tag line for a vibrator rather than a software development process, what really got me was this statement:

Modern Agile has no roles, responsibilities or anointed practices.

In other words, Modern Agile is, well, nothing.  Nothing that provides you with any intelligent, meaningful, concrete suggestion for how to go about building software.

Instead, it has four guiding principles:

  1. Make People Awesome
  2. Deliver Value Continuously
  3. Make Safety a Prerequisite
  4. Experiment and Learn Rapidly

Why am I reminded of an adult entertainment club?

But back to software development.  First, can we, as professionals, please stop using the word “awesome?”

Awesome: extremely impressive or daunting; inspiring great admiration, apprehension, or fear.

Riiight.  That’s something to aspire to, eh?  Like I want to (even if I could) make my peers into something that fills me with apprehension or fear.  You can’t make other people awesome; they have to figure out how to be awesome.  And given the negative aspects of that definition, I’m not sure I want to go down that path.

Continuous: continuing without stopping : happening or existing without a break or interruption

Really?  If I were a customer receiving uninterrupted software updates, I would be yelling “STOP! DELIVER A USEFUL SET OF IMPROVEMENTS THAT WORKS, NOT CONSTANT TINY CHANGES.”  The exception might be a critical bug fix, but as a customer, I do NOT want to be inundated with constant software updates.

Value: usefulness or importance

There you go.  Continuous value is sort of an oxymoron, especially when what is valuable in a software delivery is often something different for each stakeholder.

Safety: the state of not being dangerous or harmful

And Modern Agile, being a vapor-process, is sooo good at explaining exactly how that’s done.

Experiment: a scientific test in which you perform a series of actions and carefully observe their effects in order to learn about something.  Something that is done as a test: something that you do to see how well or how badly it works

Absolutely.  But sadly software development is hardly scientific.

Agile is something that I’ve always considered a bit ridiculous, and Modern Agile goes to the next level: the ludicrous, the absurd.  Modern Agile sounds great, especially in this new-age feel-good world of software development, but it’s ultimately a collection of words that mean nothing in any usable sense.  Just like a strip club.

Joined TopTal

So I recently joined the TopTal Web Developer Community, and to accelerate the screening process, they wanted a blog entry mentioning them.  Now, I find this interesting.  First, it’s a great way for them to promote their site (though I don’t think my readership is that thrilling.)  Second, having no experience with this community, I am not in a position to say anything great (or not so great) about them.  So I find myself in the awkward position of creating a blog post, with no idea whether it will achieve anything.

So, to the TopTal screener, if you’re actually reading this and you want, I quote, “[a] showcase [of] your writing skills and clarity of thought”, I would actually refer you to my Code Project articles, all 200 of them, which you can find here.

And I might call your attention to my 200th article here.  Read the comments.  Here’s a select few:

Your articles are ALWAYS of high quality. I’m Always looking forward to your next article because I can learn a lot of them!

great project, very instructive explanation

Well done, I particularly enjoy the discussion of your design tradeoffs and habits.

Excellent Read

Good Article. I think the detail you have provided with help a lot of folks improve their coding! Thanks Marc

You’re are inspiration for other developers, great work!!

Superlative effort- thank you, sir.

Excellent read as usual. 

I appreciate the way you describe your decisions, and second-thoughts; it’s very valuable to get an idea of the process your mind goes through in conceptualizing, describing, testing … and, that kind of insight is rare.

Entertaining writeup and nice looking project. What more could we ask?

So, now the ball is in your court.  Is TopTal really the top 3%?  Will something amazing come of this?  We’ll see.