Each time I get interviewed (e.g., while recording a podcast episode or speaking at a conference), I struggle to answer one particular (and relatively frequent) question:

What advice would you give to aspiring programmers when it comes to tech skills? Which programming language, framework, or tech stack should they bet on (to maximize the ROI in the future)?

This question is problematic for several reasons:

  • first of all, no one wants to be Cpt. Obvious :), so an answer based on the current job market would not be valuable
  • well, real tech disruptions (breakthroughs that could put you in a very narrow avant-garde) are not predictable, and neither is the future as distant as 5-10 years ahead - people who claim otherwise are either fools or liars
  • personally, I never bet on a single tech/trend - as an 'E-shaped' individual, I do hedge my career choices
  • and finally - there's no single best answer, but there are many decent enough answers - one should pick the option that resonates with them personally (it's easier to commit fully to something you truly enjoy)

But I think I finally have an answer. I'm ready to handle this question with a piece of universal advice. The one I deeply believe in.

They should work on the skill of setting pieces together. Integrating, or rather - assembling stuff from various elements, regardless of their origin, base technology, and exposed interfaces, to create (seamless) solutions far more valuable than the sum of their ingredients.

Wait, is this even a skill?

I assure you it is. A demanding one that requires learning to deal with non-trivial mental barriers, like developers' innate tendency to favor 'build' over 'buy'. But I've written about that a lot in the past, so let's focus on the aforementioned skill itself and why it's so important.

In a world where velocity of change and cycle time are among the most relevant metrics, organizations that want to succeed need to focus on building (by themselves) ONLY what's necessary (the essence of their unique added value) and integrating whatever else is needed (or useful, or can generate synergies).

This focus (on your novel value proposition), or the lack of it, is the key difference between companies that achieve greatness and those that don't.

These days no one can afford to waste time building what's considered a commodity product/service. It's not safe (e.g., security assets), not scalable (e.g., infrastructure), slow (e.g., re-implementing the mathematical foundations of ML just to use KNN), and hardly profitable (if you're in food delivery, your clients probably won't appreciate the new web framework you've built ...).

That's why smart companies do integrate. A lot. Assemble the bricks. Set the pieces together. They are able to properly assess cost versus gained value and pick the building blocks that best fit their scenario. Sure, integration still takes effort, but if you're into smart cars, there's probably no point in building your own invoicing solution (a commodity).


And who can tell a commodity from stuff worth building? It's a topic for a separate discussion, but I can refer you to two valuable sources:

So, what's covered?

Yeah, let's coin a more precise definition. What does this skill consist of? What are the challenging parts? How do you tell someone who's good at it from someone who isn't?

IMHO, to be good at crafting solutions by assembling them from pieces, one has to be proficient at:

  • understanding how promise theory works in practice
  • abstract thinking - being able to dissect the whole solution, understand the necessary contracts, their functions, and the design decisions behind them
  • understanding various integration patterns & mechanisms, and knowing when to apply which of them
  • knowing 'glue' tools & technologies that simplify integrating other components (e.g., universal ETL tools, languages like SQL, flexible data interchange formats like JSON, data manipulation frameworks & libraries like pandas) - a minimal sketch follows this list
  • making the connections bulletproof, e.g., through proper boundary testing, monitoring, logging, and securing them with popular, open standards
  • knowing how to keep all kinds of coupling under control (data, temporal, behavioral, etc.)
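
Speaking of 'glue': here's a minimal sketch (the file names, table, and columns are all hypothetical) of stitching a JSON export together with a SQL table using pandas - no custom integration layer required:

```python
# Minimal "glue" sketch: join a hypothetical JSON export with a SQL table.
import json
import sqlite3

import pandas as pd

# Piece 1: a JSON dump from some external system (e.g., a CRM export).
with open("customers.json") as f:
    customers = pd.DataFrame(json.load(f))

# Piece 2: a table living in a relational database.
conn = sqlite3.connect("billing.db")
invoices = pd.read_sql_query("SELECT customer_id, amount FROM invoices", conn)

# Assembly: one join, one aggregation - revenue per customer.
report = (
    customers.merge(invoices, left_on="id", right_on="customer_id")
             .groupby("name", as_index=False)["amount"]
             .sum()
)
print(report)
```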

And last but not least: being able to quickly (and collaboratively!) craft working prototypes and proofs of concept. Because if I want a sub-process turned asynchronous and reliable, I simply expect a bloody queue, not a market analysis of available message brokers, a whitepaper on the advantages of AMQP, or a two-month saga of setting up Kafka manually from source code.
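
To make that concrete: a throwaway prototype of "turn this sub-step asynchronous" can be as small as the sketch below, using nothing but Python's standard-library queue and a worker thread (process_item is a hypothetical stand-in for the real work):

```python
# Quick-and-dirty prototype: make a blocking sub-step asynchronous with a queue.
import queue
import threading

work = queue.Queue()

def process_item(item):
    # hypothetical stand-in for the actual sub-processing
    print(f"processed {item}")

def worker():
    while True:
        item = work.get()
        if item is None:      # poison pill stops the worker
            break
        process_item(item)
        work.task_done()

threading.Thread(target=worker, daemon=True).start()

for i in range(5):
    work.put(i)               # the caller hands work over and moves on

work.join()                   # wait for the backlog to drain
work.put(None)
```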

"Sure, but I still think that ..."

No. Your programming language of choice will probably make no difference at all. Neither will mine (sorry Elixir, sorry Rust). The era of open-source supremacy has had several positive outcomes; one of them is the unquestioned reign of open standards for communication, integration, and data exchange. Protocols, standards, and paradigms like HTTP, REST, JSON, OAuth, JDBC, or gRPC are universal and can be freely utilized with pretty much any modern tech stack.
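
That universality is easy to see in practice. The sketch below (the URL and token are hypothetical) fetches JSON over plain HTTP using Python's standard library; the service on the other side could be written in Java, Go, Elixir, or anything else, and the calling code wouldn't change:

```python
# Plain HTTP + JSON: the server's tech stack is irrelevant to the caller.
import json
import urllib.request

req = urllib.request.Request(
    "https://api.example.com/v1/orders",           # hypothetical endpoint
    headers={
        "Accept": "application/json",
        "Authorization": "Bearer <access-token>",  # e.g., an OAuth 2.0 token
    },
)
with urllib.request.urlopen(req) as resp:
    orders = json.load(resp)

print(f"fetched {len(orders)} orders")
```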

If you take a careful look at the past few years, in spite of the massive popularity of JavaScript and Python, the programming language "market" isn't consolidating. Quite the opposite - it keeps fragmenting. Relatively new languages prove to be fully productive and ready for prime time, while the leaders of yesterday refuse to die and instead entrench themselves in some comfy niche.

The way the cloud has expanded the service-oriented paradigm adds to this equation. These days even the most 'sexy' topics (like Machine Learning or Kubernetes) are not a must-learn (in depth): even those can be bundled, wrapped, and offered for consumption as a service. One doesn't have to understand how to manage a service mesh infrastructure - it's the cloud provider's concern, so all you have to do is reap the advantages of using the mesh (the one that was provisioned for you automatically in 5 minutes) for your case.
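
One concrete illustration of this "consume it as a service" mindset (a sketch, assuming an existing AWS account with credentials configured and a queue named 'orders-to-process'): using a managed queue through boto3 takes a few lines, with no broker to install, patch, or operate yourself.

```python
# Consuming a capability "as a service": a managed queue (Amazon SQS) via boto3.
import boto3

sqs = boto3.client("sqs")
queue_url = sqs.get_queue_url(QueueName="orders-to-process")["QueueUrl"]

# Producer side: hand the work over and move on.
sqs.send_message(QueueUrl=queue_url, MessageBody='{"order_id": 42}')

# Consumer side: long-poll, process, acknowledge.
resp = sqs.receive_message(QueueUrl=queue_url,
                           MaxNumberOfMessages=1,
                           WaitTimeSeconds=10)
for msg in resp.get("Messages", []):
    print("got:", msg["Body"])
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```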


Assembling solutions from low-level building blocks may appear (to some) less challenging or less satisfying, but I disagree. An engineer's work should always be judged not by the intricacies of what's under the hood, but by the outcomes: their effects, usefulness, and delivered value.

That's why assembly work (as described above) is, for me, as close to the essence of engineering as it gets. Re-inventing the wheel may be amusing and even educational (up to a point), but that's not how you add value, and it's probably not what you're getting paid for either.

Yes, some engineers will still be working on new compression methods, optimizing query engines, or the internals of machine learning libraries, but the vast majority can benefit from their effort and craft previously unimaginable solutions with the potential to improve everyone's lives.

So, paraphrasing the title of Dr. Werner Vogels' video blog: "Now Go and Set the Pieces Together" :)
