The Full-stack Dev Is Bent Out of T-shape

Jasper Sprengers
4 min read · Dec 19, 2021

This article was first posted on DZone.
There are still plenty of openings for full-stack developers. If the stack is the gamut of programming languages, protocols, and middleware to build and maintain a serious internet application, then a full stack developer is the digital Jack of all trades who, contrary to the saying, has mastered them all. I once called myself such, and in 2001 it wasn’t necessarily an act of youthful hubris. The popular LAMP stack (Linux, Apache, MySQL, and Perl) was still manageable.

Illustration by Sandra de Haan

Perl has lost most of its popular appeal, while Linux and MySQL remain relevant and dominant. Unfortunately, the stack of old has grown into a high-rise. This is understandable: applications need to do a lot more and interact with vital services that have become the backbone of modern societies. It’s serious business. Cybercrime is an increasingly attractive business model, so knowing about security is not optional. Before smartphones, there was only the browser platform to grapple with, although the battle between Netscape and Explorer (my standard is more standard than yours) drove everybody mad.

The game is now complex enough to turn disciplines that once fit a single job description into several full-fledged day jobs, with proper certificates and qualifications. There’s UI/UX, frontend (JavaScript) programming, backend, testing/QA, and deployment/operations. You need them all, but team members can no longer be fungible experts, interchangeable across all these roles. Meanwhile, in the name of DevOps, administrative tasks that were once the purview of long-haired sysadmins now land on the dev team’s desk. You built it, you own it, you run it, you fix it. But you don’t become a Cloud specialist in an afternoon.

We have CI/CD servers and Helm charts to roll out software where in the olden days you would copy a file over FTP (not even securely) to the production server. The requisite knowledge to devise and set up today’s ecosystems has become an order of magnitude more complex. If you want to become an expert in the highly vendor-specific nitty-gritty of the big boys’ Cloud offerings (Amazon AWS, Microsoft Azure, Google Cloud), you must choose. Your customer or employer is not likely to use services from all three to an equal degree. Knowing them all is not a sensible investment of your time, all the more because this is typically the type of knowledge that fades quickly if you don’t keep it up. It’s like the Unix command line leaving my muscle memory. At least shell commands are as stable as the Old Testament compared to the Cloud APIs of the Big Three.

This brings me to T-shaping, the metaphor that visualizes the balance between specialized knowledge in one area and a broader, shallower acquaintance with related disciplines. The core of your domain, the vertical bar of the T, is where you stay up to date with current developments and act as a source of information for others. It’s the knowledge that is so familiar we shouldn’t have to look it up. Based on that criterion, and forced by our limited brain capacity, we must assign more than we would like to the non-core, horizontal part of the T. In those areas it’s important to know your limitations and to know who in the team can help you out, rather than trying to commit too much to fleeting and unreliable memory. Part of what makes a successful software team is knowing who’s the person in the know about what.

It’s a challenge unique to IT: the expansion of the ecosystem forces us to specialize, while the insanely short life cycle of competing technologies forces us to choose wisely. It’s a winner-takes-all world. Why has Git become almost synonymous with version control? Is it so much better than Mercurial or Bazaar? Was VHS better than Betamax? No, but at one time it was all anybody used. Many innovations in hardware and software (like the web, OO, or relational databases) are comparatively ancient, and new ones are infrequent. But that stokes our enthusiasm for new stuff even more, and we exaggerate the competitive advantage in equal measure. Most new languages and frameworks stand on the shoulders of giants. They make life only incrementally better, faster, or more user-friendly. Once you deduct the costs of training and of porting or rebuilding your legacy, the putative benefits evaporate. Sensible CTOs know this, which is why Java and C# remain firmly in the saddle.

Every attractive upgrade of your LinkedIn skills section carries an opportunity cost: choosing A means foregoing the potential benefits of B. The original Latin meaning of the word decide (to cut off) is beautifully apt. You deliberately cut off other avenues. From a business point of view, it’s risky to put your money on a pretender to the throne that might not make it. As an individual with time to spare, that consideration doesn’t matter, as long as you learn something useful. Still, as a Java developer, I don’t need C# in my ‘horizontal’ skill set. Your non-core skills should complement your core skills, and C# does not complement Java: it’s an alternative ecosystem. On the other hand, I can warmly recommend learning Kotlin and Scala, or even real functional languages on the JVM like Clojure and Frege. Even if they don’t make a visible dent in Java’s market share, they can make you a better all-round programmer.
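To give a flavour of what such a complementary language buys you, here is a minimal Kotlin sketch (a toy example of my own, not taken from any real project). Data classes and compiler-enforced null-safety are exactly the kind of ideas that rub off on the Java you write afterwards.

    // A complete, immutable value type with equals, hashCode and toString: one line.
    data class Customer(val name: String, val email: String?)

    fun main() {
        val customer = Customer("Ada", null)
        // The type String? forces you to handle the absent case explicitly,
        // instead of discovering a NullPointerException in production.
        val domain = customer.email?.substringAfter('@') ?: "unknown"
        println("$customer has mail domain $domain")
    }

The Java equivalent needs a record or a handful of boilerplate methods, and nothing in its type system stops you from dereferencing that null email.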

Just-in-time learning is probably the only viable strategy to keep up. Until a year ago, I worked on a large-scale application in Scala. At present, I’m not doing Scala, not even as a hobby. I haven’t explored all the cool new features in version 3, but should a cool new Scala project come along, I’m sure I’ll be up to speed soon. Compared to learning Chinese, which I did a long time ago, every programming language is a piece of cake.
