A need for a better hiring process

Software engineering is a relatively new field. We are still trying to find a common denominator, a set of patterns, something that gives us a better overview of how things work, so that we can narrow them down and apply a process – like we do in any other engineering field, e.g., automotive, pharmaceuticals, etc. Something measurable, because when we have numbers we can actually do something. Without numbers it’s chaos and gut feelings, unicorns and failures.

One of those things we need to get better at is certainly the hiring process.

I am amazed by the number of job postings out there that don’t always fit their own descriptions – it almost seems they were written by someone within the company who dreams about that job day and night, a sort of Promised Land:

  • You will design and implement highly scalable services that serve customers all around the globe – we already have 20 million customers!
  • We work hard but also play hard – we have regular kicker tournaments and the best espresso machine
  • We want you to learn and get better, so we give you $2,000 per year to spend on books and courses of your choice

These are only a few of the promises. Sometimes they are quite good – I would not mind having $2,000 to spend on books and courses, to be honest. However, you will often find on Glassdoor that the same company offering you the Promised Land position also has the following cons, which often contradict what it claims in its job description:

  • management should trust employees more
  • no career opportunities
  • employees are expendable
  • high turnover rate

Similarly, I am quite certain that not every job applicant plays by the rules, and those boring years spent maintaining legacy software will get more than a little polishing on the résumé.

While this is certainly a grey area, where ethics, self-promotion and personal/professional needs are different forces fighting against each other – or maybe I should say “together” – I am quite aware that people lie, so I prefer to focus on what the person and the company have to offer and how good their interactions are.

Ideally, everyone tells the truth during the hiring process. For that truth to come out, however, it needs to be told, written down somewhere, made visible, and all potential ambiguities need to be cleared up once and for all during the interview process – questions need to be asked, especially difficult ones if you have doubts. There are plenty of websites that help with what to ask; just do your homework.

Also, let’s not forget that companies need to find the right candidate, because a new hire’s first few months are pure cost (I think I read about this for the first time in Peopleware).

How can companies and candidates get to know each other better?

I believe that the best way to improve the hiring process in our field is by providing constant feedback and by putting the candidate right into some of the team’s dynamics. Some companies I interviewed with in the last few years offer some feedback, but it’s almost never meant as a way to get to know each other. It’s like being at school over and over: How did I score? Oh, cool, I made it!

For example, by analyzing what our daily tasks are, we could try to find out how candidates perform during a normal working day. This way, the feedback is mutual – maybe the candidate doesn’t like the office, the managers, the colleagues, etc., while maybe the company doesn’t like how the candidate fits into the team. If that’s not possible, we could provide remote tools to support them. Accepting a candidate or a company shouldn’t happen as soon as possible, but as soon as both are convinced about each other.

I do understand that we, as human beings, tend to repeat what we know best; it’s smart because we reuse what we already know and don’t need to invest new time. However, it doesn’t have to be that way – at the very least, when you find out something doesn’t work, it should be adapted a bit. So, I am talking to all HR people: please, do something about it. Don’t hire people just to fulfill expectations, to fill the empty desks, to reach the desired head count. Talk to your HR managers, to the CTOs, and so forth; discuss it. Find a better way to hire people.

Struggling with CI and CD

Continuous Integration and Continuous Delivery are two practices that are supposed to boost the development, integration and delivery of features, bug fixes, etc.

Some may claim that they actually improve only the QA/release side of things, but I like to take the position that “done means released” – this is certainly not something I invented; I can’t remember which book I took it from (maybe Continuous Delivery?).

Software development has changed: it has become an industry. Yet we still struggle to reach common definitions. We are full of best practices, de facto standards, and so forth, yet it’s so easy to end up with people who lack a clear understanding of certain topics, of the reasons things could be done in a certain way instead of “because it has always worked for us”.

However, I have noticed a particular pattern: when something is a “practice”, it often gets misunderstood. The best example is the REST web service. I really don’t want to get into this discussion again, because REST is an interesting concept, yet it has a million implementations – so many that developers eventually wind up frustrated, each with a different idea of what REST is.

Pragmatism is certainly important in our field, yet sometimes it’s important to have a common dictionary, so that we can refer to roughly the same concept, especially within the same company. This is unfortunately not always the case. CI and CD, along with many other examples, are only the tip of the iceberg, because I think they are difficult to implement.

Why so? Because it’s actually difficult to understand how you want to do things and what you want to achieve with them. A practice, unlike a tool, needs to be understood first; it can be adapted to your needs, why not, but it should not be seen as a savior, because it’s not going to solve all your problems.

If you want to “do” CI and CD correctly, you need to have the right problem, the right mindset, and the right tools. I would personally not do CD with medical software or space components – my experience in those fields is too limited to say much, so it’s just a rough idea.

However, given the right problem (which is difficult to define!), you need people with the right mindset: we need to do something new, and this is going to have a severe impact on how we do things here – you will no longer be “a QA” or “a dev”; you will take care of everything from A to Z. Literally.

And, of course, you need the right tools, otherwise it becomes a pain in the *** to manage all these pipelines, tasks, failures, rollbacks, etc. Fortunately, nowadays we have plenty of them, many even open source.

So, what sort of problems do CI/CD try to help us solve?

I would say that the first and foremost problem they help us with is time to market – which is essential in business. Then everything else comes almost as a consequence, “batteries included”:

  • code is always in a deployable state
    • it has passed all the QA rounds of testing, etc.
      • it offers a feeling that things are safe/green, which is always good to have
    • it was built, so it’s ready to be installed, etc.
  • tendency to have metrics-based pipelines
    • for example, if some component doesn’t reach X% coverage, it won’t be promoted to the next stage (a minimal sketch of such a gate follows this list)
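To make the idea concrete, here is a minimal sketch of such a promotion gate. It assumes a Cobertura-style coverage.xml report and a hypothetical pipeline that promotes the build only when the script exits with code zero – the names and the threshold are illustrative, not a real setup:

    import sys
    import xml.etree.ElementTree as ET

    # Hypothetical promotion gate: read a Cobertura-style coverage report
    # and fail the pipeline step if line coverage is below the threshold.
    THRESHOLD = 80.0  # the "X%" from above; pick something realistic

    def line_coverage(report_path: str) -> float:
        # Cobertura reports expose an overall "line-rate" attribute (0.0-1.0)
        root = ET.parse(report_path).getroot()
        return float(root.get("line-rate", 0.0)) * 100

    if __name__ == "__main__":
        coverage = line_coverage("coverage.xml")
        print(f"Line coverage: {coverage:.1f}% (threshold: {THRESHOLD}%)")
        if coverage < THRESHOLD:
            print("Gate failed: the build will not be promoted to the next stage")
            sys.exit(1)
        print("Gate passed: the build can be promoted")

The pipeline simply runs this script as one stage; any non-zero exit code blocks promotion, which is how most CI servers interpret a failed step.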

What kind of mindset does it require?

It asks people to take what they have always done, wrap it up, and throw it away. Sometimes it even asks them to wipe their *** with it. Pardon my French.

It requires people to think in a way that is deterministic, repeatable, stateless, yet in units that have to be integrated to make “the whole greater than the sum of its parts” (just to mention Aristotle).

It’s not enough to say “we need to commit on a daily basis”. It just doesn’t work that way. The same applies to “we need to achieve X% coverage so that the builds are self-testing”, where X is a ridiculously high number (considering that current coverage is below sea level). That will not only be counterproductive, but will also end in frustration. One gentler alternative is sketched below.
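Instead of a fixed, unreachable X, one could ratchet the threshold: the build fails only when coverage drops below the best value seen so far, so the bar rises at the team’s own pace. This is a minimal sketch under the same assumptions as before (a Cobertura-style coverage.xml, plus a hypothetical coverage_baseline.txt kept under version control):

    import sys
    import xml.etree.ElementTree as ET
    from pathlib import Path

    BASELINE_FILE = Path("coverage_baseline.txt")  # hypothetical file in the repo

    def current_coverage() -> float:
        # Same Cobertura-style "line-rate" attribute as in the previous sketch
        root = ET.parse("coverage.xml").getroot()
        return float(root.get("line-rate", 0.0)) * 100

    if __name__ == "__main__":
        coverage = current_coverage()
        baseline = float(BASELINE_FILE.read_text()) if BASELINE_FILE.exists() else 0.0
        if coverage < baseline:
            print(f"Coverage dropped: {coverage:.1f}% < baseline {baseline:.1f}%")
            sys.exit(1)
        # Ratchet up: today's coverage becomes tomorrow's minimum
        BASELINE_FILE.write_text(f"{coverage:.1f}")
        print(f"Coverage OK: {coverage:.1f}% (baseline is now {coverage:.1f}%)")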

Depending on where you work, this may be harder or easier to implement. Having the right tools helps a lot; educating people helps even more. In my opinion, the best way to achieve something here is to take the time to explain the value this new approach offers compared to what has been done so far, together with all the challenges it implies.

What’s the lesson here?

I think progress is never easy to achieve. It takes courage, an open mindset, some stubbornness, and sometimes also the honesty to say “ok, it doesn’t work this way, maybe it needs some improvements”.

Further Reading

There’s plenty of material for learning about CI/CD; however, some of the most important articles about these two practices are by Martin Fowler:

  • Continuous Integration – https://martinfowler.com/articles/continuousIntegration.html
  • Continuous Delivery – https://martinfowler.com/bliki/ContinuousDelivery.html

Neolithic

I guess that when most people hear the word “Neolithic”, they simply think of something old – not very old, but old enough to date back to a past in which our ancestors could not eat a juicy pizza or enjoy a fresh beer. It seems they’d be wrong in both cases.

The funny thing is that most people use the term “Paleolithic” to label something very old, something that has “aged”, something you wouldn’t do any more – most of the time with a relatively negative connotation. Paleolithic and Neolithic, though, are both part of the same larger period known as the Stone Age.

However, going back to the Neolithic, I think it should be reconsidered, this time with a more correct and modern interpretation – the word should actually be associated with the “Neolithic Revolution”, which according to Wikipedia was:

[…] the wide-scale transition of many human cultures from a lifestyle of hunting and gathering to one of agriculture and settlement, making possible an increasingly larger population.

Now, looking at our history, we can often find that we humans – or I should say Western people, as I don’t have enough knowledge about how this whole Stone Age business is perceived in the East – tend to think that there are certain periods that produced such “wide-scale transitions”. One recent example is industrialization.

I frankly dislike imagining that such periods happened suddenly, as if our ancestors in the Paleolithic were all limited or incapable of farming, or as if people in the Middle Ages were all ignorant and illiterate – let’s recall a few things that we still use and that were created in roughly such a “bad period”: glasses, universities, printed books. Progress is not something that happens right away, and if people in the Neolithic had the chance to improve their farming techniques, I am quite sure they also had the “poor” guys born in the Paleolithic to thank.

Why am I being biased against the Paleolithic then, claiming that they were “poor” guys? Well, that’s exactly the point. This is the main issue we face when we look at something that happened in a relatively remote historical period – we almost always think that in the past everything was bad, terrible, etc.

Why Neolithic then? Well, at this point you should have guessed it already. If we are who we are and we have what we have, we certainly have the progress and improvements built up over the years to thank. Oh, I almost forgot: of course, today is better than yesterday.