Saturday, 16 April 2016

Why administer coding tests?

Coding tests are becoming more and more common as a way of filtering applicants. However, it seems to me that employers are not thinking about what they want to learn from the candidate's approach to the test. Increasingly, sites like Codility are being used as initial screens, yet with such a test the interviewer cannot see the candidate's thought process or their coding style. What is more, because these tests have a time limit they create almost as much stress as coding on a whiteboard in a face-to-face interview.

All I need is the right coding test and
The right candidate will pop out.
In my view a coding test should be used to answer at least the following questions:

  1. Is the candidate's understanding of English (say) good enough to let them do the job?
  2. Does their coding style fit the company's needs and/or the culture of their prospective team?
  3. Are they a “get it done” person or a “get it right” person?
  4. If they are unable to complete the test, would they be able to perform in the workplace if mentored?
  5. If they do not complete the test, does this indicate an inability to handle stress? If so, does it matter?

And this cannot be achieved with remote timed tests administered by a third party.

One simple test is to ask them to write fizzbuzz: for the integers from (say) 1 to 100 inclusive the applicant should produce code that prints “Fizz” if the number is a multiple of three, “Buzz” if it is a multiple of five, “FizzBuzz” if it is a multiple of both, and the number itself otherwise.
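For concreteness, here is a minimal fizzbuzz sketch (in Python rather than the Java 8 stream version discussed below; the function name is mine):

```python
def fizzbuzz(n):
    """Return the fizzbuzz word for n, or n itself as a string."""
    if n % 15 == 0:          # multiple of both three and five
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

for i in range(1, 101):
    print(fizzbuzz(i))
```

Trivial, yes, but that is the point: the interesting question is not whether the candidate can write it but how they go about it.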

As an exercise I programmed fizzbuzz using a Java 8 stream in order to further internalise the Java 8 functional paradigm. I started with a slightly different algorithm from the obvious one that first came to mind and, in the spirit of a timed test, reverted to the simpler algorithm at the first sign of difficulty. This is another risk of a timed test: the candidate will run with the first solution they think of, and this may be neither optimal nor creative. There is a universal tendency to think of a superior solution two minutes after the test ends.

Another problem with tests is that a candidate who performs well may be a details person to the extent that they cannot see the wood for the trees and, once in post, will miss design problems in your codebase.

David Saintloth [1] notes that FizzBuzz-type tests probe knowledge of the minutiae of a given programming language, and that syntax errors are often punished harshly even when it is clear the candidate understands the algorithm they wish to use. Further, he says

fizz/buzz tests ONLY test programming (and in that not very well either). What people want to hire though are engineers not programmers....but they test for programmers, job requests are programmer focused (listing a flurry of buzzwords) which leaves potentially brilliant engineers out in the cold.

Ericsson [2] notes that aptitude tests, and coding tests are a form of aptitude test, predict short term performance but not long term or even medium term performance. Coding tests pander to the tendency for developers to focus on tiny details rather than the big picture. As an example, I noted in a post on code reviews [3] that when a comment in [4] criticised a minor and irrelevant aspect of the code presented, a large number of follow-up comments discussed that code and totally hijacked the discussion: this is a bit like judging a painter by a single brush stroke rather than the picture as a whole, or by how they hold the brush rather than what they do with it.

Some people say that rather than coding tests, candidates should be asked to show code samples or discuss projects they have undertaken. These solutions have their own problems. Code produced for an employer or client is often covered by a non-disclosure agreement, and some developers deliberately refrain from spare-time coding for a number of reasons: for example, to avoid burnout or domestic friction, or to study other things, whether architecture or something totally unrelated to technology. The argument that such programmers are not “passionate” about technology is specious: it invokes the “Real Programmer” syndrome and implies that the developer is, or should become, obsessed with technology to the exclusion of all else. This of course benefits managers trying to squeeze more out of their resources before throwing them away when they burn out [5].

In Brief

If coding tests are to be used, the employer must decide what they want to learn from them rather than just applying them as a silver bullet for the probably unsolvable problem of picking, if not the best candidate, then one capable of doing the job. It is possible that in the future AI techniques will be better able than humans to pick the “right” candidate, but such an AI could, and probably would, pick another AI to do the job.

More seriously, filtering candidates by automated or semi-automated screening not only risks excluding potentially brilliant candidates but ignores the heuristics that cultural fit is more important than technical ability, that technical weaknesses can be trained out, and that a hire who turns out unable to do the job for which they were hired may do brilliantly in another role, something that may only be discovered after they are hired.

There is a role for coding tests but using them to screen applicants automatically suggests that the fabled shortage of developers is indeed a fable: If there were a real shortage employers would be willing to take on and train candidates they now reject.

Further reading

  1. Ericsson, Krampe and Tesch-Römer: The Role of Deliberate Practice in the Acquisition of Expert Performance: Psychological Review 1993, Vol. 100, No. 3, 363-406
  2. Ways to make code review not suck
  3. Karojisatsu: Mental health and suicide risks posed by IT culture for Managers and Developers

Wednesday, 13 April 2016

Introduction to SIMD parallelism

Physical limitations seem to have taken the Von Neumann architecture to its limits. Modern hardware, even consumer hardware, generally has multiple processors that transparently take on different tasks. As well as the now ubiquitous multicore machines on desks, coffee bar tables and trains, a standard laptop contains specialised processors like the Graphics Processing Unit (GPU).

The conceptual model for programming languages however is still largely based on Von Neumann's concepts. Developers do not generally have to “Think Parallel” and even concurrency, the intelligent android's equivalent of multitasking, is a minority “sport” riddled with pitfalls for the unwary and opportunities for its players to brag about superiority to other developers.

But we need to start thinking parallel, not just concurrent. There are at least three types of parallelism: SIMD (Single Instruction, Multiple Data), MISD (Multiple Instruction, Single Data) and MIMD (Multiple Instruction, Multiple Data). Historically parallelism has required dedicated hardware, but recent trends such as Map-Reduce allow parallelism to be achieved using commodity machines, sometimes within a cloud environment.

Since 2006 supercomputers have been based on the MIMD paradigm. Technologies like Hadoop that are conceptually SIMD actually break the lockstep nature of SIMD computation and are based on an SPMD (Single Program, Multiple Data) paradigm, which is, according to Wikipedia, the most common form of parallel programming today.

There are currently efforts to produce a SIMD API for Javascript, though this requires a dedicated SIMD register; the amount of data that can be handled in parallel depends on the register's word length, and, like other recent efforts, it seems to be one-dimensional SIMD.

One Dimensional SIMD

One dimensional SIMD involves simultaneously applying the same function to one dimensional arrays (vectors) of data, and requires specialist hardware. Here the hardware is considered to be a processor array.

Example: if a = [1,2,3] and b = [2,3,4] then a+b = [3,5,6], with each of the three additions involved being carried out simultaneously in a different processor.

If hardware support for shifting data within the processor array is available, as was the case for the long defunct AMT Distributed Array Processor (DAP), operations like summing the contents of a vector can be performed in O(log2 N) parallel operations, where N is the size of the processor array.

Define SHLP(a,N) as shifting the contents of an array a left N places in planar fashion, with zeros entering on the right. Then for an array of 2^k elements the following code will leave the sum of the elements in the leftmost position (shown here for k = 3):

a → a + SHLP(a,1)
a → a + SHLP(a,2)
a → a + SHLP(a,4)
The following example, for an eight-element array, illustrates what is going on; the numbers indicate the position of the data in the original array:

[1+2, 2+3, 3+4, 4+5, 5+6, 6+7, 7+8, 8+0]
[1+2+3+4, 2+3+4+5, 3+4+5+6, 4+5+6+7, 5+6+7+8, 6+7+8, 7+8, 8]
[1+2+3+4+5+6+7+8, ...]

This is just the way you simplify a long set of additions, for example when checking your shopping bill: reduce it to multiple additions of two items and repeat until only one addition is left.

NOTE this is not the same as

a → a + SHLP(a,1) + SHLP(a,2) + SHLP(a,4)
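A plain Python emulation of this reduction (function names mine; I assume, as above, that zeros enter on the right as data shifts out on the left) makes the log2 N behaviour concrete:

```python
def shlp(a, n):
    """Planar shift-left by n places: data moves left, zeros enter on the right."""
    return a[n:] + [0] * n

def simd_sum(a):
    """Sum 2**k elements in k 'parallel' additions; the total lands at index 0."""
    shift = 1
    while shift < len(a):
        # each comprehension stands in for one parallel addition across the array
        a = [x + y for x, y in zip(a, shlp(a, shift))]
        shift *= 2
    return a[0]

print(simd_sum(list(range(1, 9))))  # 36, in three additions rather than seven
```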

Two Dimensional SIMD

Assume a two dimensional square array with four nearest neighbour connections.

Let SHEP(X) denote shifting the contents of the array one place east (right) and SHWP(X) one place west (left), in planar fashion, with SHUP(X) and SHDP(X) defined in the obvious way.

If we want to replace the contents of each processor with the sum of itself and its four nearest neighbours we write

x → x + SHEP(x) + SHWP(x) + SHUP(x) + SHDP(x)

If instead we want the sum of itself and all eight nearest neighbours (the full 3×3 neighbourhood), the naive approach needs 8 parallel additions, one per neighbour. But if we first form the horizontal triples and then sum those vertically

r → x + SHEP(x) + SHWP(x)
x → r + SHUP(r) + SHDP(r)

we need only 4 parallel additions rather than 8.
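Computing the 3×3 neighbourhood sum in four parallel additions can be emulated in plain Python (function names are mine; I assume zeros enter at the array edges, which real hardware might handle differently):

```python
def sh(grid, dr, dc):
    """Planar shift of a 2D grid by (dr, dc) places; zeros enter at the edges."""
    n, m = len(grid), len(grid[0])
    return [[grid[r - dr][c - dc] if 0 <= r - dr < n and 0 <= c - dc < m else 0
             for c in range(m)] for r in range(n)]

def add(a, b):
    """One parallel addition across the whole processor array."""
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def sum3x3(x):
    """Sum of each cell and its eight neighbours in 4 parallel additions."""
    r = add(add(x, sh(x, 0, 1)), sh(x, 0, -1))     # horizontal triples: 2 additions
    return add(add(r, sh(r, 1, 0)), sh(r, -1, 0))  # vertical pass: 2 more additions
```

On a 3×3 grid of ones the centre cell becomes 9 and each corner 4, as expected for a block sum with zero padding at the edges.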

SIMD parallelism and Map-Reduce

SIMD parallelism has been criticised as having limited application, for example to rasterisation. Without support for nearest-neighbour connections, SIMD parallelism is conceptually similar to the Map-Reduce paradigm in that the same operation is carried out on data distributed over multiple processors.

Map-Reduce, as implemented in Hadoop or PySpark, requires a complicated infrastructure that allows a processor cluster to be extended as needed and deals with concurrency and other issues. Straightforward SIMD, by contrast, is generally based on hardware that cannot easily be extended with more processors, while Hadoop and its kin can use commodity hardware.

It is, however, more difficult to implement the kind of optimisations described above since, for example, each line of a file produces a key-value pair and each such pair may go to a different processor.

With support for nearest neighbour communications the picture changes. I have in the distant past used the two dimensional SIMD described above to write algorithms for Sorting, Fast Fourier Transforms, Image Restoration by Simulated Annealing, Optic Flow Estimation for computer vision and Cellular Automaton models of Polymers. I also used this form of parallelism to emulate the microcode of an experimental parallel optical computer in order to develop algorithms that could test the laboratory hardware.

Wrapping up
SIMD parallelism without support for interprocessor connections may be limited, but when this support is added a much wider range of applications can be tackled. Map-Reduce as implemented in Hadoop is conceptually an SIMD process but needs far more complexity to handle asynchronous computation on multiple unreliable processors, and two-dimensional SIMD is hard to implement using map-reduce in Hadoop.

Wednesday, 16 March 2016

Code reviews: do we need them?

Code reviews are a potentially beneficial exercise but suffer from social, cultural and managerial problems and the major benefits claimed for them can be obtained in other ways.

Yahoo have abandoned QA and passed responsibility for quality assurance to developers, claiming that this sped up delivery and reduced the number of bugs present. One claim was that developers had been relying on QA to find bugs, and that passing the responsibility for quality back to developers led them to test their code more thoroughly. If true, this suggests code review could be causing similar problems.

Code review Vs Peer Review
Code review is also known as Peer Review and the name may have been suggested by academic Peer Review (which itself has deficiencies). Academic Peer Review involves anonymous reviewers who ideally do not know the authors of a paper either and submit a list of suggestions with a recommendation either to reject or accept the paper. The criticisms are generally restricted to substantive matters and a reviewer may refuse to review a paper for various reasons.

Code reviews by contrast are not anonymous and often involve members of the same team. Reviewers can and do use the process to settle personal scores, indulge their egos, hunt for trivial mistakes and, in the spirit of Agile, adopt whatever criteria they desire and change them at any time. Furthermore reviewers seldom, if ever, turn down an invitation to review other people's code.

Benefits of Code review

One touted benefit is increased bug detection; for example [7] cites McConnell's book Code Complete:
software testing alone has limited effectiveness – the average defect detection rate is only 25 percent for unit testing, 35 percent for function testing, and 45 percent for integration testing. In contrast, the average effectiveness of design and code inspections are 55 and 60 percent. Case studies of review results have been impressive:
However the detection rate for unit testing depends on coverage and the number of test cases handled. One contributor on Quora stated that a simple between() method requires 259 test cases. It is unrealistic to assume a developer can identify and test more than a small fraction of these, and equally unrealistic to assume code review will identify potential bugs for all 259 cases.
Another issue is that function and integration testing tend to reveal bugs between components while unit testing reveals bugs within components and the inclusion of design reviews in the figures is misleading: design flaws are not the same as programming bugs.
Assuming each testing stage reveals new bugs, a naïve composition of the figures above suggests that around 73% of all bugs would be detected purely by testing.
Again taking a naïve view, if design reviews identify 55% of bugs, then 45% remain; unit testing eliminates 25% of those, leaving about 34%; function testing reduces this to about 22%; and integration testing reduces it to about 12%. Thus design review plus standard testing identifies roughly 88% of potential bugs.
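These compositions are easy to check with a few lines of Python; the chaining assumes, naively, that each stage finds the stated fraction of whatever bugs remain at that point:

```python
def remaining(rates):
    """Fraction of bugs left after stages that each find the given fraction
    of the bugs still present (naive independence assumption)."""
    frac = 1.0
    for r in rates:
        frac *= 1.0 - r
    return frac

testing_only = remaining([0.25, 0.35, 0.45])       # unit, function, integration
with_design = remaining([0.55, 0.25, 0.35, 0.45])  # design review added first
print(round(1 - testing_only, 2), round(1 - with_design, 2))
```

This prints 0.73 and 0.88: testing alone finds about 73% of bugs on these figures, and adding design review takes the total to about 88%.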
The reduction in bugs found after code review was introduced, cited in [7], sounds impressive but may have resulted from developers taking more time to review their own code. Code reviews find only 4% of defects, while developers will find more by reviewing their own code after a period of incubation [8].
While it is intuitively plausible that code reviews result in increased bug detection rates it seems unclear whether this is significantly more than would be revealed by extensive testing involving full code coverage and identification of all possible test cases.
Another alleged benefit of code reviews is diffusion of knowledge within the team. In particular, developers junior in knowledge of the code base are supposed, by the magic of joining reviews of code they do not know well, if at all, to gain knowledge of how a small fragment of code fits into the larger system. This is like giving someone a grain of sand and expecting them to know how it fits into the beach, or showing them a one-centimetre square of a photograph and expecting them to know where in the image it fits. I don't buy that: there are other ways to transfer this knowledge.
Code reviews are also supposed to enforce standards. While standards are good (which is why there are so many competing ones), they are guidelines, not straitjackets, and coding standards are better enforced using (for example) automated formatters.

Problems with code review
The problems with code review are largely sociocultural and management related.
A respondent in [4] said of code reviews
Every time I've done them they've been a pain in the neck. It turns into a geek fest with the nerds looking for spelling mistakes in the comments and lines that aren't indented properly rather than checking the code does what it's supposed to.

Partly the problem is that they're called CODE REVIEWS not PROJECT REVIEWS so no-one checks the code matches the spec. The other problem is the level of immaturity in developers: they want to have some impact so half the changes will be renaming variables "because they don't like them".

Technical reviews here involve getting yelled at for missing the comparison off IF statements. Yes, here we really do have to write C++ that says:
  if (go_do_something() == TRUE)...

And every class needs to have an argument held about whether one "really" needed to create it and whether it makes the code "too complicated".

As if to prove her right, the discussion then became dominated by posters arguing why if (go_do_something() == TRUE)... is the one right way of coding such a statement.
Her comment resonates with me, having seen code reviews dominated by missed indents and commas not followed by spaces rather than substantive discussion, and having been told not to use the standard formatter because it destroys the blame function (which I feel should not be relied on anyway). Most such nitpicking developers would, I think, not survive in the freelance world, where every project has different standards or none. I am told that indents and so on improve readability, but I feel any reasonably competent developer would either not notice or simply fix the problem and move on. I also feel that coding standards, while theoretically good, can be used, like Agile, to enforce conformity and reduce any tendency to creativity and individualism. This is of course what management desires but may not be best for the enterprise.
Another problem is inconsistency: Code acceptable in one review may be rejected in another, apparently randomly as described in [6], the author of which argues that social problems should be fixed before technical problems. A third problem is that reviews may be dominated by the reviewer with the loudest voice, strongest bladder, greatest stamina and strongest opinions. If they are also regarded as extremely competent/talented but are wrong this could fossilise problems into the code.
Some of these problems, such as geek fests, are to do with IT culture, which is a youth culture, while others, such as domination by the loudest voice, are universal. That they occur at all is a result of poor management. A team should be self-organising, but management should guide it, first to standardise what is examined in a code review and second to apply standards consistently, not randomly. The tendency to use code reviews, whether face to face or via a collaborative tool, as a substitute for social intercourse during working hours, as a team-bonding technique or as a rite of passage should be strongly deprecated. It should also be noted that collaborative review tools can make reviewers less diplomatic, more prone to wounding remarks and more likely to focus on trivia.
When should we review code and why?
Of course there are times when manual code review is essential: safety, security and compliance, for example. I would be unhappy, either as a developer or a manager, if safety-critical code were not reviewed by a dedicated team, and nowadays code should be reviewed by the security team to look for vulnerabilities missed by developers (and, as a digression, automated tests must be written for any safety bugs or security holes), while the legal department would be donning brown corduroy trousers if a compliance review were not carried out. But all these are special cases, to be executed by specialist teams, not the original development team.

Alternatives to code review
There are alternative ways to get the benefits attributed to code review.
Research shows that as test coverage increases the percentage of bugs detected approaches 100% [9]; at 95% coverage approximately 90% of all bugs are detected. A realistic minimum test coverage, say 90%, should therefore be agreed within the development team. The test cases to include in the unit test suite should be reviewed by an experienced tester together with the developer.
Collective ownership need not and should not be confined to code reviews, and when working on code simply “fixing the immediate problem” should not be an option. At the very least further changes, including refactoring, should be raised as tickets and triaged, and every developer should know they own all the code. Of course collective ownership, as in real life, can result in dilapidated and depressing buildings and code.
Enforcing consistent style need not involve manual review at all: many problems can be eliminated by installing tools like Checkstyle, PMD and FindBugs and making running them mandatory before committing code.
Mentoring and the transfer of knowledge from more experienced developers to less experienced ones can be handled by pair programming, high-level documentation and a period of formal mentoring. In any case, a second pair of eyes on the code during implementation can reveal bugs as they arise.
Essentially all the benefits attributed to code review can be attained in other ways, probably more efficiently.
The code review has become a staple of the high formality (pseudo? Agile?) development process adopted by many enterprises and a small industry has grown up selling tools for collaborative code review. However in 15 years as a freelance developer I recall few occasions where I was obliged to submit my code for review. And I heard no complaints about the functionality or the quality of my code.

The benefits of code review can be obtained in other ways, and the time used for code review better spent. Specialist reviews, such as for security, safety and compliance, should be carried out by specialist teams. One should also bear in mind that reviewing code is reviewing late: the specification, acceptance criteria and test cases need to be reviewed before coding starts.

The problems with code review are largely a result of corporate culture, the sociology of the IT community and industry and business culture in general and management failings.

Code review could be a good thing but like Agile it can be implemented in a way that destroys morale and creativity and may further management incentives to cause the project to fail [11] or, in accordance with Putt's laws, to cause a crisis that the manager can solve thereby furthering their career.

As with Agile [12,13], it seems management, among others, can and do take a potentially good idea like code review and ruin it. Maybe we should ask why this is such a common pattern, and require those who propose process improvements like Agile, code review and continuous integration to consider, before publishing their proposals, the ways management and “geeks” can ruin a brilliant idea and make it counterproductive.

Further Reading

  1. Another set of Critiques of Yahoo's elimination of QA
  2. Ways to make code review not suck
  3. Estimating the Number of Defects: A Simple and Intuitive Approach: Michael Naixin Li, Yashwant K. Malaiya, Jason Denton
  4. Estimating Residual Faults from Code Coverage, Bishop
  5. Politics-Oriented Software Development

Monday, 4 January 2016

Karojisatsu: Mental health and suicide risks posed by IT culture for Managers and Developers

Karoshi is a Japanese term meaning “death from overwork”. While most of the discussion around karoshi deals with older workers dying from heart attacks and strokes, the Japanese have also identified “karōjisatsu”: suicide due to overwork [4]. It is not unknown in the USA [4] and may occur in the UK.

Information Technology is not a profession where you can easily stay healthy, sane and fit, and have a life outside work. Some of the problems are to do with the culture of the industry, some with business culture in general, and some with Western culture and the Protestant Work Ethic, which regards even valueless work as valuable in itself and has mutated into the Western Employment Ethic, under which work is not regarded as work unless an employer is involved.

I have discussed the physical problems and touched on the mental problems elsewhere [6], but the psychological problems and the risk of burnout, breakdown or even suicide need further exploration. I can only comment from experience on the technology industry in the UK, though I get the impression that similar issues arise in finance, and some may be peculiar to IT. I note the deterioration in the appearance, and possibly the health, of senior politicians. I once went for an interview for a university lectureship, which I did not get. A year later, for reasons I forget, I had to register for a course taught by the successful candidate: at the interview they had looked 30, and after a year they looked 45, with grey hair and signs of stress. Maybe my guardian spirit was looking after me. Fifteen years' experience as a contractor in various European countries strongly suggested that the problems I mention here are much rarer in mainland Europe than in the UK and USA (those countries have their own corporate and industry dysfunctions). It also suggested contracting is, apart from the regular financial crises involved, better for mental health.

Some of the causes of karoshi are excessive hours, all-night work and holiday work, plus the stress caused by being unable to meet company goals and by screwed-up management.

Managers are not immune. They may have to lay off staff and feel guilty for being unable to protect their staff.

All this reduces morale and performance, often for no reason other than overly aggressive deadlines and macho posturing.

Long Hours

There is a longstanding consensus that a forty-hour week is optimal, which may be true for physical work but is almost certainly untrue for intense mental work. In Europe lorry drivers are restricted in the number of hours they can drive because a fatigued driver is a hazard; companies should similarly restrict the hours their IT staff and other brain workers put in, since IT work is no less tiring than driving. Sweden has recently introduced a thirty-hour week, and companies there report an increase in productivity and profit. The economy is likely to benefit as well, since workers have more free time in which to spend money.

Young IT professionals tend to regard burnout as a badge of honour, or at the very least a rite of passage, and try for regular 100-hour weeks: more than 14 hours a day, seven days a week. Nobody can keep that up. Nobody can maintain good performance like that. Nobody can stay healthy like that. Ironically, at one company a couple of contractors each put in time sheets for 360 hours in one month and the management response was to install time clocks; in most places I worked, contractors were not allowed to bill more than 40 hours a week without approval. In the UK and US, time clocks would more likely be used to note who was working long hours and to demonise others as slackers or uncommitted.

Moves to eliminate a long-hours culture tend to be resisted by those who have benefited from it, whether in finance, in IT, or when considering the 80-120 hours worked by junior doctors in the UK. The response is inevitably of the form “It never did me any harm” (how do they know?). In the case of doctors the risk to patients is ignored, for as the old saying goes, “Lawyers bill their mistakes; doctors bury theirs”. Sometimes it takes a lawsuit for the company to change its ways.

Impostor Syndrome

Impostor Syndrome is the reverse of the Dunning-Kruger effect. To quote Wikipedia

Impostor Syndrome is a term coined in the 1970s by psychologists and researchers to informally describe people who are unable to internalize their accomplishments. Despite external evidence of their competence, those exhibiting the syndrome remain convinced that they are frauds and do not deserve the success they have achieved. Proof of success is dismissed as luck, timing, or as a result of deceiving others into thinking they are more intelligent and competent than they believe themselves to be.

The Dunning-Kruger effect is where people regard themselves as better than they are. Typically young IT professionals overrate themselves and more experienced professionals under rate themselves.

The risk is that programmers think they need to work harder to become good enough. That means spending more time coding, every waking minute, and taking on an increasing number of projects. And that leads to burnout and possibly suicide.

The incidence of impostor syndrome is around 40%, with a lifetime incidence of 70%, and men and women are probably equally affected. It is common in professions where work is peer reviewed, for example software development [9], though it seems rarer in academia, where reviews of a paper are expected, anonymous and regarded as helpful. It also helps that academic papers, other than conference papers, rarely have deadlines.
Impostor syndrome is not a mental disorder but rather a reaction to certain situations. Undue susceptibility to it can be identified through personality tests but does not seem to be a distinct personality trait. Sufferers tend to dwell on extreme failure, mistakes and negative feedback from others. If not addressed, impostor syndrome can limit exploration and the courage to delve into new experiences, for fear of exposing failure. High achievers, or those who have achieved a lot in the past, may well experience it in a new role.

A number of management options are available to ease impostor syndrome. The best is to discuss the topic with individuals early in their career path: most sufferers are unaware that others feel inadequate too, and once this is addressed they no longer feel alone in their negative experience. Listing accomplishments, positive feedback (a simple “well done” from managers) and success stories also help. Finally, a strong support system, which provides feedback on performance and discusses impostor syndrome on a regular basis, is invaluable for sufferers.

The Real Programmer Syndrome

The Real Programmer [11] is a cultural stereotype originating, possibly as satire, in 1983. Real Programmers disdain such luxuries as IDEs and, where possible, any high-level language, sometimes disdaining even assembler in favour of microcode.

A Real Programmer codes all the time and doesn't consider it work. They live to code.

A Real Programmer volunteers to work 60 to 80 hour weeks for no extra monetary compensation, because it's "fun"…

Management love Real Programmers, and the image of the Real Programmer is now in the DNA of the tech industry. IT has always had a long-hours culture, but now, unlike in finance or a Japanese company, workers are supposed to put in the hours out of sheer enjoyment of the work.

Impostor syndrome can lead people to think they have to work harder to become good, so they overload themselves. Then they slowly burn out. Sometimes they kill themselves [4], though doubtless some would say such people must have been mentally unstable, rather than blame the long hours.
The Older IT worker

It is no secret that the Technology Industry is Ageist. Mark Zuckerberg claimed young people are smarter but is doubtless redefining “young” with every passing year. By a strange coincidence the average age of Facebook employees matches his age exactly.

Older workers have more experience, which lets them be more productive, but the long hours they had to put in when younger and fitter make their bodies less resistant to the stresses of the job, in particular long hours. In some trades the younger people “carry” the older ones: in heavily physical jobs, for example, the older worker may find his team shifts him to lighter tasks. This does not happen in IT, and so the older worker is stressed by having to demonstrate “commitment” and “passion” not only to management but to younger “Real Programmer” wannabes.

Trying to balance family, work and learning new technologies in their own time (almost all companies refuse to fund training for their staff, reasoning that it is cheaper to hire a young person with new skills and a little experience than to train an older one) is a tall order. This leads to burnout and heart attacks [3].

Adam Smith, in The Wealth of Nations (1776), stated the risks of overwork admirably when discussing piece workers, and it applies to Real Programmers and the Technology Industry generally, even though Tech workers are not paid hourly or by lines of code. Note the eight year threat.

"Some workers, indeed, when they can earn in four days what will maintain them through the week, will be idle the other three. This, however, is by no means the case with the greater part. Workers, on the contrary, when they are liberally paid by the piece, are very apt to overwork themselves, and to ruin their health and constitution in a few years. A carpenter in London, and in some other places, is not supposed to last in his utmost vigour above eight years. Something of the same kind happens in many other trades, in which the workers are paid by the piece, as they generally are in manufactures, and even in country labour, wherever wages are higher than ordinary. Almost every class of artificers is subject to some peculiar infirmity occasioned by excessive application to their peculiar species of work."

To The Management

Discourage long hours wherever possible. Set an example and treat yourself well. Look out for unexpected changes in performance, whether sudden or gradual, especially for the worse. Remember that long hours are bad: Sweden has recently introduced a six hour work day, and this is about the limit a brain worker, such as a developer, or you yourself, can handle. The loss to the company of an experienced developer, architect or sysadmin who is burning out can be significant. You can manage such a person out and hire a replacement without damage to your career once, maybe twice, but ultimately the resultant missed deadlines and reduced project scope will be tracked down to you. Look after your reports and they will look after the business.

To the Worker

If you are in a company with a long hours culture, and this can be subtle, for instance only employees who work long hours being promoted, try to get your manager's support in balancing work and life. If they cannot or will not help, you should quit. If you are well paid, remember money is not always worth the price you pay to get it. Just look at any successful politician.

You may be starting to burn out without noticing it. If you come to hate Mondays, especially if the idea of going to work on Monday spoils your Saturday morning, something needs to be done, fast.

And do not rely on your manager to look after your health. Only you can do that.

I have done long hours as a contractor and damaged my health, but recovered. I did long hours again in a permanent role. Yet as a contractor I never suffered burnout, only as a permanent employee: the periods on the bench as a contractor let me recover and preserved my liking for coding, though the increasingly regular financial crises as I got older took a lot of the fun out of contracting. The relentless pace of the permanent role ("fast paced", "aggressive schedule" and so on) nearly killed me and it took some months to start recovering. I no longer have much desire to code, except in my head where I can devise algorithms easily, and am trying for a hands off managerial role, or a return to university for a career change if I can afford it.

Learn from my experience. For your own sake.

And if you need to talk to someone, the author of [4] has pledged to help anyone who needs it. If I could I would help others.

Summing Up

Technology Industry culture is dysfunctional. Some companies, bless their little cotton socks, do their best to look after their workers, but are trapped in this culture and unable to see their chains: a phenomenon best exemplified by those Nazis who said "But some of my best friends are Jews", or the cemetery in Ireland where Protestants and Catholics are buried in the same graveyard but an underground wall separates their graves, and this was presented as a move to break down barriers between the two sects.

Aspects of Technology Industry dysfunctionality, most notably the long hours culture, are shared by other industries, but there is no equivalent of Real Programmer Syndrome: an accountant may have to work long hours but is not expected to do so because it is "fun".

The manager should be on the lookout for Impostor and Real Programmer Syndrome and take steps to prevent or cure them. The Real Programmer may be good at coding but unable to see the big picture and design good code, let alone handle architecture. The employee should not trust their manager to look after their health.

  7. Yahoo have recently decided to abolish QA and found a decrease in the number of bugs in production. Together with evidence that code reviews are no more effective at finding bugs than requiring developers to set their code aside for a while and then review it themselves, this suggests that the code review process is damaging to developers and should at least be restructured. But that is a different topic.
  8. IT professional nails the Real Programmer.
  9. The Real Programmer
  10. Dangers of Overwork

Monday, 28 December 2015

Health Risks of IT for managers and workers.

You are a young software developer. Eight hours a day fly by; nine hours a day you notice, but hey, your code nearly works. Finally it works and you are winding down when you get an email from over the pond about an urgent problem. You fix the problem and finally depart after eleven or twelve hours. Most nights you also work on side projects or, since technology moves faster than politicians chasing bribes, investigate what you think will be the next big thing (you are probably wrong) and hope The Management adopt it. You dream you are coding.

Next day you do it all again, and the day after that. The management notice you and you are employee of the month.

Fast forward ten years, an extra five stone and 12 inches on your waist. The work is getting harder but your experience makes it go faster. Shame about the knee and back pain. You should exercise, but you groan at the idea and it's an interruption from coding. Shame you are still single with no one in sight. Shame you have no conversation other than your work. You survive on food from the vending machines and cheap kebabs. You have become one of the devs from CommitStrip. You dream you are coding.

You get married. For a while it works but, as with police work, your spouse comes second to The Job and eventually they leave you. They take the dog. You enter clinical depression and end up sleeping on the street, dreaming of coding.

This is slightly exaggerated, but programming can hijack your brain and leave you with autistic-like traits. There is always pressure, from peers or management, to work long hours, which you don't even notice because you love coding, right?

Offices are not the healthiest of work environments, and, like doctors, software people do not have the healthiest lifestyles. There are physical, mental and even social risks involved. Some of the hazards of the office are shared with other office workers, albeit experienced more strongly.

The physical damage caused by flying a desk can, if caught early, be reversed by small changes consistently applied. The mental, social and relationship problems may be impossible to fix.

To The Manager
You are not immune to this damage, even if your work is different. There is a longstanding tradition that middle managers work long hours. Senior managers work long hours networking with other senior managers on the golf course and in the bar (Criminals spend a lot of time in bars networking: to them that IS their office) while sending memos about timekeeping to the lower orders.

Look after yourself before looking after your reports: set an example by sticking to your contracted hours. Take breaks (note that face to face meetings give you a hint of exercise walking to a meeting room, and can help keep you healthy). If you must work overtime one day, compensate for it as soon as possible.

Take time to relax: switch off the phone after work.

Learn to say NO when overloaded.

Monitor your mental and physical health but avoid hypochondria. Maybe keep a diary.

Look after your reports. Make it clear that you expect long hours to be an occasional emergency measure. Avoid Death Marches. If need be, occasionally wander round in the evening and send any late working programmers home, unless they are working on a time sensitive critical issue, in which case try to work out how to prevent this happening again.

Watch out for signs of physical and mental issues and any change in the performance of your reports. There could be any number of reasons for this and you are duty bound to investigate.

Remember that problems like stress can be insidious and by the time they become obvious may be irreversible.

And do not feel embarrassed discussing the health of one of your reports with them. And try not to stigmatise someone with psychological issues.

To the Developer

Do not rely on your manager to look after your health. The culture in IT and business generally is to extract more and more from workers - and developers are increasingly becoming commodities to exploit – and the manager has their own pressures from above. If it is a choice between you and them guess which they will choose.

Stick to your contracted hours and take any time off in lieu of overtime as soon as possible (or get your manager's agreement to taking it as paid leave). If need be, keep a diary of the hours you work. Remember some places feel that if you cannot do your job in the normal number of hours you are probably not up to the job.

Learn to say NO.

Take regular screen breaks. If need be, walk around for ten minutes every hour.

Coding divides into Analysis, Inspiration and Perspiration. The latter is when you have a solution and are turning it into code. Inspiration comes when you are away from the machine, often at the least convenient times (Bed, Bath and Bus are the traditional places). So take on board that you do not need to be in front of a keyboard all the time and may become less productive if you are.

Without becoming hypochondriac, monitor your mental and physical health. If possible, buddy with a trusted colleague to look after each other. If married, ask your partner, siblings or children to be brutal in pointing out changes in your health and personality. "My god you have got fat" may be impolite but it could save your job, career or even your life. "You seem to be slowing down and your memory is worse" is an even bigger red light, possibly pointing to Alzheimer's in later life.

Physical Health Issues
You can lose muscle mass sitting in front of a computer all day. This messes up your ability to remain the light, slim athlete you used to be, because muscle is far more effective at metabolizing calories than fat. Muscle also weighs more, so you can lose weight yet still expand sideways while your face turns into a balloon. Your risk of diabetes goes through the roof. Poor diet and lack of exercise can cause cardiovascular disease, which not only affects the arteries round the heart but can also affect blood flow to the legs and other extremities, resulting in peripheral vascular disease, a serious condition that can lead to a heart attack or stroke.

Moving up to the gut, now expanded dramatically through poor diet and exercise aversion: you have a one in eight chance of having gained 20 pounds or more and a one in three chance of having gained more than 10 pounds. This is less than the bloat that affects those in financial services (one of the few professions with a culture of longer hours than IT). You may weigh less than when you started your career, but muscle is denser than fat, so you may still be having problems opening a jar of jam or even walking up two flights of stairs. Lunch at the local pizza parlour or bar makes matters worse.

Your beer gut, which would be appropriate on a darts player or Sumo wrestler, lays you open to heart disease, diabetes and other problems. Diabetes leads to a vast number of other physical and mental problems: blindness, sores that do not heal, testosterone deficiency and erectile dysfunction (impotence). But hey, no need to worry till you are over 45, right? WRONG. Diabetes is moving down the age ladder and even children are getting it now.

By now you may be around 50 and thinking "if I had known I would live this long I would have looked after myself better".

They say the road to a man's heart is through his stomach. Heart problems and diabetes follow this road, carved by your expanding stomach or, worse, hidden fat round your internal organs. Some risk factors cannot be controlled, but tackling others, like smoking, exercise and diet, can reduce your risk.

Your upper extremities also suffer, though not from lack of exercise. Repetitive strain injury is likely. Unless you consciously compensate, activities like texting, using a mouse and typing can cause you to tense your shoulders and upper arms, which reduces circulation to the forearm just when the thumb and fingers need more blood. Typing all the time increases the risk of arthritis.

If you don't set up your workspace properly you risk back, spine and shoulder problems. Exercising can actually make matters worse if you do not follow a balanced workout, thereby creating muscular imbalances.

Poor posture, an easily adopted habit, can set you up for a host of musculoskeletal problems, indigestion and constipation (which can lead to colon cancer), and lung problems when your posture makes it harder to breathe.

You are also likely to have eye problems, and laser surgery to correct these can result in career-ending after effects.

We haven't even started on the mental problems yet.

Mental Problems

Long hours and permanent connectivity to your office email overstimulates the brain. So does thinking about coding problems. Lack of time for relaxation and exercise bumps up your stress, as if you do not have enough already. Chronic and excess stress can harm the immune and cardiovascular systems, and increase vulnerability to heart disease, depression, exhaustion, sleep deprivation and overall malaise. Undue stress can also trigger anxiety, with its own symptoms, including stomach pain, dizziness, muscle tension and headaches, decreased concentration, irritability and sexual problems. Extreme anxiety can even increase the risk of cardiovascular diseases, psychological problems, suicide and some cancers.

When stressed you get sleep problems, which again lead to a higher risk of diabetes, obesity, high blood pressure and other health problems. As a result you may suffer from fatigue which, like stress, leads to poor performance and bad decisions. This can impact your job and career, if not more important things.

Overstimulating the brain has other risks. At least one savant has trained so hard at mental problems that they made themselves autistic, and some of the thumbnail photos of contestants on competitive coding sites suggest signs of autism.

If you get diabetes you can look forward to a host of other mental problems, ranging from major depression through anxiety to bipolar disorder, including possible mental decline associated with some medications.

Stress can of course mean you perform less well, which means more risk of ending up on the streets. In the finance industry many high pressure workers take drugs to deal with the stress. DON'T. If you find you need alcohol to relax, slam on the brakes and see a professional.


Other than leaving IT, solutions to many of the physical problems are the same as for handling diabetes without medication: diet, exercise and weight control. Other physical problems can be combated or prevented by attention to posture and properly setting up your workspace.

The mental problems require stress management, maintaining a decent work life balance, getting a life away from the keyboard, and making time for sleep. Side projects are NOT a good idea. If you can, stick to your contracted hours. If not, find another job.

Take frequent breaks, drink loads of water (coffee in moderation) and try not to get stressed: it's only work. Take up a sport and train three times a week.

The Wrap

Desk work is not healthy. Computer related work is even worse. The pressure to work long hours, whether internal or external, leads to a lifestyle that can spawn a host of physical and mental problems. Constant vigilance is needed to prevent these problems, including reverting to a separation between work and leisure that some may feel is quaintly last century.

Managers should look after themselves first then their reports. Developers cannot assume managers will look after them.

Wider issues, such as the role of IT industry culture and the Protestant Work Ethic in entrenching the unhealthy lifestyle of IT workers, and more generally office workers, are left for future research.

Sunday, 20 December 2015

Java 8 Interfaces Lambda-expressions and Streams

My last workplace refused to move from Java 7 after experiments found that some code failed at runtime when the compiler version was raised. Now, as a relatively late adopter, I am able to look at some of the new features of Java 8. First impressions are that Java 8 is best considered a new language, or at the least a superset of Classical Java (the 21st Century version of Cobol), just as C++ is technically a superset of C. The changes could reduce the resulting code and class bloat. As usual the new features raise questions about the underlying implementation which need further research.

Interface changes can reduce the number of classes needed in an application; lambda expressions can reduce the number of lines of code, though the new syntax can be confusing at first; and streams provide support for parallelism, though only for stateless operations. The new language will leave many trapped in backward looking organisations floundering when they finally emerge.

Interfaces and Default Methods

A useful design/architecture pattern in Classical Java is the Interface-Abstract Class-Implementing Class pattern, where an abstract class implements an interface and holds default methods expected to be common to all implementations. The New Java makes abstract classes much less useful and I would expect them to be deprecated at some future point. As a result a typical implementation may need fewer classes, as abstract classes will not be needed. The eventual outcome should be simpler designs with less code to harbour bugs.

An interface can also hold static methods (which cannot be overridden in subclasses), which may allow utility functions to move into an interface where appropriate.

Here is an example

interface Greeter {
    public void saySomething();

    public default void sayHi() {
        System.out.println("Hi there");
    }
}

public class HelloWorld implements Greeter {
    public static void main(String[] args) {
        HelloWorld hello = new HelloWorld();
        hello.sayHi();
        hello.saySomething();
    }

    // saySomething() is not a default method so still needs to be implemented.
    @Override
    public void saySomething() {
        System.out.println("Say What?");
    }
}
Interface static methods are useful for providing utility methods and utility classes can be replaced with interfaces that contain static methods.

Functional interfaces, interfaces with exactly one abstract method, are new.

A new annotation, @FunctionalInterface, has been introduced to mark an interface as a functional interface and to prevent the accidental addition of further abstract methods. Functional interfaces can be instantiated with lambda expressions.
Default and static interface methods alone should simplify existing code bases dramatically.
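
A minimal sketch of this (the Transformer interface here is invented for illustration):

```java
// A hypothetical functional interface: exactly one abstract method, and the
// annotation makes the compiler reject any accidental second abstract method.
@FunctionalInterface
interface Transformer {
    String transform(String input);
}

public class FunctionalInterfaceDemo {
    public static void main(String[] args) {
        // A lambda expression instantiates the interface directly,
        // with no anonymous inner class boilerplate.
        Transformer shout = s -> s.toUpperCase() + "!";
        System.out.println(shout.transform("hello")); // prints HELLO!
    }
}
```
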
Lambda Expressions

One of the painful aspects of Classical Java is the sheer verbosity needed for Object Orientation. Java 8 allows the walls of text needed even to start a simple thread to be replaced by more compact, if initially cryptic, expressions.

Consider this

Runnable r1 = () -> System.out.println("My Runnable");

Equivalent to

Runnable r = new Runnable() {
    public void run() {
        System.out.println("My Runnable");
    }
};
What the lambda expression is doing is instantiating a new Runnable with a run() method that just prints out a message.
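
To see the saving in context, a small sketch that actually starts a thread with the lambda form:

```java
public class ThreadDemo {
    public static void main(String[] args) throws InterruptedException {
        // One line replaces the anonymous Runnable class above.
        Thread t = new Thread(() -> System.out.println("My Runnable"));
        t.start();
        t.join(); // wait for the thread to finish
    }
}
```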

Using Lambdas to inject behaviour

It is now possible to inject behaviours into a method using predicates, for example:

private static Stream<String> extract(List<String> words, Predicate<String> predicate) {
    return words.parallelStream().filter(predicate);
}

Predicate<String> catpred = index -> index.startsWith("cat");

Stream<String> catstream = extract(words, catpred);

And one can be even lazier with method references:

private static boolean isdog(String target) {
    return target.startsWith("dog");
}

Stream<String> dogstream = extract(words, BehaviourInjector::isdog);

This feature, as just one example, renders the Classical Java machinery for custom sorting pretty much irrelevant. For example, as in [2], given a class Person with fields name and age, we can sort a list of persons by name with a single lambda:

persons.sort((pers1, pers2) -> pers1.getName().compareTo(pers2.getName()));

Here the lambda is an inline Comparator, supplied without a named class. (The official Oracle docs still require a comparator for custom sorting at the time of writing.)
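
A runnable sketch of this, with a minimal stand-in for the Person class from [2]; Comparator.comparing builds the same comparator from a method reference:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class SortDemo {
    // Minimal stand-in for the Person class in the cited example.
    static class Person {
        private final String name;
        Person(String name) { this.name = name; }
        String getName() { return name; }
    }

    public static void main(String[] args) {
        List<Person> persons = new ArrayList<>();
        persons.add(new Person("Carol"));
        persons.add(new Person("Alice"));
        persons.add(new Person("Bob"));

        // Equivalent to (p1, p2) -> p1.getName().compareTo(p2.getName())
        persons.sort(Comparator.comparing(Person::getName));

        System.out.println(persons.get(0).getName()); // prints Alice
    }
}
```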


Streams

Streams bear some resemblance to Pyspark RDDs (Resilient Distributed Datasets), which represent "an immutable, partitioned collection of elements that can be operated on in parallel." The difference seems to be that a Java 8 Stream is computed on demand by operating on a specified data source, whereas an RDD seems to exist independently, though the Pyspark documentation is not too clear on this.

Streams are created on demand and produce a pipeline of data from a specified source; this pipeline can be operated on sequentially or in parallel, with, according to Oracle, the parallelism being largely transparent to the user. A consequence of this design is that streams cannot be reused.
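
A minimal sketch of that transparency: switching a pipeline from sequential to parallel is one method call, and for an order-independent operation like summing the result is identical.

```java
import java.util.Arrays;
import java.util.List;

public class ParallelSumDemo {
    public static void main(String[] args) {
        List<Integer> nums = Arrays.asList(1, 2, 3, 4, 5);

        // Sequential pipeline.
        int seq = nums.stream().mapToInt(Integer::intValue).sum();

        // Parallel pipeline: the only change is parallelStream().
        int par = nums.parallelStream().mapToInt(Integer::intValue).sum();

        System.out.println(seq + " " + par); // prints 15 15
    }
}
```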

There are limitations to streams, for example:

  • The operations on a stream must not depend on the order in which they are performed
  • Streams, once consumed, may not be reused
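
The second limitation can be demonstrated directly: a stream that has had a terminal operation applied throws IllegalStateException if touched again.

```java
import java.util.stream.Stream;

public class StreamReuseDemo {
    public static void main(String[] args) {
        Stream<String> s = Stream.of("a", "b", "c");
        System.out.println(s.count()); // prints 3; count() is a terminal operation

        try {
            s.count(); // second terminal operation on the same stream
        } catch (IllegalStateException e) {
            System.out.println("stream cannot be reused");
        }
    }
}
```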

Classical Java often feels like this
Wrapping up

Java 8 was launched as a major update to Java. In many ways the functional programming aspects introduced make it a new language. The changes to interfaces could, properly used, reduce the number of classes in a typical application, lambda expressions will reduce the number of lines of code and improve readability, and streams go a long way towards providing transparent support for parallelism, at least on single multicore machines.

It seems unlikely that Java 8 will render large scale parallel programming frameworks such as Hadoop and (Py)Spark redundant, since the use cases seem different; nor does it seem likely that New Java (Java 8 and beyond) will do anything to dent the popularity of languages such as Python, which also supports both functional and object oriented programming but requires much less boilerplate code.

Time will tell but one pleasant possibility is that Java will evolve into a functional language with residual support for Object Orientation. If that happens it will probably be when another programming paradigm pops up and becomes popular.

A more likely scenario is that various bits of cruft will build up, and intellectual fashions plus the standard developer intellectual arrogance (I have to admit to having been guilty of this in the past, and plead immaturity), the desire to show off (ditto), the love of developing complex solutions where simple ones will suffice [4] and the tendency to adopt technological fashions uncritically will lead to a situation where the simplifications Java 8 has brought are lost and another simplifying paradigm is needed.

  1. In all fairness the first solution to a problem is usually too complex, and refactoring simplifies it while improving non-functional aspects, but the current desire to shorten time to market makes time for such refactoring hard to find. Developers are not to blame for everything.