Professional software development between science, sound experience and beliefs



Is Computer Science a science that can currently support day-to-day decisions in software development?

Is sound experience enough to make decisions in software development?








This is what I discovered from the reading list about the current state of:
- Computer Science, research and academia
- Professional Software Development and the IT industry

followed by my
- Conclusions

Professional software development between science, sound experience and beliefs: Science



Software engineering research and Computer Science research are currently at an immature stage of understanding:

  • there is no first principle or universal theory that helps decide beyond doubt what is right or wrong, what works and what does not.

  • still today, many of the ideas from which the most important advancements in Computer Science originate remain unproven conjectures, e.g. the Church–Turing thesis, the P ≠ NP conjecture, or Brewer's conjecture.

 
People, and the high variability introduced by the human factor, are heavily underestimated:

  • The effectiveness of many software technologies depends heavily on the people who are using them [2].

  • When researchers investigate programmer behavior, a messy area involving human factors, their methods must become ever more sophisticated. Above all, we must be aware of how much we can learn and where we must accept limits to our knowledge [2].


Substantial differences among projects limit the possibility of generalizing experimental results:

  • Evidence has a context-specific component. On certain topics, there is little evidence that results from one project have been, or can be, generalized to others [2].

  • Many experiments do not remotely resemble professional software production as practiced in the IT industry: toy problems are used in artificial environments with inexperienced programmers; see for example the Basili and Selby study [2]. Exceptions exist, but they are rare [4][5].

In conclusion, software engineering research is not mature enough for practice:

  • David Budgen and Barbara Kitchenham ask: “Is evidence-based software engineering mature enough for practice and policy?” Their answer is “no, not yet”: the software engineering field needs to significantly restructure itself before we can show that results can be replicated in different projects [2][6].

  • To date, we have not realized our dream of evidence that is elegant, statistically sound, and replicable [2].

Professional software development between science, sound experience and beliefs: IT Industry



On one hand, the IT industry is doing a good job of documenting lessons learned and experience accumulated in the field; see for example [7] and [8]. On the other hand, many lessons are forgotten because each generation starts over, looking for solutions that speak directly to its unique time.

In the day-to-day work, IT professionals are left alone to deal with complex projects and with many unknowns. They can count only on their own judgment, and it is up to them to verify what works and what does not, in every single specific case.

When there is a doubt or a disagreement, there are no scientific proofs of what these or those books and articles advocate. And in professional software development, popularity, reputation and authority do not constitute valid proofs.

Nonetheless, statistics about the success rates of IT projects suggest that things are improving.

The following, instead, are examples of things that do not go well:

  • At industry-oriented gatherings, sometimes it seems that a strong opinion, a loud voice, and a couple of pints of beer constitute a "proof" for many claims [1].

  • Case studies and success cases in the IT industry are really anecdotes. They are an effective means of sharing stories and experiences, but they lack the qualities required to constitute proof.

  • Even in situations where not knowing is an essential part of reality, well-known and respected thought leaders in some professional communities dispense answers authoritatively and dogmatically.

  • Some certification organizations base their body of knowledge on anecdotes, manage it in a closed, unscientific way, and present their theses as valid, indisputable truth.

  • Some case studies and success cases are made up for marketing purposes.


This means that in the IT industry the boundary between
  • solid experience gained working in the field or learned from peers, and
  • popular beliefs and fads
is very thin and unclear.

Professional software development between science, sound experience and beliefs: conclusions


In conclusion,

    nowadays Professional Software Development, compared for example with Medicine, is still in its early Middle Ages.

    Computer Science, academic Software Engineering and the IT industry provide some answers. But beyond science and sound experience there remains a large area of unknowns and uncertainties.
In the presence of unknowns and uncertainties, many rely on current trends, which can easily be confused with fads, popular beliefs and superstitions. Others rely on popular experts, whose reliability is judged by their popularity, reputation or authority. Others rely on personal intuition.


Without a universal theory, with lessons learned from experience that cannot be generalized, and with a lack of proofs, the answers we have to many questions from the day-to-day work are not always correct. Some are based on wrong assumptions, some have been over-generalized, and some have not been properly verified.

On one side there are the IT industry and practitioners; on the other, researchers and academics. There is a near-complete disconnect between the two [1], and it doesn't help.


Socrates:
It is likely that neither of us knows anything worthwhile, but he thinks he knows something when he does not, whereas when I do not know, neither do I think I know; so I am likely to be wiser than he to this small extent, that I do not think I know what I do not know.


Professional software development between science, sound experience and beliefs: reading list



The following is the reading list:

[1] Two Solitudes Illustrated, Greg Wilson, Jorge Aranda, 2012
[2] Making Software (What really works, and why we believe it), Chapter 1 The Quest for Convincing Evidence, Tim Menzies, Forrest Shull, 2011
[3] Computer Science Is Not a Science (Communications of the ACM, Vol. 56 No. 1, Pages 8-9)
[4] The Impact of Irrelevant and Misleading Information on Software Development Effort Estimates: A Randomized Controlled Field Experiment, Magne Jørgensen, Stein Grimstad, 2009
[5] Variability and Reproducibility in Software Engineering: A Study of Four Companies that Developed the Same System, Bente C.D. Anda, Dag I.K. Sjøberg, Audris Mockus, 2009
[6] Is Evidence Based Software Engineering Mature Enough for Practice & Policy?, Budgen, Kitchenham, Brereton, 2009
[7] Facts and Fallacies of Software Engineering, Robert Glass, 2002
[8] Revisiting The Facts and Fallacies of Software Engineering, Jeff Atwood, 2008