
Sheet1

Research Database

My Index | Article Title | Principal Authors | Date
1 | Introducing Agile Development into Bioinformatics | David Kane | 2003
2 | Systems Development Methodologies: Unanswered Questions | Judy Wynekoop | 1995
3 | The fiction of methodological development: a field study of information systems development | Joe Nandhakumar | 1999
4 | How do Scientists Develop and Use Scientific Software | Hannay | Unk
5 | Managing Chaos: Lessons Learned Developing Software in the Life Sciences | Sarah Killcoyne, John Boyle | 2009
6 | A Survey of Software Tools for Computational Science | Worth, DJ; Greenough, C | 2006
7 | No Silver Bullet | Brooks, F | Unk

8 | The fetish of technique: methodology as a social defence | David Wastell | 1996
9 | Exploring XP for Scientific Research | William A Wood, William L Kleb | 2003
10 | Where's the real bottleneck in scientific computing | Greg Wilson | Unk
11 | Agile Software Development: It's about Feedback and Change | Laurie Williams | 2003
12 | Computational Science and Engineering Software Development Best Practice | Worth, DJ; Greenough, C | 2006
13 | Antedisciplinary Science | S Eddy | Unk
14 | Matching Methodology to Problem Domain | Robert L Glass | 2004
15 | Some Problems of Professional End User Developers | Judith Segal | 2007
16 | Two principles of end user software engineering research | Judith Segal | 2005


17 | When software engineers met research scientists: a case study | Judith Segal | 2005
18 | Understanding the HPC Computing Community | Basili | Unk
19 | A Software Chasm | Kelly, D | Unk
20 | Scientific Software Development is not an Oxymoron | Baxter, SM | 2006
21 | Top-down standards will not serve systems biology | John Quackenbush | 2006
22 | The case for Open Source software in drug discovery | DeLano, W | 2005


23 | An empirical investigation into the adoption of systems development methodologies | Brian Fitzgerald | 1998
24 | Participatory Programming | Catherine Letondal |
25 | Guest Editors' Introduction: Developing Scientific Software | J Segal, Chris Morris | 2008
26 | Software Engineering and Computational Science | Greg Wilson | 2009
27 | Advances in software development methodology may be a silver bullet | Unk | Unk
28 | Models of Scientific Software Development | Judith Segal | 2004
29a | Point: Teach end users the 3 Rs + SE | Janice Singer, Mark Vigder | 2009
29b | Software Engineers don't know everything about end user programming | Judith Segal, Steve Clark | 2009
30 | Problems versus solutions: The role of the application domain in software | Iris Vessey | Unk
31 | Requirements Engineering | Sarah Thew | 2009


Orphan

32 | Software Development Cultures and cooperation problems: a field study of the early stages of development of software for the scientific community | Segal, J | 2009
33 | Software Design for Empowering Scientists | David De Roure | 2009


34 | Scientific Computing Productivity Gridlock: how s/w engineering can help | Stuart Faulk | 2009


Main Themes
A good write-up on how a special team was integrated into a scientific software
development environment. It was tasked with incrementally introducing
standards (agile) into the group. Particularly mentions dynamic requirements.
Discusses methodology selection and adaptation.

Describes the general waterfall. Quotes Wastell, using the term “irrational ritual”, to
create a feeling of security. Concludes that the development process consists of intervention,
bricolage, improvisation etc. as much as management processes.

Scientists get knowledge from self study and from their peers rather than from
formal education. They believe testing is important, but don't believe they have a good
understanding of testing concepts. Mentions the chasm, with a bunch of refs.
Suggests that the field should be broken down to study different types of science.
Few publications in engineering specifically address scientific software
development.

Isolated groups, collaborate loosely.


Manic foraging exercise.
System works, and works well.
Highly specialized software
Funding wants standards for reuse. Too costly.
Research Informatics Team founded.
Processes tailored to life sciences.
Skills gaps, not formal training.
Poor specs
No project vision, creep
Too many roles
Managing complexity, standards
No underlying mathematical theory in life sciences (harder than physics to model).
Success is perceived differently
Rift/chasm
Discovery-oriented, project-focused, publish-led culture.
Software engineering work in teams
Particular skills
Extend team knowledge
Researchers work in isolation
Scientific developers rejected project management
More like a band than a team
Attitudes: s/w dev is not real work
Not highly regarded
Engineers educate scientists
Developers resistant to new ways.
Developer struggles with dev process steps.
View only extends as far as specific projects' life cycles
Good introduction to the software life cycle and its various models. Waterfall. Talks about
how the process is imposed on the project.
The essential difficulty of writing software, versus the accidental difficulty.


Shows how a methodology can lead to a rigid and mechanical approach. Rituals
that stifled innovation. Inhibited creative thinking. It may operate as a social
defence. It may operate as an irrational ritual. Gives a feeling of security and
efficiency at a cost to real engagement with the task at hand.
Another example of how a method (XP) fits into the scientific process. In particular,
it contains an example of how the process was altered where it didn't seem to fit
very well.
See project proposal

Not read.

Advice from engineers within a large team dedicated to scientific software
development. Describes where funding pressure for better methods comes from.
Suggests a mix and match approach. Pragmatism. Also goes into salvaging legacy
material. Suggests a breakdown along the lines of software type (product, project,
prototype), maturity (legacy, mature, new) and life cycle elements.

Describes reluctance of scientists to adopt standards. Suggests ways to gain
trust and support. This is ANOTHER amelioration method.

New methods must be seen to have value by the developers.

A coruscating account of how he fears interdisciplinary science. He calls for multi-skilled
scientists, who blaze a trail and create new disciplines. He uses the phrase
“expecting a team of disciplinary scientists to develop a new discipline is like
sending a team of monolingual blah blah blah....”
Excellent paper calling for a taxonomy of methods, another of scientific
development domains, and a mapping between them.
The latest won't do. Universal methods are weak. Swap between sabbaticals in
academia and sojourns in industry.
We don't need any new methodologies. Also mentions the chasm (I think).
Scientists get knowledge from self study and from their peers rather than from
formal education. They believe testing is important, but don't believe they have a good
understanding of testing concepts. Mentions the chasm, with a bunch of refs.
Suggests that the field should be broken down to study different types of science.
Few publications in engineering specifically address scientific software
development.
Tries to define end-user developers. Mentions how they are quite clever and can
master the programming themselves. Shows that they directly use the software to
achieve their goals, and are therefore not concerned with how the material appears.
Tells us first that not all end users are the same, and second that research should
use field studies. It's a bit confused, but you could use it as evidence in the
research proposal.


Describes some pitfalls of the waterfall (document led) development approach, and
how the chasm cropped up between a bunch of scientists and engineers. It also justifies
the use of qualitative research, in very good terms. Contains good tips about the
research methods. Some good further reading is given as well. The tips are: 11
people, 4 different groups, published docs (citation given), follow-up calls/emails,
check back to see that they ring true.

The paper goes over a waterfall methodology, and shows how it failed. Could be
good work to include, because it gives a good case study of how the w-fall method
failed in a (supposedly) integrated team, and shows the faults of using paper to
bridge the gap – it basically doesn't work. Also talks about tailoring the method to
the job, gives a discussion on agile, and posits that it might be better, but still
would not be perfect. So she shows how agile and w-fall could be combined. All in
all, a pretty good paper.

Scientists make decisions based on maximising scientific output, not program performance.
Debugging and validation are qualitatively different than for traditional software development.
A new technology that can co-exist with older ones has a greater chance of success.
More study is required to see why OO has seen little adoption.
Scientists have yet to be convinced that reusing existing frameworks will save effort.
Without more support, IDEs are unlikely to be adopted.
Scientists on large projects see the value of an architectural infrastructure, and are
more disposed to build their own.

Scientists get knowledge from self study and from their peers rather than from
formal education. They believe testing is important, but don't believe they have a good
understanding of testing concepts. Mentions the chasm, with a bunch of refs.
Suggests that the field should be broken down to study different types of science.
Few publications in engineering specifically address scientific software
development.
A small group of scientists and engineers in an integrated team lay out some simple
guidelines about how things should be done, i.e. their ideas on best practice.
Good examples of how amateurs get bogged down.
Speaks against strong, or top-down, standards. The assumption that standards are
needed because biologists are not trained is false. The reality is that developing
professional software systems is not the priority. New research means that
standards may not be appropriate. Most early efforts are poorly documented,
but the best rose to the top and became robust and well documented. An example of a
failed CORBA project is given, where formatted files were used instead.
Engineering ahead of time fails. Argues for a democratic, community based
approach and research driven development efforts. Don't make standards before the field
has matured. It stifles innovation.
Goes over the compelling case for open source. Many advantages listed, and a few
disadvantages. Says the method is here to stay.


Good acronym, SRD. Says early methods were unsystematic. Gives plenty of
reasons for using methods (the usual suspects), but also gives a few reasons why
not. E.g. a 1000 exist (!); doesn't match (sys dev is not orderly); slavish – lose
sight of goal; not universal; what about creativity and intuition; moving towards
configuration, not dev as such. Used a questionnaire (20 returned). Concludes that
methods are not on the rise, are used as a framework, and developers understand their
limited contribution.

Reason why sci soft is different (output is not obvious), must have a domain expert.
Software changes as domain expands. Scientists have their own process model. A
clash exists between it, and engineers models.

The paper has three themes. One is risk management. Another is war stories. Find
the actual magazine, as it has the case studies. Final theme was guidelines.
Computing is the third pillar. Mismatch of culture. Get more from this paper. Get
related articles.
The case that methodologies are winning out.

Describes JS's scientific process model. Basically, it's the same as mine. Read this
again.
Make scientists think more like software engineers: Collaborative Development,
Version Control, Testing and Debugging, Maintainability.
Argues for maintaining the “scientific process model”, but improving tools to make it
work better. Also, some degree of latitude – take SW Eng where it makes sense.

Focus has been on application independent approaches. Mentions strong solutions
(targeted) and weak ones. Uses a wrench metaphor. Advocates using an approach
based on the type of problem. How much more progress is possible without taking
the type of application into account? Application dependent focus. Mentions
FORTRAN and COBOL, as I do. The failure of PL/1 (general). In two minds about
specific versus general. Mentions Parnas: one of the sicknesses of the computer
science field is that we try to solve the general problem before learning how to best
solve specific ones. Narrow research, expanding communities. Glass, Vessey,
application domain taxonomies. Schmidt refers to frameworks that are specific to
the problem type. Specific patterns emerging.

Reuse via domain emerging – make reusable components. Ref to Second Annual
Conference on Software Methods, which called for a taxonomy of methods, e.g.
application-domain-focused taxonomies. Group into areas that lend themselves to similar
solutions. Will it be driven by practice or theory?

Basically, a continuation along the same lines as Glass. We might need to make it
application specific.

Gives a nod to Dynamic Requirements, continuous dialogue, scientific process
model. Then describes the standard UCD process. I reckon UCD makes engineers
more like scientists.


Scientists get knowledge from self study and from their peers rather than from
formal education. They believe testing is important, but don't believe they have a good
understanding of testing concepts. Mentions the chasm, with a bunch of refs.
Suggests that the field should be broken down to study different types of science.
Few publications in engineering specifically address scientific software
development.
Specific to an interdisciplinary team. Talks of cooperation and culture problems.
Useful definition of culture. Life sciences. Immediate development. Validates the
pervasive, informal scientific process model, e.g. cursory testing, only valuable if it
supports science. Domain skills, not s/w skills, are valued. Domain is king. Neither
establishment of reqs nor testing is major. Warns that scientists may think their own
model is ubiquitous. The science process used is similar to simulated annealing.

Needs to be systematised, systematic data collection. Problems are ambiguous
terminology. Melding takes time. Uses a qualitative study. 10 interviews, a good
selection. Use of semantic units and themes. Like my mini taxonomy. Problems due
to influence of PEUD culture. Domain users must interact with the dev process (ref to
that effect). Not trivial. Big problem: team not able to do this. User negotiation is a
big deal (e.g. Neil Blue). Data driven UI not easy to use. Mandatory cmds. Breakdown
due to the chasm. Within the dev team, a community of practice exists. Things are
smooth. Shock horror when the sci team discover strict rules about the project – they didn't
read the paperwork! Confusion about implementation authority. Confrontations.
Resentment. Not an integrated team. Big problem: lack of take up. Solutions: lack
of domain knowledge – a mediator was put in to bridge the gap (a scientific
representative). Not enough time, though. Users were involved. Not enough to
make a difference, though. Perhaps this should be added to the amelioration
measures? Code camp.... Another chasm cause: systems enable progress, but do
not directly produce it. Thus, they suffer. Gelling takes a long time.

The article finishes with an unusual reflection on cultural issues, related to problems
building an integrated team, and various solutions for that.

First tie-in I've seen between apparatus and software. Mentions “in silico”,
reinforcing the parallel. Example of divide and conquer. Engineer does workflow
(Taverna), scientist does model. Also, uses Pipeline Pilot. Mentions automation.
Mentions repeatability. Article is mainly about Taverna – a workflow manager with
some ease of use features. Scientists are concerned with getting results rapidly.
But: they want reliability without the bother of learning a generic solution – they are
likely to invent their own. Engineers, though, want the best solution, … Another
constraint on success: scientists would rather share their toothbrush than their data
(Mike Ashburner). What will the software do for me? Content, relevance, not clever
or innovative (this will do to validate a claim I made in TMA1). Principles: fit in,
don't change. Jam today and more jam tomorrow. Incremental development. Act
locally, think globally. Build the system for local scientists, in a way you can scale it
up. Build trust between users and developers. A good team gelling quote. 1) Keep
friends close. 2) Embed. 3) Know your users. 4) Expect and anticipate change.


Stuff to mention: empirical methods to measure. Mentions chasm. Says the gap is grounded in
diverse values and constraints (culture) of sci and s/w eng. Sci seceded decades ago.
Triangulation to ensure research can be trusted. Use “phenomena under investigation”.
Machines improve but it is hard to get things done. DARPA challenge – improve productivity.
S/W process problems. Sci has its own approach – bottleneck. Due to interdisciplinary
experts. Handcraft code. Fundamental changes required. Cooperation needed,
interdisciplinary. Human/org issues. Skill problems: they need domain skills and
programming. Need algorithm tuning skills. Not enough people with the full skill set
(suggests divide and conquer, or twin programming etc.). Apprenticeship (suggests
prof/student approach). Work is manual. Suggests automation (divide and conquer). There
is a lack of tools (divide and conquer). How to trust the output (make scientists more like
software engineers). S/W eng have development methods, but they are unknown to scientists.
Mentions open source as a solution. Excellent desert island analogy – Hm... where are
comms in my list: interdisciplinary team? Lack of global thinking – desert island.
Interdisciplinary team? Contrasts styles. Codes are expensive, last years; it's about science;
performance matters; h/w changes; Fortran and C++. Sci values perf, h/w cost, portability,
not s/w eng. S/W eng values good code, robust language, high abstraction; sci doesn't. S/W
engineers must be prepared to make the case for technology using the sci frame of ref. Make
engineers more like scientists. Mentions these amelioration techniques: divide and
conquer, automation, abstraction. Also, did up some old software (back filling). Improved
many things, but performance suffered. Culture change required (make sci more like
software engineers).

