Research Database
Main Themes
A good write-up on how a special team was integrated into a scientific software
development environment. The team was tasked with incrementally introducing
(agile) standards into the group. Particularly mentions dynamic requirements,
and discusses methodology selection and adaptation.
Describes the general waterfall model. Quotes Wastell, who uses the term
“irrational ritual” for practices that create a feeling of security. Concludes that the
development process consists of intervention, bricolage, improvisation, etc., as
much as of management processes.
Scientists get knowledge from self-study and from their peers rather than from
formal education. They believe testing is important, but don't believe they
understand testing concepts. Mentions the chasm, with a bunch of refs.
Suggests that the field should be broken down to study different types of science.
Few publications in engineering specifically address scientific software
development.
Shows how a methodology can lead to a rigid and mechanical approach: rituals
that stifled innovation and inhibited creative thinking. It may operate as a social
defence, or as an irrational ritual, giving a feeling of security and
efficiency at the cost of real engagement with the task at hand.
Another example of how a method (XP) fits into the scientific process. In particular,
it contains an example of how the process was altered where it didn't seem to fit
very well.
See project proposal. Not read.
Describes some pitfalls of the waterfall (document-led) development approach, and
how the chasm cropped up among a group of scientists and engineers. It also
justifies the use of qualitative research, in very good terms. Contains good tips
about the research methods, and some good further reading. The tips: 11
people, 4 different groups, published docs (citation given), follow-up calls/emails,
check back to see that the findings ring true.
The paper goes over a waterfall methodology and shows how it failed. Could be
good work to include, because it gives a good case study of how waterfall methods
failed in a (supposedly) integrated team, and shows the faults of using paper to
bridge the gap – it basically doesn't work. Also talks about tailoring the method to
the job, gives a discussion of agile, and posits that it might be better, but still
would not be perfect. So she shows how agile and waterfall could be combined. All in
all, a pretty good paper.
A small group of scientists and engineers in an integrated team lay out some simple
guidelines about how things should be done, i.e. their ideas on best practices.
Good examples of how amateurs get bogged down.
Speaks against strong, or top-down, standards. The assumption that standards are
needed because biologists are not trained is false; the reality is that developing
professional software systems is not the priority. New research means that
standards may not be appropriate for them. Most early efforts are poorly documented,
but the best rose to the top and became robust and well documented. An example of a
failed CORBA project is given, where formatted files were used instead.
Engineering ahead of time fails. Argues for a democratic, community-based
approach and research-driven development efforts. Don't make standards before the
field has matured – it stifles innovation.
Goes over the compelling case for open source. Many advantages listed, and a few
disadvantages. Says the method is here to stay.
Good acronym, SRD. Says early methods were unsystematic. Gives plenty of
reasons for using methods (the usual suspects), but also a few reasons why
not: e.g. a thousand exist (!); they don't match reality (systems development is not
orderly); slavish use loses sight of the goal; they are not universal; what about
creativity and intuition? Moving towards configuration, not development as such.
Used a questionnaire (20 returned). Concludes that methods are not on the rise;
they are used as a framework, and developers understand their limited contribution.
Reasons why scientific software is different (the output is not obvious; you must
have a domain expert). Software changes as the domain expands. Scientists have
their own process model, and a clash exists between it and engineers' models.
The paper has three themes: risk management, war stories, and guidelines. Find
the actual magazine, as it has the case studies. Computing is the third pillar.
Mismatch of cultures. Get more from this paper, and get related articles.
The case that methodologies are winning out.
Describes JS's scientific process model. Basically, it's the same as mine. Read this
again.
Make scientists think more like software engineers: collaborative development,
version control, testing and debugging, maintainability.
Argues for maintaining the “scientific process model”, but improving tools to make it
work better. Also allows some degree of latitude – take SW engineering where it
makes sense.
Reuse via domain engineering – make reusable components. Ref to the Second
Annual Conference on Software Methods, which called for a taxonomy of methods,
e.g. application-domain-focused taxonomies: group into areas that lend themselves
to similar solutions. Will it be driven by practice or theory?
Basically, a continuation along the same lines as Glass. We might need to make it
application specific.
Specific to an interdisciplinary team. Talks of cooperation and culture problems.
Useful definition of culture. Life sciences. Immediate development. Validates the
pervasive, informal scientific process model, e.g. cursory testing; software is only
valuable if it supports science. Domain skills, not SW skills, are valued – domain is
king. Neither establishment of requirements nor testing is major. Warns that
scientists may think their own model is ubiquitous. The science process used is
similar to simulated annealing. The article finishes with an unusual reflection on
cultural issues, related to problems building an integrated team, and various
solutions for that.
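The simulated-annealing comparison can be made concrete. As a rough sketch (the function names and the toy objective are mine, not from the article), the pattern is: mostly accept changes that improve results, but early on occasionally accept a worse state to keep exploring, then settle into refinement – much like a scientist evolving code toward a working model.

```python
import math
import random

def anneal(neighbour_fn, score_fn, start, steps=1000, temp=1.0, cooling=0.995):
    """Generic simulated annealing (maximising score_fn): always accept
    improvements, sometimes accept worse states while the 'temperature'
    is high, and explore less as it cools."""
    state, best = start, start
    for _ in range(steps):
        nxt = neighbour_fn(state)                  # try a small variation
        delta = score_fn(nxt) - score_fn(state)
        # worse moves are accepted with probability e^(delta/temp)
        if delta > 0 or random.random() < math.exp(delta / temp):
            state = nxt
        if score_fn(state) > score_fn(best):
            best = state
        temp *= cooling                            # cool: settle into refinement
    return best

# Toy use: find x maximising -(x - 3)^2, whose peak is at x = 3.
random.seed(1)
result = anneal(lambda x: x + random.uniform(-0.5, 0.5),
                lambda x: -(x - 3) ** 2,
                start=0.0)
print(result)  # should land near 3
```

The analogy, as the article seems to use it, is that neither the scientist nor the algorithm follows a preplanned path; both converge on a good-enough solution through guided trial and error.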
First tie-in I've seen between apparatus and software. Mentions “in silico”,
reinforcing the parallel. Example of divide and conquer: the engineer does the
workflow (Taverna), the scientist does the model. Also, they use Pipeline Pilot.
Mentions automation and repeatability. The article is mainly about Taverna – a
workflow manager with some ease-of-use features. Scientists are concerned with
getting results rapidly. But: they want reliability without the bother of learning a
generic solution – they are likely to invent their own. Engineers, though, want the
best solution. Another constraint on success: scientists would rather share their
toothbrush than their data (Mike Ashburner). What will the software do for me?
Content and relevance, not cleverness or innovation (this will do to validate a claim
I made in TMA1). Principles: fit in, don't change. Jam today and more jam
tomorrow. Incremental development. Act locally, think globally: build the system for
local scientists, in a way you can scale up. Build trust between users and
developers. A good team-gelling quote. 1) Keep friends close. 2) Embed. 3) Know
your users. 4) Expect and anticipate change.
Stuff to mention: empirical methods to measure. Mentions the chasm. Says the gap is
grounded in the diverse values and constraints (culture) of sci and s/w eng. Sci seceded
decades ago. Triangulation to ensure research can be trusted. Use “phenomena under
investigation”. Machines improve but it is hard to get things done. DARPA challenge –
improve productivity. S/W process problems. Sci has its own approach – a bottleneck,
due to interdisciplinary experts. Handcrafted code. Fundamental changes required.
Interdisciplinary cooperation needed. Human/org issues. Skill problems: they need
domain skills and programming, plus algorithm-tuning skills. Not enough people with the
full skill set (suggests divide and conquer, or twin programming, etc.). Apprenticeship
(suggests a prof/student approach). Work is manual; suggests automation (divide and
conquer). There is a lack of tools (divide and conquer). How to trust the output (make
scientists more like software engineers)? S/W engineers have development methods, but
they are unknown to scientists. Mentions open source as a solution. Excellent desert
island analogy – hm... where are comms in my list: interdisciplinary team? Lack of global
thinking – desert island. Interdisciplinary team? Contrasts styles. Codes are expensive
and last for years; it's about science; performance matters; h/w changes; Fortran and
C++. Sci values performance, h/w cost, and portability, not s/w eng. S/W eng values
good code, robust languages, and high abstraction; sci doesn't. S/W engineers must be
prepared to make the case for technology using the sci frame of reference. Make
engineers more like scientists. Mentions these amelioration techniques: divide and
conquer, automation, abstraction. Also, did up some old software (backfilling): improved
many good things, but performance suffered. Culture change required (make sci more
like software engineers).