Holiday Greetings from Ivar Jacobson

December 22nd, 2011

Season's Greetings

Year after year I feel fortunate that we as a company can report so many new technology advances – advances which help our customers build better software, faster and with happier users and clients.  A number of new practices have been developed to help our customers scale agile beyond small teams to whole organizations.  Our practices for scaling agile and for setting up coaching communities, applied by our experienced mentors, have helped many organizations not just to become successful with small projects but to change in their entirety.  Several case studies confirm these successes.

Most important in 2011 is our release of Use-Case 2.0.  Having refined the way we work with agile use cases over many years, we have now put all our experience together in an ebook.  Our customers around the world – in China, the UK, Sweden and the US – have already used this approach, and it is now streamlined close to perfection.  Use-Case 2.0 supports backlog-driven development with Scrum or Kanban, making it as lean as it can be.  Many of our colleagues and friends are awaiting this new release, ready to start using it.

Lean development has been a long-term goal for our company.  Our approach – starting lean with a kernel of the most essential elements in software engineering and staying lean as more detailed practices are added on top of the kernel – is gaining wide acceptance around the world.  During 2012 we will see standards being adopted, the academic world embracing the approach, new tools becoming available, the industry gaining governance of its ways of working, and developers systematically growing their competence in software development.  Most importantly, success stories will speak for themselves: agile and lean will be achievable targets for every organization developing software.

Welcome to the future, but don't forget to enjoy the holidays.
--Ivar Jacobson


“Is code engineering an art or a science?”

October 17th, 2011

I received this question in an interview with SDTimes.  Although answering it is like walking through a minefield, I was so intrigued by the question that I accepted the challenge.

Code engineering is an interesting term that I have not come across before.  It has a built-in conflict, because it implies that coding is an engineering activity – and if it is, then it can be neither an art nor a science, since neither of those is an engineering activity.  Thus, the question is broken.

It would be easier to answer the question "is coding an art or a science?"  I think it is neither.  It is more like a craft.  However, when do you really need craftsmanship when writing code?

When you implement interesting new algorithms, you really need excellent coding skills.  However, the total amount of code of this nature is small.  I dare say that 80-90% or more of the code written for applications in banking, telecom, defense, etc. is just a plain implementation of relatively simple features.  The coding job is quite straightforward, and most of it doesn't require great craftsmanship.  Understanding the business processes (or how you can change them), the users' needs, the architectural principles and values, and the available solutions in the form of prefabricated packages or frameworks is then as critical as having good coding skills.

Now, is there a case where someone could justify talking about code engineering?  Yes, if everything you did were about coding.  When you think about new business processes, you express them as code.  When you think about what the software should do (requirements), you express it as code.  When you think about the architecture and design, you express it as code.  Then, of course, code engineering would make perfect sense.  If, on the contrary, you think of code engineering as a discipline alongside the other engineering activities (requirements, architecture, design, etc.), then you would sub-optimize the work you have to do while coding, and that would be a serious mistake – not smart!

Follow the conversation with me on Twitter.


Liberating the Essence from the Burden of the Whole: A Renaissance in Lean Thinking

October 12th, 2011

"In every block of marble I see a statue as plain as though it stood before me, shaped and perfect in attitude and action. I have only to hew away the rough walls that imprison the lovely apparition to reveal it to the other eyes as mine see it."—Michelangelo.

So how can a 500 year old quotation be relevant to business today? Well, you’ll be surprised.

The concept of a “kernel” representing the essence of something is a good starting point for most things you build.  You start lean and you stay lean throughout the life of what you build.  The idea of a kernel has many practical applications in today’s business:

1) designing an agile business,

2) building products using agile techniques,

3)  re-engineering your method or way of working.

Proven in many practical situations, the kernel concept provides the ability to scale up the use of agile approaches whilst maintaining control and visibility. It is now being considered for adoption by standards bodies such as the Object Management Group to enable lightweight, usable, agile approaches to knowledge management.

The justification for the title is that adopting a kernel approach focuses on the essentials and naturally leads to lean processes and systems. The renaissance is that, instead of removing waste to get lean, we start lean and stay lean.

I presented a keynote with the same title as this blog at the Agile Business Conference in London on October 5.  You can download the presentation here.

Follow the conversation with me on Twitter.


Use-Case 2.0: Scaling Up, Scaling Out, Scaling In for Agile Projects

October 11th, 2011

Use-cases continue to be a popular way of working for both business and system requirements.  Googling "use-case" yields a search volume six times greater than Googling "user story", but software development should not be driven by popularity. Instead we should use the most practical way of working, one that allows us to continuously improve.  Over the years we have learnt how to be truly successful with use cases, and of course we have learnt something from other techniques, such as user stories and aspect-orientation, which have inspired us to make use-cases even better while maintaining their core values.

Distinctive features of the new use cases are:

  • As agile and light as you want it to be
  • Scaling up, scaling out and scaling in - to meet your needs
  • It’s not just about requirements - it's for the whole lifecycle
  • It's also for non-functional requirements
  • It’s not just for software development - it's for business as well

The presentation at the IIBA conference was very well received by the roughly 400 participants.  Afterwards IJI staff were busy explaining the new use cases and how to find slices of use cases to populate the backlog for Scrum or Kanban style projects. Participants signed up for the new eBook on Use-Case 2.0, to be released shortly.

You can register to receive the PowerPoint presentation here.

Follow the conversation with me on Twitter.


Semat – what is happening?

September 19th, 2011

I would like to draw your attention to three recent blog entries: http://sematblog.wordpress.com/

1) "You are a developer - what is in Semat for you".

2) "Agile in everything".  One of the underlying principles of Semat is that working with methods needs to be agile (not just the methods themselves, but the way we work with them).  This implies capabilities not previously found in how methods are defined, used and adapted.

3) "A Major Milestone: On the way to a new standard".  An RFP of a standard based on the key ideas of Semat has been issued by OMG.  Letters of Intent are due on November 22, 2011; submissions are due on February 22, 2012.

This is very good progress, but honestly I don't feel the acceptance of the RFP is a sufficient step to declare success.  In the blog "A Major Milestone: On the way to a new standard", we finish by saying:

"Getting the RFP approved by OMG was one of the major milestones of Semat. Quoting Churchill: “Now this is not the end. It is not even the beginning of the end. But it is, perhaps, the end of the beginning.” Now we need to create something that will go beyond anything previously done by any standards body working with methods: getting the standard adopted by the broad developer community.

This is a challenge that cannot be overestimated.  This requires new fresh ideas that are not typical for standard bodies and methods work.  Fortunately, the Semat teams have several such new ideas. ‘Separation of concerns’ and ‘agile in everything’ will guide us, but more is needed.”

We have fresh new ideas for how to describe methods and practices in a very light way, ideas that will significantly improve readability.  The kernel will allow us not just to learn practices easily but, most importantly, to use them during real work.  Earlier approaches have been completely silent on use, whereas modern approaches such as Kanban and Lean rely on similar ideas.

The number of people working on Semat has more than doubled over the last couple of months.  New chapters of Semat have been set up in China and Latin America.  Still, we would welcome more talented people to work with us.

--Ivar


Use Cases – What is New?

March 14th, 2011

As we refine and improve use cases, we are careful to make sure that we don't impact any of the things that are key to their popularity and success. Use cases as we deal with them today have gone through a major face-lift.  The key ideas have not really changed, yet the impact of the changes is dramatic.  The result is a fundamentally more efficient way of developing software than with the original use cases.

What is new about use cases?
The improvements come essentially from two areas: user stories and aspect-orientation.  The result is that use cases have been adapted for backlog-driven development and for managing cross-cutting concerns.

User stories:
In the past we had two concepts – use cases and scenarios.  Scenarios were a kind of user story.  In 2003 we introduced the concept of a use-case module (published in a paper [1] and in the aspect book [2]).  A use-case module was a slice through the system: it included a use case (or a part of a use case), its analysis counterpart, its design, its code and its test.  Influenced by Scrum and user stories, we have sharpened these concepts and improved the terminology.  Now we talk about use cases, stories and use-case slices. Thus we now have:
1)    Use cases are, as they have always been, sets of structured stories (user stories if you want) in the form of flows of events, special requirements and test scenarios.
2)    Each story is managed and implemented as a use-case slice, which takes the story from its identification through its realization, implementation and test allowing the story to be executed.
3)    Thus a use-case slice is all the work that goes together with a particular story. Each story and thus its slice is designed to be a proper backlog element, and realized within an iteration or a sprint.
4)    The use-case strategy (starting from a use-case model) makes it significantly easier than the traditional user story strategy to identify the right stories to work with and to understand how the selected ones make up the whole system.  As the use cases are now developed slice by slice, the size of a use case is no longer a problem!
Thus, use cases are what they have always been.  Stories are abstract scenarios à la user stories.  Use-case slices are use-case modules made smaller, suitable as backlog entries.  The terms scenario and use-case module will thus be replaced by story and use-case slice, removing the ambiguity between the abstract, story-like scenarios and the concrete test scenarios.
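
To make the terminology concrete, here is a minimal sketch, in Python, of how a backlog tool might model use cases, stories and use-case slices.  The class names, the slice states and the "Withdraw Cash" example are illustrative assumptions of mine, not definitions taken from Use-Case 2.0 itself.

```python
from dataclasses import dataclass, field
from enum import Enum


class SliceState(Enum):
    """Lifecycle of a use-case slice, from identification to a tested, executable story."""
    IDENTIFIED = "identified"
    REALIZED = "realized"        # analysis and design done
    IMPLEMENTED = "implemented"  # code written
    VERIFIED = "verified"        # tests pass; the story can be executed


@dataclass
class Story:
    """An abstract scenario through a use case (a user story, if you want)."""
    name: str
    flow_of_events: list[str]                           # basic flow plus selected alternatives
    test_scenarios: list[str] = field(default_factory=list)


@dataclass
class UseCaseSlice:
    """All the work that goes with one story: its realization, implementation and test.
    Sized to fit within one iteration or sprint, so it works as a backlog item."""
    story: Story
    state: SliceState = SliceState.IDENTIFIED


@dataclass
class UseCase:
    """A set of structured stories that together deliver value to an actor."""
    name: str
    actor: str
    slices: list[UseCaseSlice] = field(default_factory=list)

    def next_slice(self, story: Story) -> UseCaseSlice:
        """Carve the next slice off this use case and return it as a backlog entry."""
        new_slice = UseCaseSlice(story)
        self.slices.append(new_slice)
        return new_slice


# Example: develop "Withdraw Cash" slice by slice; each slice goes on the backlog.
withdraw = UseCase("Withdraw Cash", actor="Bank Customer")
backlog = [
    withdraw.next_slice(Story("Withdraw with sufficient funds",
                              ["Insert card", "Enter PIN", "Select amount", "Dispense cash"])),
    withdraw.next_slice(Story("Withdraw with insufficient funds",
                              ["Insert card", "Enter PIN", "Select amount", "Reject and explain why"])),
]
```

The point of the sketch is simply that the use case stays whole, while its slices – not the use case itself – are the units that flow through the backlog.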

Note: this can be compared with the user story approach where:
1)    The stories are captured as a set of unstructured user stories.
2)    Each user story is managed and implemented as one-or-more user story slices, which take the story from its identification through its realization, implementation and test allowing the user story to be executed.
3)    If a user story is too much to implement in one go, the story is sliced up into a number of smaller user stories and the original user story is disposed of. This illustrates that it is the user story slices that are implemented and not the user stories themselves.
4)    Additional story types, such as Epic and Theme, are added to act as placeholders for user stories that we know will have to be sliced before they can be implemented.

Aspects:
Aspect-orientation has inspired us to deal not just with application-specific use cases (functional requirements), but also with infrastructure use cases (non-functional use cases).  The latter are dealt with as cross-cutting concerns, allowing us to add behavior to an existing system without actually changing the code of that system.  Examples of such non-functional behavior are persistence, logging of transactions and security.  This has helped us to deal with requirements (and their realizations) for systems of systems (enterprise systems, product lines, service-oriented architectures), and for partial systems such as frameworks and patterns.  See our book on aspects [2].
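
As a loose analogy only – not the notation used in the book [2] – here is a minimal Python sketch of the underlying idea: a cross-cutting concern (transaction logging) is woven around an existing operation whose own code stays untouched.  The `logged` decorator and the `withdraw_cash` function are hypothetical names invented for this illustration.

```python
import functools
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("transactions")


def logged(func):
    """The cross-cutting 'transaction logging' concern, woven around an
    operation without modifying the operation's own code."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        log.info("start %s args=%s kwargs=%s", func.__name__, args, kwargs)
        result = func(*args, **kwargs)
        log.info("end %s result=%s", func.__name__, result)
        return result
    return wrapper


# The application-specific behavior, written with no knowledge of logging.
def withdraw_cash(account: str, amount: int) -> int:
    # ... domain logic for the 'Withdraw Cash' use case would go here ...
    return amount


# The infrastructure concern is applied on top; the original code is unchanged.
withdraw_cash = logged(withdraw_cash)

if __name__ == "__main__":
    withdraw_cash("4711", 100)
```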

Thus, the key ideas have not changed, but they have been augmented with features that support backlog-driven development and working with non-functional requirements.

Use cases with stories and use-case slices address many of the issues now raised with the pure user story strategy.  Use cases with cross-cutting concerns address many of the problems analysts have raised with non-functional requirements.  To people who had already adopted use cases, the changes do not seem large, but their impact on the way we develop software is dramatic.

-- Ivar

[1] Ivar Jacobson, "Use Cases and Aspects – Working Seamlessly Together", Journal of Object Technology, July–August 2003.

[2] Ivar Jacobson and Pan-Wei Ng, Aspect-Oriented Software Development with Use Cases, Addison-Wesley, 2005.


Use-cases – why successful and popular?

March 7th, 2011

I am pleased, honored and gratified that use cases are still a popular way of working with requirements.  Googling “use case” yields 6 times more hits than Googling “user story”, but software development should not be driven by popularity.  Instead we should use the most practical way of working.  And, of course we have learnt something from other techniques.  For instance, as I will discuss in my next blog, user stories and aspect-orientation have inspired us to make use cases even better while maintaining their core values.

The popularity of use cases has led to some misunderstandings and some distortions of the original technique.  This is natural, and while it is encouraging to see authors take the original concept and adapt it to solve new problems, some of the misconceptions and distortions have clouded the original vision.

Common misunderstandings

Before further discussing the improved use cases, let's first discuss some common misunderstandings about use cases as we have had them since their inception (1986-1992).  Many people believe that:

1) Use cases are for requirements only, which is not true. In fact, from the very beginning, they have also been used to analyze behavior, design software, and drive test identification, just to name a few uses.

2) Use cases are heavyweight; that you need to write fat specifications of each use case.  This is also not true.  In fact, use cases can range from exceptionally lightweight (a brief description only), to lightweight (just an outline of the flows), to comprehensive (full descriptions of all behavior), with every variation in between.  For most systems an outline can be very valuable and yet still be very lightweight.  Today, we express this in a better way: when describing use cases, focus on the essentials – on what can serve as a placeholder for conversations.

3) Use cases are a technique for decomposing the behavior of a system, which is also not true.  Some authors have introduced levels of decomposition, and others try to show use cases “calling” other use cases as if they were subroutines.  Neither of these is right.  A use case shows how a system delivers something of value to a stakeholder. Use cases that need to be “composed” in order to provide value are not real use cases.

4) Use cases are complicated.  In fact, using use cases, if done right, makes a system easier to understand.

o It is impossible to understand what a system does from looking at many hundreds of user stories; the equivalent use-case model might express the system’s behavior in a few handfuls of use cases. 

o A user is represented by a stick figure, a use case by an oval, and their interconnection by a simple line.

o The relationship between a use case and its scenarios is likewise very easy to represent.

o To solve this problem with user stories, people have started to invent concepts such as themes and epics, making a case that the user story by itself is an incomplete concept. 

o The use-case approach can accommodate a wide range of levels of detail without introducing new and potentially confusing concepts.

5) Use cases are only good for greenfield development, which of course is not true.  They are great for explaining large legacy systems, where there is often little or no documentation left.  Use-case modeling is a cheap and easy technique to get started with for capturing the usages of a system.

What people like about use cases

The reason use cases have become so widely accepted is that, since their introduction, they have proven useful in so many ways in software development.

1) The use-case model, already mentioned, is a picture that allows you to describe even a complex system in an easy-to-understand way and that tells in simple terms what the system is going to do for its users.

2) Use cases give value to a particular user, not to an unidentifiable user community.

3) Use cases are test cases, so when you have specified your use cases you have also, after complementing them with test data, specified your test scenarios.

4) Use cases are the starting point to design effective user experiences, for instance for a web site.

5) Use cases ‘drive’ the development through design and code.  Each use case is a number of scenarios; each scenario is implemented and tested separately.

Moving forward
As we refine and improve use cases, we are careful to make sure that we don't impact any of these things that are key to their popularity and success.  In my next blog I will describe how we have adapted use cases to backlog-driven development and to managing cross-cutting concerns.

-- Ivar


Semat – moving forward

February 10th, 2011

During the last several months I have been very quiet, but not inactive.  I have been working hard with a dozen other people on moving Semat forward.  You will soon hear a lot more from us, but I would like to give you a quick update on the progress right away.

As you may recall, the Grand Vision of Semat was to re-found software engineering based on a widely agreed upon kernel representing the essence of software engineering.  The kernel would include elements covering societal and technical needs that support the concerns of industry, academia and practitioners.

The troika (Bertrand, Richard and I) were pleased, honored and gratified to find that, in a short period of time, a dozen corporate and academic organizations and some three dozen well-known individuals from the fields of software engineering and computer science became signatories in support of the vision.  In addition, more than 1,400 other supporters endorsed the call.

In November 2010, the troika agreed that we would move the work on the kernel to OMG (Object Management Group) to get the proper governance we needed.  Since then we have been working in three different but overlapping groups on three tasks:

Moving the development of the kernel to OMG.

In order to move the work to OMG, OMG first needed to issue a request for proposal (RFP).  A couple of people from Semat have worked together with a couple of OMG members to specify an RFP for what is now called 'A domain-specific language and a kernel of essentials for software engineering.'  In early December 2010, an early version of this RFP was presented to the Analysis and Design Task Force of OMG in Santa Clara, where it was very well received.  Several other OMG members have now joined us to work on the RFP, which will be published within a few weeks.  On March 21-24 the RFP will be discussed at an OMG meeting in Arlington/Washington DC.  We hope and expect it to be approved, so that the work on proposals can start.  Anyone can submit a proposal, and so will we.

Our proposal for a kernel

Semat itself will of course not submit a proposal in response to the RFP, but key players are now working together to continue the work we started within Semat.  There is one team lead, Paul MacMahon, who, along with 12-15 participants, will now continue the work in a couple of tracks.  The idea of doing architectural spikes continues.  The plan is still to deliver something that can be used by industry by April 1.  Personally, I think the work has slowed down because of the work with OMG and the continued work on Semat, which I will describe next.  However, we will deliver something of interest and of value in a couple of months.

The kernel is just a first step in the Grand Vision of Semat; much more work needs to be done.

Even if the development of the kernel now has been moved under the OMG’s umbrella, Semat still has a lot of work to do. We need for example to:

  • be a demanding “customer” of OMG, making sure that as a community, we get what we want,
  • support the community in its effort to get reusable practices,
  • move the work to the academic community to inspire the development of new curricula and useful research.

Thus, a vision for the next couple of years is needed.  A team of eight people has been working for more than a month to develop a proposal for a Three Year Vision of Semat.  This proposed vision should be published within a couple of weeks.  We will focus on the impact we expect to have on three key user groups: practitioners, industry and academia.  The impact should be measurable and not just hand-waving.  How we will work to get the results specified in the vision will be discussed separately.  First we want to agree on where we want to go.

As I am sure you understand, working to ensure that the vision of Semat becomes reality is a challenging task to say the least.  However, it is one well worth the effort.  Please join us.


What Drives Me

June 16th, 2010

“The best way to predict the future is to invent it!“ (Alan Kay)

A few days ago, a very simple but thought-provoking question was put to me: what is it that drives me?  The simple truth is that I do not know.  But I do know what does not drive me.  It is not about money.  Actually, it has never been about money.  Neither is it about power.  I am happy to step aside and I am happy to delegate both up and down.  It is not about popularity – but I do like to be appreciated for what I do.

No, it has to do with helping others improve themselves over and over again. I get a kick out of seeing others become successful because I helped them. It was like that in the late 1960s and the ‘70s when the Ericsson AXE system beat all competition and won every contract thanks to being component-based. Similarly, when Rational was successful because of UML and Objectory. And Telelogic because of SDL. I am happy when people are successful thanks to use cases.



Two complementary macro-trends in software engineering

April 30th, 2010

From my vantage point, I see two distinct and yet complementary macro-trends driving the way we become better at developing software.  One could be called "Methods & Tools", the other "Professionalism & Craftsmanship".  These two trends are not new; they have been around for as long as we have built software.  Both are based on the fact that it is people who develop programs, rather than methods and tools, but they take different approaches to the problem by focusing on different aspects of software development.

The "Methods & Tools" trend, exemplified by the Semat initiative (www.semat.org), of which I am one of the founders, drives the thesis that the way we develop software is immature and in need of being revolutionized.  Yes, these are strong words, but the initiative is supported by more than 30 renowned scientists, scholars and practitioners in the software engineering field, including leaders from major industrial corporations (e.g., ABB, Ericsson, Fujitsu, IBM, Samsung and Microsoft), academia (researchers, professors, institutional managers), and thousands of practitioners around the world.

Specific problems addressed are that methods often seem to be based on fashion and fads, and that there is an almost infinite number of methods that cannot be assessed or compared with each other.  The number itself is not a problem; there should be many methods.  However, these methods must be designed in such a way that they can be compared, assessed and improved.  Finally, there exists a big barrier between academic research and industrial practice, which must be torn down.

The way forward is based on observation and understanding that:

  1. Every method is just a composition of practices, either human or technology related.
  2. Practices are reusable over many application areas and technologies.
  3. Underneath all practices is a small kernel of universals described by a small kernel language.
  4. Universals are things that we always have, do or produce when building software.  We will discover them!
  5. Practices and universals will be described by a small Process Kernel Language.

Using this kernel and the language, we can describe all known methods, and, because practices are comparable, methods that are composed from these practices can be compared. I would like to tell you a lot more now, but it can wait for later. The impact of this SEMAT initiative is that, if successful, we will streamline the entire software world: from academia to industry, from practitioners to teachers and researchers. We will become better, faster, and happier in developing software.
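
As a rough illustration of observations 1-5, here is a minimal Python sketch of how methods, practices and kernel universals might relate.  The specific universals and practices named in the code are my own illustrative guesses, not the kernel that Semat is actually defining.

```python
from dataclasses import dataclass, field


@dataclass(frozen=True)
class Universal:
    """A kernel element: something we always have, do or produce when building software."""
    name: str


@dataclass
class Practice:
    """A reusable unit of method content, expressed against the kernel universals."""
    name: str
    addresses: frozenset[Universal]
    activities: list[str] = field(default_factory=list)


@dataclass
class Method:
    """A method is just a composition of practices (observation 1)."""
    name: str
    practices: list[Practice]

    def coverage(self) -> set[Universal]:
        """The universals this method touches.  Because practices share the same
        kernel, two methods' coverage and practices can be compared directly."""
        return set().union(*(p.addresses for p in self.practices))


# Illustrative universals and practices only; the real kernel is yet to be agreed.
REQUIREMENTS = Universal("Requirements")
SYSTEM = Universal("Software System")
WORK = Universal("Work")

use_cases = Practice("Use Cases", frozenset({REQUIREMENTS, SYSTEM}),
                     ["Identify actors", "Slice use cases onto the backlog"])
scrum = Practice("Scrum", frozenset({WORK}),
                 ["Plan sprint", "Hold daily stand-up", "Review and retrospect"])

my_method = Method("Use cases + Scrum", [use_cases, scrum])
print(sorted(universal.name for universal in my_method.coverage()))
```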

After all, it is people who develop software, not methods and tools. So we must address the “human” side of software development.

The "Professionalism & Craftsmanship" trend, popular with the original founders of the agile movement, for instance Bob Martin ("Uncle Bob"), takes a very different standpoint. From this perspective, the big problem is not the lack of methods or tools, but how we train, educate and mentor programmers to become professional craftsmen. Code can be written by anyone at any time, but what makes us professionals?

  1. We must be proud of what we do. We must be able to say “no” to either the boss or the customer, if necessary. We have our professional practices and these cannot be compromised.
  2. The boss and the client must accept the fact that our work is technical in nature; so let them think we are geeks, but respectable geeks.
  3. Eliminate hourly rate - doctors or lawyers are not paid by the hour (even if under pressure they may say so). There must be better ways to charge.
  4. Anything that is worth doing should be done well and with quality. When we ship code, we must know that it works. Acceptance testers should not find any errors.
  5. Become competent through an apprenticeship program. Choose a master and learn from him or her.  After some years you may select a new master and also learn from him or her.

Both trends are of course important.

Proponents of methods & tools agree that it is clear we must constantly improve professionalism. However, it would be much easier to be professional if we could elevate the level of our understanding of methods & tools.

Proponents of professionalism & craftsmanship are concerned that such an elevation means enforcing restrictions, and many are therefore hesitant or reluctant to work with or support initiatives related to methods & tools.

It is clear to me that we must do both. Since Bob Martin is a signatory of Semat, I think it is clear to him as well.

It is smart!