ICANN Email List Archives

[gnso-wpm-dt]



RE: [gnso-wpm-dt] Introduction to draft Work Prioritization model

  • To: "Liz Gasster" <liz.gasster@xxxxxxxxx>, <gnso-wpm-dt@xxxxxxxxx>
  • Subject: RE: [gnso-wpm-dt] Introduction to draft Work Prioritization model
  • From: "Rosemary Sinclair" <Rosemary.Sinclair@xxxxxxxxxxx>
  • Date: Sun, 22 Nov 2009 10:49:07 +1100

Hi Everyone

 

I'm safe in the service in Shenzhen! So here are a few comments:

 

This process looks familiar - like Strategic Planning priority setting,
like Risk Assessment matrices and Operational Planning processes - so for
me this type of process has a track record: of the available things to
be done, which ones should we get on with... then which ones can we get
on with given resource levels...

 

Having said that - it is very possible, using this quantitative process,
to get the WRONG answer with the RIGHT process.

 

So a process which starts with a link to ICANN's AoC and Strategic Plan
might provide some early "relevance" checks - and if the GNSO keeps
getting requests for work that don't fit with these overarching
documents, we can feed that into the next cycle of updating those Big
Picture pieces.

 

Most Strategic Plans, Missions, etc. get done and then sit on the shelf
until the next review process - Shelfware!

 

I prefer to make them Working Docs (no, that's not Working Dogs!) so
we can keep our efforts coherent and focused - AND if those docs get in
the way, we can recommend change in the next round of review from a
basis of "evidence" of effort in trying to use them.

 

THAT said - any quantitative tool that we use to try to reach consensus
should take us back to a discussion - if you like, the narrative, not
just the numbers, should guide us on Priorities at the final step. Even
using Delphi approaches, I have been in groups where the WRONG answer
led to good discussion and then the right outcome.

 

The next element of linking for me is to the Policy Development Process
(I'm not sure I quite understand this yet - but I know we had
presentations in Seoul about a newly forming Policy Development Process,
and again I search for the connection between processes) - where does
Priority Setting fit with that process at the GNSO stage, OR is it a
task that fits within the PDP...

 

The third area of linkage for me is with our other ICANN groups -
how/where do we "take the temperature" with the other Supporting
Organisations and Advisory Committees, seeking their views on Priorities
(perhaps annually; perhaps these are embedded in the Strategic Plan)?
This would build links that relate to a Work Program and could inform
co-ordination discussions and the like in a way that might be useful -
e.g. our GNSO-GAC liaison meeting might be an opportunity to review
Priorities focused around our Policy Priority Matrix, allowing time
for free-wheeling but looking for outcomes that build into our work...

 

One other thought - in the Government policy work I am familiar with,
there are some processes around Good Practice - I will find the link and
forward it - this might be a third element that we need to consider: How
Valuable? How Difficult? And then thirdly... How Effective?

 

Cheers

 

Rosemary

 

________________________________

From: owner-gnso-wpm-dt@xxxxxxxxx [mailto:owner-gnso-wpm-dt@xxxxxxxxx]
On Behalf Of Liz Gasster
Sent: Saturday, 21 November 2009 8:01 AM
To: gnso-wpm-dt@xxxxxxxxx
Subject: [gnso-wpm-dt] Introduction to draft Work Prioritization model

 

Work Prioritization Team:  

 

As a way to help bring everyone to the same level on the GNSO Work
Prioritization project, I have attempted to consolidate various emails
and organize our latest thinking into a single document.    Again, this
is a suggested draft starting place offered by staff and the group is
encouraged to modify it as you feel appropriate.  There are three
sections as follows:  

 

1)     Recommended construct and methodology (see also attached
spreadsheet)

2)     Draft definitions for two dimensions  

3)     Procedural questions to be considered 

 

1)     Recommended construct and methodology

 

For this effort, Staff is envisioning a two-dimensional matrix or chart
(X, Y) to help the GNSO Council graphically depict its work
prioritization.  This concept is based on having each discrete project
rated on two dimensions:  Value/Benefit (Y axis) and Difficulty/Cost (X
axis).  Section 2 below outlines the preliminary draft definitions for
each dimension (or axis), so in this section we will concentrate on what
the chart means, how it would be produced, and the rating/ranking
methodology, including sample instructions.

 

Illustration:   The chart below shows 8 illustrative projects (simply
labeled ABC, DEF, GHI, etc.) plotted on two dimensions:  Value/Benefit
(Y axis) and Difficulty/Cost (X axis).  In this sample depiction, Q1,
Q2, Q3, and Q4 represent four quadrants drawn at the midpoint of each
axis (with each axis arbitrarily scaled to a maximum of 10).  Thinking
about Value/Benefit versus Difficulty/Cost, Q1 includes those projects
that have the highest value and lowest cost, whereas Q4 would contain
projects with the lowest value and highest cost.  Project ABC, in this
example, is rated 3.25 on Difficulty and 7.75 on Value; therefore, it is
located squarely in Q1.  Conversely, project GHI is rated 7.75 on
Difficulty but only 1.00 on Value and is thereby placed in Q4.

[Chart: see the attached GIF image - the 8 sample projects plotted on
the Difficulty/Cost (X) vs. Value/Benefit (Y) grid, divided into
quadrants Q1-Q4]
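
For anyone who wants to see the placement logic concretely, here is a
minimal Python sketch. It assumes each axis is scored from 1 to 10
(matching the L=1 / M=5.5 / H=10 conversion described later in this
note) with the quadrant boundary at the 5.5 midpoint, and it assumes Q2
and Q3 cover the two mixed cases, since only Q1 and Q4 are defined
explicitly above.

```python
# Sketch of the quadrant placement described above. Assumptions: each
# axis runs 1-10 (per the L=1 / M=5.5 / H=10 conversion used later),
# the quadrant boundary sits at the 5.5 midpoint, and Q2/Q3 cover the
# two mixed cases (the text only spells out Q1 and Q4).

MIDPOINT = 5.5  # assumed boundary between "low" and "high" on each axis

def quadrant(difficulty: float, value: float) -> str:
    """Q1 = high value / low difficulty; Q4 = low value / high difficulty."""
    if value >= MIDPOINT:
        return "Q1" if difficulty < MIDPOINT else "Q2"
    return "Q3" if difficulty < MIDPOINT else "Q4"

# The two projects worked through in the text:
print(quadrant(3.25, 7.75))  # ABC -> Q1
print(quadrant(7.75, 1.00))  # GHI -> Q4
```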

How do the projects end up with these individual X, Y coordinates that
determine their placement on the chart?  

 

There are several options for rating/ranking individual projects.   We
will look specifically at two alternatives below:  

 

Rating Alternative A:  

 

One option is to ask each Council member, individually and separately,
to rate/rank each project on both dimensions.   Even with this
alternative (and B following), there are different methods possible: for
example, (1) assign a ranking from 1 to n to each project under each
column, or (2) use something a bit simpler, e.g. High, Medium, and Low,
to rate each project relative to the others.   Since it is arguably
easier to rate each project as H, M, or L than to rank them discretely
from 1 to n, we will illustrate the former approach here.   Keep in mind
that an ordinal ranking methodology would simply substitute a number
(from 1 to 8 in our example) for the letters H, M, or L.

 

Directions:    Rate each project on a scale of HIGH, MEDIUM, or LOW for
each dimension (Value/Benefit, Difficulty/Cost), but keep in mind that
the rating should be relative to the other projects in the set.   There
are no fixed anchors for either dimension, so raters are asked to group
projects as LOW, MEDIUM, or HIGH compared to each other.   A HIGH
ranking on Value simply means that this project is perceived to provide
significantly greater benefit than projects ranked as MEDIUM.   

 

If there are 20+ raters, we could provide a simple blank matrix and ask
them to provide their individual scorings.   For example, assume that
the matrix below is one individual's ratings for all 8 illustrative
projects:  

 

PROJECT    VALUE/BENEFIT    DIFFICULTY/COST

ABC             L                 H
DEF             L                 M
GHI             H                 L
JKL             M                 M
MNO             L                 L
PQR             H                 H
STU             M                 M
VWX             M                 L
 

Once we have all results submitted (in simple Word, Excel, or even
plain-text email) from all individual raters, Staff would convert each
LOW to a score of 1, each MEDIUM to 5.5, and each HIGH to 10 (see
attached spreadsheet, Rankings tab).    We would then average the scores
across all raters and produce a chart as shown in the attached
spreadsheet (see Summary tab).   Note:  we used only 4 raters in the
spreadsheet for illustrative purposes, but it is trivial to extend to as
many raters as we decide to involve.
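
As a concrete sketch of that conversion-and-averaging step (the four
raters' sheets below are invented for illustration, just as in the
attached spreadsheet):

```python
# Sketch of the staff scoring step: map each rater's H/M/L sheet to
# numbers (L=1, M=5.5, H=10, per the text) and average per project to
# get the (X, Y) chart coordinates. The rater data here is made up.

SCORE = {"L": 1.0, "M": 5.5, "H": 10.0}

# Each sheet: project -> (Value/Benefit, Difficulty/Cost) as letters.
sheets = [
    {"ABC": ("L", "H"), "GHI": ("H", "L")},
    {"ABC": ("M", "H"), "GHI": ("H", "M")},
    {"ABC": ("L", "M"), "GHI": ("H", "L")},
    {"ABC": ("M", "H"), "GHI": ("M", "L")},
]

def average_coordinates(sheets):
    coords = {}
    for project in sheets[0]:
        value_scores = [SCORE[sheet[project][0]] for sheet in sheets]
        difficulty_scores = [SCORE[sheet[project][1]] for sheet in sheets]
        # X = Difficulty/Cost, Y = Value/Benefit, matching the chart axes.
        coords[project] = (sum(difficulty_scores) / len(sheets),
                           sum(value_scores) / len(sheets))
    return coords

print(average_coordinates(sheets))
# {'ABC': (8.875, 3.25), 'GHI': (2.125, 8.875)}
```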

 

Rating Alternative B:  

 

Instead of asking each Council member to rate/rank each project
individually, the Council could use a grouping technique (sometimes
referred to as the "Delphi" method).  For example, suppose we set up 4
teams based upon the existing Stakeholder Group structures as follows:

 

            Team1:           CSG  = 6

            Team2:           NCSG = 6

            Team3:           RySG (3) + RrSG (3)  = 6

            Team4:           Others (NCA, Liaison) = 4-5

 

Using this approach, we would have 4 small teams, and we would ask each
one for a single CONSENSUS score sheet (whether ordinally ranked or
rated H, M, or L).   Then we would average those results to produce the
overall chart (similar to the example in the spreadsheet).   We should
make it clear that we are discouraging teams from individually rating
and then averaging their own results.   The benefit of this modified
Delphi approach is that individuals (especially new Council members)
can learn from each other and develop, collectively, what they think the
most appropriate answer should be.
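
Computationally this is the same averaging as Alternative A, just over
four consensus sheets instead of many individual ones; a standalone
sketch follows (team sheet contents invented for illustration):

```python
# Sketch of Alternative B: one consensus H/M/L sheet per team, averaged
# with the same L=1 / M=5.5 / H=10 conversion as Alternative A. The
# sheet contents are invented for illustration.

SCORE = {"L": 1.0, "M": 5.5, "H": 10.0}

# One consensus sheet per team: project -> (Value/Benefit, Difficulty/Cost).
team_sheets = [
    {"ABC": ("L", "H"), "GHI": ("H", "L")},  # Team1: CSG
    {"ABC": ("M", "H"), "GHI": ("H", "L")},  # Team2: NCSG
    {"ABC": ("L", "M"), "GHI": ("H", "M")},  # Team3: RySG + RrSG
    {"ABC": ("M", "H"), "GHI": ("M", "L")},  # Team4: NCA / Liaison
]

def mean(scores):
    scores = list(scores)
    return sum(scores) / len(scores)

coords = {
    p: (mean(SCORE[sheet[p][1]] for sheet in team_sheets),  # X: Difficulty/Cost
        mean(SCORE[sheet[p][0]] for sheet in team_sheets))  # Y: Value/Benefit
    for p in team_sheets[0]
}
print(coords)
```

The design point is that the learning happens inside each team before a
single sheet is submitted; the averaging itself stays trivial.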

 

The above methodologies are subject to further discussion.  Ultimately,
the Council will need to decide: 

 

1)     What work prioritization construct should be utilized (we have
suggested a simple two-dimensional Difficulty/Cost vs. Value/Benefit
chart displayed in a four-quadrant model)?

2)     How should it be executed, e.g. participation, consensus ranking
(Delphi), individual ratings averaged, etc.?  

 

2)     Draft definitions for the X, Y dimensions

  

Staff proposes the following starting definitions for the axes in this
conceptual model.  

 

X - Difficulty/Cost ... this dimension relates to perceptions of
complexity (e.g. technical), intricacy (e.g. many moving parts to
coordinate), lack of cohesion (e.g. many competing interests), and,
therefore, overall cost/time to develop a recommendation.  We could have
- but chose not to - created a third axis to indicate the amount of time
required.  That would add complexity, so we decided that initially we
would fold the concept of time into the definition of level of difficulty.

 

Y - Value/Benefit ... this dimension relates to perceptions of benefit
to ICANN and its stakeholders in terms of internet growth/expansion,
enhancing competitiveness, increasing security/stability, and improving
the user experience.  

 

Please feel free to word-smith the above descriptions...

 

3)     Procedural questions to be considered

 

Once the matrix is developed and all projects are plotted, what should
the Council do with the results?   This is an important question to
answer BEFORE the rating methodology is finalized and executed.

 

The Council should discuss and decide questions such as:

 

1)     How often should it be exercised and/or what event triggers an
analysis?  

2)     What decisions or outcomes are expected from the process?  

 

 

Please let me know if we can provide any additional assistance prior to
and during the upcoming conference call on Monday.  

 

Thanks and regards,

 

Liz

 

 

 

 

[Attachment: GIF image]



