RE: [gnso-wpm-dt] WPM: Summary of 16 Feb 2010 Session (Step 6-In Progress)
- To: <KnobenW@xxxxxxxxxx>, <ken.bour@xxxxxxxxxxx>, <gnso-wpm-dt@xxxxxxxxx>
- Subject: RE: [gnso-wpm-dt] WPM: Summary of 16 Feb 2010 Session (Step 6-In Progress)
- From: "Jaime B Wagner" <j@xxxxxxxxxxxxxx>
- Date: Thu, 18 Feb 2010 16:38:24 -0200
Comments below
Jaime Wagner
j@xxxxxxxxxxxxxx
cel: +55(51)8126-0916
From: owner-gnso-wpm-dt@xxxxxxxxx [mailto:owner-gnso-wpm-dt@xxxxxxxxx] On
Behalf Of KnobenW@xxxxxxxxxx
Sent: Wednesday, 17 February 2010 05:51
To: ken.bour@xxxxxxxxxxx; gnso-wpm-dt@xxxxxxxxx
Subject: RE: [gnso-wpm-dt] WPM: Summary of 16 Feb 2010 Session (Step 6-In
Progress)
Thanks Ken,
as I pointed out, for me the handling of the process is critical with regard
to frequency and time consumption at Council meetings:
- Frequency: I agree with the 3-times-a-year suggestion, but it shouldn't be
mandatory to conduct a formal rating session at every meeting. It should be
mandatory only once a year - my suggestion: at the first meeting after each
annual meeting, since new councillors, who are normally seated at the
annual meeting, will by then have had a chance to learn about the projects.
At the other ICANN meetings it should be put on the agenda as an optional
item, subject to Council decision.
[JBW] Agree
- Time consumption: A potential 2-hour slot at each ICANN meeting just for
the prioritization debate on projects (not discussing content!) seems to be
too much. My suggestion, in order to facilitate the process, would be to
put a prioritization proposal before the Council meeting as a basis for the
Council debate and decision, rather than letting the Council start from
scratch. This proposal could be elaborated by any working team with a
Council mandate (e.g. Option 2).
[JBW] I think the 2-hour time slot that I mentioned in our call will be
necessary only for the first exercise (and keeping to this schedule will
demand careful preparation). Subsequent exercises will always take into
account the priority scale fixed at the previous meeting as a
"prioritization proposal". With suitable preparation it will often amount
only to a confirmation of the previous list, and I bet this will take only
a 15-minute slot in the Council meeting, except when a huge new project is
at stake.
Kind regards
Wolf-Ulrich
_____
From: owner-gnso-wpm-dt@xxxxxxxxx [mailto:owner-gnso-wpm-dt@xxxxxxxxx] On
Behalf Of Ken Bour
Sent: Tuesday, 16 February 2010 23:50
To: gnso-wpm-dt@xxxxxxxxx
Subject: [gnso-wpm-dt] WPM: Summary of 16 Feb 2010 Session (Step 6-In
Progress)
WPM Team Members:
I apologize once again for a longish email summary, but I attempted to
capture the decisions made during our teleconference today, 16 Feb 2010,
and also to flesh out some of the discussions that remained unresolved as
our time expired. Ultimately, we are going to need a record of our
deliberations, and I find that this writing process helps me crystallize my
own thinking. In preparing each of these reports, short and long, I
frequently research earlier summaries to recall what was said/done and, in
many instances, I end up consolidating material in an effort to sustain
forward progress. Although this approach produces more written material to
digest, hopefully you will agree that there is "method in the madness."
Following my normal procedure, I will create a 2nd email to include the
Agenda Topics and Action Items for our next session scheduled for 23 Feb
2010 (1700 UTC). Since there may be an opportunity to make progress via the
email list on some/all of the questions contained below, I will wait for
several days before compiling the separate email of agenda topics.
1) Project Criteria Definitions: Definitions for Value and Difficulty
approved as drafted in Ken's 14 Feb email.
2) Step 6 Analysis/Questions:
    a) Rank order ties will be permitted. If the Council needs to decide
(for any reason) among two or more tied projects, then it will rate each
project's "Difficulty", as defined by the WPM team, employing a process
similar to the one used for rating Value.
b) Frequency: a formal prioritization rating session (all relevant
projects) will be conducted 3 times per year at ICANN meetings. The team
noted that a face-to-face working session may facilitate using the Delphi
group ratings approach. Specific methodology issues/questions are taken up
in (f) below. [KB Note: Chuck and I had an email exchange on the list
regarding the "political morale" question, which included the possibility of
creating a new exemption category (or two). I had that email ready for
discussion today, but we did not have sufficient time. May I suggest that
each of you register your thoughts in the intervening week, if you have an
opportunity? If we do not resolve that question via the list, I will add it
to the Action Items for next time.]
c) New Projects: when a new project arises, the Chair will assess
whether it can be addressed effectively by the Council (see d below) without
requiring that an official Value rating be determined. If a specific
rating/ranking is deemed necessary to proceed with the Council's management
responsibilities, then the Chair will call for the new project to be
formally rated before further action is taken; otherwise, that project will
be officially prioritized at the next scheduled session (e.g. next ICANN
meeting).
d) New Project Questions: the following questions were slightly
modified per today's discussion and are repeated to give the team another
look at them.
o Should this new project be undertaken, that is, have resources assigned
to accomplish a particular objective?
o What resource types, skills, and quantities are needed to adequately
staff this project?
o Are there sufficient resources (Staff and Community) available without
causing adverse impacts to other project work in progress? If not, should
any other project work be stopped or postponed?
o Should this new project have a date/deadline imposed, thus establishing
urgency? If it is determined to be urgent, can any real consequences be
identified that will cause the date to be perceived and treated as critical?
e) Project Status Changes: as noted in b above, updates will be
officially recognized and captured 3 times per year as reflected in the
Value ratings assigned. Status changes will also be considered when new
projects are evaluated (see c-d above). If, at any other time, conditions
warrant a formal reevaluation, the Chair may make such a recommendation to
the Council. [KB Question: does this mean that the Chair can call for
another prioritization exercise in between scheduled sessions? If not, I am
not sure how these "status changes" would be captured or reflected. The
team has not created any recording process, yet, for status changes other
than the actual Value ratings.]
f) Methodology: several alternatives were briefly discussed as
today's call came to a conclusion. I will attempt to summarize 4 Options
below including pros/cons as I perceive them. Please feel free to
add/change/delete any of the advantages or disadvantages. The options are
not presented in any order of importance or significance other than how they
occurred to me as I began sketching them.
Option 1:
Each Council member rates all projects individually in advance (e.g. via a
spreadsheet template); then a single Council Delphi session is held with
all members participating. The process would include looking for
mathematical consensus, e.g. Std Dev<1.0 or Range<=2, ahead of the group
session. With 20 raters, I think pre-session consensus may be statistically
unlikely, but it could happen. Staff facilitates an Adobe Connect session
using the polling feature and blind voting. The polling process ends when
all individual scores occupy no more than 3 adjacent ratings, i.e. Range<=2
(e.g. 4, 5, 6 or 2, 3, 4); the median becomes the final rating.
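[KB Note: to make the stopping rule concrete, here is a minimal sketch in
Python. The 1-7 scale and the vote values are my illustrative assumptions;
the thresholds are the ones drafted above, and whether "Std Dev" means the
population or sample statistic is still an open detail.]

    from statistics import median, pstdev

    def has_consensus(ratings, max_range=2, max_stdev=1.0):
        # Draft rule: Range<=2 (equivalently, all scores within 3
        # adjacent ratings) or Std Dev < 1.0 (population form assumed).
        return (max(ratings) - min(ratings) <= max_range
                or pstdev(ratings) < max_stdev)

    # Hypothetical blind poll of 20 Councilors on one project (1-7 scale)
    votes = [4, 5, 5, 4, 6, 5, 4, 5, 6, 4, 5, 5, 4, 6, 5, 5, 4, 5, 6, 5]
    if has_consensus(votes):
        print(median(votes))  # 5.0 becomes the project's final rating
    # otherwise: discuss, re-poll, and re-test the rule

In a live Adobe Connect session, Staff would run this check after each
blind poll and either close the item (median = final rating) or open
another round of discussion.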
Pros:
. After one single iteration, all projects are rated and, thereby,
ranked (ties permitted).
. Prioritization occurs through group deliberation with all
viewpoints expressed and heard by everyone.
. If successful, this process should result in a high level of
acceptance by the Council and the community.
Cons:
. Time: the WPM's experience with 5 participants was an average of
7 minutes discussion/voting per project (in both Y and X sessions). Fifteen
(15) projects would take a minimum of 105 minutes not counting setup,
explanations, process discussions, etc.
[KB Questions/Comments:
1) We haven't tested, so we must ask: will 7 minutes per project hold
up with 20 potential participants? Instead of 4-5 individuals expressing an
opinion, what if 10-15 Councilors elect to speak individually? Should there
be time limits imposed (e.g. green/yellow/red)? If the team believes that 7
minutes is a reasonable benchmark, then a 2 hour session would be barely
adequate - the first time through the process. Note that at an average of
10 minutes per project, the effort would require a full 3 hours (180
minutes), including 20 minutes to start and 10 to wrap up.
2) With 20 participants, how likely is it that a Range<=2 would occur
on the 1st polling vs. requiring multiple iterations? Should a limit of 3
polling iterations be instituted in the event that, after the 3rd attempt, a
Range<=2 has not been achieved? I assume that we would still take the
median score at that point.
3) Irrespective of the number of participants, I also wonder whether a
face-to-face session might actually prolong the discussions more than
remote participation via telephone would.
4) Restating Wolf-Ulrich's concept, would it accelerate the Council's
process if the WPM provided an initial rating as input?]
Option 2:
Extending another of Wolf-Ulrich's ideas in a slightly different direction,
Ken offered an approach with these features: the Council asks each SG to
appoint one Council member to be a "delegate" to a standing GNSO Project
Prioritization Committee. The non-voting NCA would be a 5th delegate,
representing all other non-SG interests (e.g. ALAC). With Staff assistance
(TBD), each delegate would be responsible for developing a composite project
rating by soliciting/aggregating input from his/her SG. Delegates would
then attend a single rating session (stay with 3x per year?) in which they
would use the group Delphi approach facilitated via Adobe Connect (same
procedures/rules as Option 1). One delegate would be elected "Committee
Chair" by the team and that position would conduct/manage rating sessions,
direct Staff's involvement, and report the results to the Council.
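[KB Note: a sketch of the "composite rating" step in Python. The
aggregation method is TBD with Staff assistance; taking the median of each
SG's inputs is just one defensible choice, and all numbers below are
invented for illustration.]

    from statistics import median

    # One project; each delegate aggregates his/her SG's individual
    # ratings into a single composite rating for the Committee session.
    sg_input = {
        "RySG": [3, 4, 4],
        "RrSG": [5, 5, 6],
        "CSG":  [4, 4, 5, 5, 6, 6],
        "NCSG": [2, 3, 3, 4, 4, 5],
        "NCA":  [4],
    }
    composites = {sg: median(votes) for sg, votes in sg_input.items()}
    print(composites)
    # {'RySG': 4, 'RrSG': 5, 'CSG': 5.0, 'NCSG': 3.5, 'NCA': 4}
    # The 5 composites then feed the delegates' Delphi session, using
    # the same consensus rule sketched under Option 1.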
Pros:
. Soliciting and consolidating Constituency/SG input is a common
Councilor function which would be applied to this task.
. SG prioritizations are handled in the "background" and the group
session involves only 5 individuals - similar to the WPM's testing
experience. The process would still take 2 hours, in all likelihood, but
would involve only 5 individuals vs. 20.
. Delegates would gain proficiency and efficiency both in collecting
SG input and working with colleagues on group ratings.
. After a single group session of the Committee, all projects are
rated and ranked.
Cons:
. Process might not enjoy the same acceptance level as Option 1,
especially if delegates are perceived to have yielded too much ground in a
quest to produce consensus.
. Delegates might tend to fight for their ratings more vigorously
than would be the case if individual Councilors rated independently and
without win/lose pressure.
Option 3:
The Council is sub-divided into small groups (RySG - 3; RrSG - 3; CSG - 6;
NCSG - 6; NCAs - 3), and each one uses the WPM procedures/rules outlined in
Option 1 (above) to produce a consensus rating for all 15 projects. The
median from the 5 individual group results becomes the Council rating for
each project.
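[KB Note: a quick illustration of the median-of-groups calculation in
Python; the group ratings are hypothetical.]

    from statistics import median

    # Each group's consensus rating for a single project; the median
    # of the 5 results becomes the Council rating.
    group_ratings = {"RySG": 5, "RrSG": 6, "CSG": 4, "NCSG": 3, "NCA": 4}
    print(median(group_ratings.values()))  # 4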
Pros:
. Smaller groups of individuals from like-minded communities may
achieve consensus more easily and efficiently since perceptions and goals
tend to be congruent.
Cons:
. There may be very different rating outcomes between the 5 groups,
which will be "averaged away" in calculating the medians.
. If the groups do produce very different rating results, the
acceptance level of the final prioritization may suffer.
[KB Note: this option could also be reworked with small groups being
selected more or less randomly so that each one has a mix of contracted and
non-contracted parties. If the team thinks I should scope out that option,
I would be pleased to do so.]
Option 4:
Each Council member rates all 15 projects independently (e.g. spreadsheet
template; no group discussions) and the results are averaged or the medians
computed.
Pros:
. Most efficient option and requires least amount of time.
Cons:
.         Statistical centering is very likely and, therefore, there may
also be a much higher incidence of ties (see the sketch after this list).
For a simple illustration, two ratings of 1 and 7 produce a 4 for both the
mean and the median, but that result doesn't capture what actually happened
in the prioritization process. The team witnessed this same phenomenon when
its individual ratings (only 6 raters) were averaged (mean and median). As
we observed and calculated, only 4 of the 15 projects achieved "natural"
consensus (Range<=2) in the Value ratings before the group discussions took
place. There was considerable clustering of the ratings; in fact, 11/15 of
the Y means fell between 3.7 and 5.7, and 10/15 of the Y medians fell
between 3.5 and 5.0.
. As a result of averaging, acceptance of the ranking may suffer if
the results show little overall variance with many ties.
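[KB Note: the centering effect described in the first Cons bullet is easy
to reproduce in Python (illustrative values only):]

    from statistics import mean, median

    split = [1, 7]                   # two maximally divergent ratings
    print(mean(split))               # 4   - looks like a mid-scale rating
    print(median(split))             # 4.0 - same story
    print(max(split) - min(split))   # 6   - far outside Range<=2

The mean and median land mid-scale even though no rater chose a middle
value, which is exactly why ties and flat rankings would multiply under
this option.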
3) Process Outcomes: Not discussed - move to 23 Feb agenda.
4) Project Management Toolset Recommendation: Not discussed - move to
23 Feb session.
5) Independent ICANN Project Ratings? Wolf-Ulrich wondered whether
ICANN/Staff develops a separate prioritization list attendant to, or
derived from, an operational or budget planning process. If so, how should
it be dovetailed with this effort? Ken will follow up with Rob.
Prepared by: Ken Bour