ICANN Email List Archives

[gnso-wpm-dt]



Re: [gnso-wpm-dt] WPM-DT: Step 5 (In Progress)

  • To: Ken Bour <ken.bour@xxxxxxxxxxx>
  • Subject: Re: [gnso-wpm-dt] WPM-DT: Step 5 (In Progress)
  • From: Olga Cavalli <olgac@xxxxxxxxxxxxxxx>
  • Date: Thu, 7 Jan 2010 13:37:08 -0300

Thanks so much Ken.
Talk to you soon
regards
Olga

2010/1/7 Ken Bour <ken.bour@xxxxxxxxxxx>:
> This might be a RESEND.  It was originally forwarded from the wrong email
> account.  I just installed Office Pro 2010 and am still getting used to the
> different look and feel of Outlook.
>
>
>
> From: Ken Bour [mailto:kenbour@xxxxxxxxxxx]
> Sent: Thursday, January 07, 2010 10:31 AM
> To: 'gnso-wpm-dt@xxxxxxxxx'
> Subject: WPM-DT: Step 5 (In Progress)
>
>
>
> WPM Members:
>
>
>
> In preparation for our call today (7 January 2010), I have uploaded all of
> our latest documents (e.g. Projects List, Brief Descriptions, Test
> Data/Charts, and X/Y Definitions) to the Adobe Connect room
> (http://icann.na3.acrobat.com/gnsowprioritization/) so that we can review
> them together.
>
>
>
> In addition, I took a stab at modifying both the X and Y definitions in
> light of comments/notes from previous discussions and those on the list
> (changes in red font).   FOR DISCUSSION…
>
>
>
> Y – Value/Benefit … this dimension relates to perceptions of overall value
> and benefit to the following communities:  1) the GNSO; 2) the global
> Internet community; and 3) ICANN stakeholders.  Components of this dimension
> may include, but are not limited to:  new opportunities for Internet
> growth/expansion, enhanced competitiveness, resolution/improvement of
> serious performance or infrastructure problems, increased
> security/stability, and improved user experience.
>
>
>
> X – Resource Consumption … this dimension relates to perceptions of the
> total human capital expenditure anticipated and includes such factors as
> complexity (e.g. technical), intricacy (e.g. many moving parts to
> coordinate), lack of cohesion (e.g. many competing interests), length of
> time/energy expected, and availability/scarcity of resources -- all of which
> contribute to the total resource consumption and overall cost (economic and
> otherwise) required to develop a recommendation.  [Note:  for projects
> already in progress, please estimate only those expenditures remaining from
> the point of rating through to completion of the final recommendation; prior
> historical/sunk expenditures and future implementation/service costs are
> not to be factored into this dimension.]
>
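> To make the two definitions concrete, here is a minimal sketch of how a
> rated project might be represented, assuming a simple Python structure (the
> class name, field names, and quadrant labels are illustrative, not part of
> the Team's definitions):
>
>     from dataclasses import dataclass
>
>     @dataclass
>     class ProjectRating:
>         name: str
>         y_value: int   # Y - Value/Benefit, rated 1-7
>         x_cost: int    # X - Resource Consumption, rated 1-7
>
>         def quadrant(self, midpoint: float = 4.0) -> str:
>             """Place the project relative to the midpoint of the 1-7 scale."""
>             value = "high-value" if self.y_value >= midpoint else "low-value"
>             cost = "high-cost" if self.x_cost >= midpoint else "low-cost"
>             return f"{value}/{cost}"
>
> For example, ProjectRating("Example Project", 6, 3).quadrant() returns
> "high-value/low-cost", presumably the quadrant a prioritization would favor
> first.
>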
> As a tentative agenda, I might suggest the following general approach based
> on my summary from our last call.  As we go through the material, we might
> ask ourselves whether any changes are recommended to work already completed,
> i.e., what worked and what didn’t.  This is not meant to be an exhaustive
> set of topics, but rather some concepts that we might consider in our
> assessment.
>
>
>
> 1)      Quick review of the Team’s objectives (copied below).  Recall that
> our goals were originally drafted as: “To ensure that the process we select
> and recommend:
>
> ·         is user-friendly, unambiguous, and straightforward to execute;
>
> ·         produces realistic outputs that will enable the Council to make
> effective prioritization decisions; and
>
> ·         is structured not only as a one-time exercise, but considers the
> inclusion of new projects as they are proposed in the future.”
>
> 2)      Project list (15) and requirements for inclusion as “active”:
>
> a)      Is the format for the Project List (Tables 1 & 2) and Short
> Descriptions acceptable?
>
> b)     For removed projects, are the categorizations (I, M, X) useful?
>
> c)      How do we finalize the project list and when should that action
> occur?
>
> d)     Should “implementation-only” projects be included (relates to
> definitions below)?  If so, should there be a separate process (and model?)
> for their prioritization?  This project was initially conceived as a way to
> assist the GNSO with the prioritization of limited community resources (vs.
> Staff).  Once a project has been approved, does there continue to be a need
> for substantial community resources, or primarily Staff resources?
>
> 3)      Two-dimensional model and definitions for X and Y:
>
> a)      Ken posted modified definitions to the email list (above)…any
> changes?
>
> b)     Should Y and X be rated by DIFFERENT parties to avoid
> cross-correlation between X and Y (see the ratings sketch after this list)?
>
> 4)      Ratings scale of 1-7, from Far Below Average to Far Above Average:
>
> a)      Is the team still OK with the 7-point scale for both dimensions?
>
> 5)      Individual vs. Group Ratings:
>
> a)      What is the team’s assessment of the pros/cons of individual vs.
> group ratings?
>
> b)     If we recommend group sessions, would we recommend that Councilors
> first complete individual ratings on their own?  Would they be submitted and
> analyzed for commonality, as we did in our test (see the ratings sketch
> after this list)?
>
> c)      If group ratings are recommended, what sizes and configurations
> would the team recommend?
>
> d)     Should this WPM-DT perform another sequence of tests using smaller
> groups (e.g. 2 or 3)?
>
> e)      Should rating sessions be facilitated (by Staff?) using Adobe
> Connect with polling feature?
>
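> Purely as illustration of items 3b and 5b above, here is a minimal sketch of
> how individual 1-7 ratings might be pooled and checked, assuming plain
> Python (the project names, ratings, and choice of median/standard deviation
> are my assumptions, not the Team's method):
>
>     from statistics import correlation, median, pstdev  # correlation: Python 3.10+
>
>     # Hypothetical individual ratings: project -> one (Y, X) pair per rater.
>     ratings = {
>         "Project A": [(6, 3), (5, 4), (7, 3)],
>         "Project B": [(3, 6), (4, 5), (3, 7)],
>         "Project C": [(5, 5), (4, 4), (5, 5)],
>     }
>
>     for name, pairs in ratings.items():
>         ys = [y for y, _ in pairs]
>         xs = [x for _, x in pairs]
>         # Median as a group rating; the spread flags projects where raters
>         # diverge and discussion may be needed (item 5b).
>         print(name, "Y:", median(ys), "+/-", round(pstdev(ys), 2),
>               "X:", median(xs), "+/-", round(pstdev(xs), 2))
>
>     # Item 3b: if Y and X track each other across projects, the two
>     # dimensions may not be rated independently.
>     y_meds = [median([y for y, _ in p]) for p in ratings.values()]
>     x_meds = [median([x for _, x in p]) for p in ratings.values()]
>     print("Y-X correlation:", round(correlation(y_meds, x_meds), 2))
>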
> Assuming no changes to the testing process after this step, the team could
> then focus on HOW (Step 6) it might use the data to develop a prioritization
> (the ultimate goal of this effort).  One possible ordering is sketched
> below.
>
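> As a preview of that Step 6 question, one possible ordering (an assumption
> for illustration, not a Team recommendation) is to sort by Y descending,
> breaking ties with X ascending:
>
>     # Hypothetical (project, Y value, X cost) tuples from a rating exercise.
>     rated = [("Project A", 6, 3), ("Project B", 3, 6), ("Project C", 5, 5)]
>
>     # Highest value first; ties broken by lowest resource consumption.
>     for rank, (name, y, x) in enumerate(
>             sorted(rated, key=lambda r: (-r[1], r[2])), start=1):
>         print(f"{rank}. {name} (Y={y}, X={x})")
>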
>
>
> I look forward to talking with you at 1700 UTC.
>
>
>
> Ken Bour
>
>




