Re: [soac-newgtldapsup-wg] charter language
- To: soac-newgtldapsup-wg@xxxxxxxxx
- Subject: Re: [soac-newgtldapsup-wg] charter language
- From: Richard Tindal <richardtindal@xxxxxx>
- Date: Wed, 12 May 2010 12:00:17 -0700
All,
This has been a very useful exchange between Alan and Anthony. I encourage
everyone on the WG to read the recent emails between them (one below and two
previously on the list).
Here are some additional thoughts on the notion that some applications will be
cheaper to process than others. If we plan to make the argument that our 'Who
and What' applicants have cheaper processing costs, I think we need to explore
and resolve the following:
A. Administration of the objections that some applicants may face (trademark,
morality, confusing similarity, community) is not funded from the $185K
evaluation fee. These objections are funded separately, on a loser-pays basis.
B. I believe the auction process for resolving contention sets will be
self-funding (i.e., the auctioneer will be paid out of the proceeds), so we
cannot argue that our applicants are paying for this process.
C. In some cases the cost-averaging approach will benefit our applicants, for
example:
(i) We may include Community status as one of our 'Who' criteria. If so, we
should note that all applicants fund the Community review process (out of
their $185K). As such, non-Community applicants will fund the review of
Community applicants.
(ii) I believe ICANN staff have rightly assumed that some applicants will
apply for multiple strings but use the same technical system in each
application. These applicants will not undergo a separate technical evaluation
for each string (at least not to the same level of detail in each evaluation).
The cost-averaging approach spreads the benefit of this efficiency across all
applicants. For example, a commercial applicant who applies for 3 strings will
pay for 3 technical evaluations -- even though the same technical system is
being evaluated each time. As our applicants will typically apply for only one
string, they receive a disproportionate benefit from this averaging (see the
illustrative sketch below).
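To make the cost-averaging point in (ii) concrete, here is a minimal
back-of-the-envelope sketch in Python. Only the $185K total fee comes from
this thread; the split of that fee and the applicant mix are hypothetical
figures, chosen purely to show the direction of the effect rather than ICANN's
actual cost model.

    # Illustrative cost-averaging comparison for point C(ii).
    # Only the $185K total fee appears in the thread; everything else below
    # (fee split, applicant mix) is a hypothetical, illustrative assumption.
    BASE_COST = 120_000      # hypothetical per-application cost excluding technical evaluation (USD)
    TECH_EVAL_COST = 65_000  # hypothetical cost of one full technical evaluation (USD)

    # Hypothetical pool: 400 single-string applicants plus 20 portfolio
    # applicants who each file 5 strings on one shared back-end.
    single_string_apps = 400
    portfolio_applicants = 20
    strings_per_portfolio = 5

    total_apps = single_string_apps + portfolio_applicants * strings_per_portfolio
    distinct_systems = single_string_apps + portfolio_applicants  # one in-depth evaluation per distinct back-end

    # Cost actually incurred if a shared back-end is evaluated in depth only once.
    total_cost = total_apps * BASE_COST + distinct_systems * TECH_EVAL_COST

    averaged_fee = total_cost / total_apps     # what cost averaging charges every applicant
    fully_loaded = BASE_COST + TECH_EVAL_COST  # stand-alone cost of evaluating one string

    print(f"Averaged fee per application: US${averaged_fee:,.0f}")  # ~US$174,600
    print(f"Fully loaded per-string cost: US${fully_loaded:,.0f}")  # US$185,000

Under these assumptions the averaged fee comes in below the fully loaded
per-string cost, because portfolio applicants pay per string while their
shared back-end is evaluated once; single-string applicants capture that
saving without generating it.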
Comments welcome on all of this.
RT
On May 12, 2010, at 9:08 AM, Alan Greenberg wrote:
> Two replies to Tony.
>
> At 12/05/2010 10:49 AM, Anthony Harris wrote:
>
>> The rationale was that the gTLD process development costs reduced ICANN's
>> ability to build its reserve, and that these costs, when recovered from new
>> applications, would go into the reserve. There was a strong negative
>> reaction to this at the time it was first introduced, and no change was
>> made. I am not optimistic that, at a point where the projected FY11 budget
>> is rather constrained and the contribution to the reserve has been reduced,
>> this is a productive path to follow.
>> Obviously any proposal we come up with may well be rejected; nonetheless,
>> as far as budgets go, I have difficulty feeling concerned about them when,
>> on page 3 of the ICANN explanation document, we are told that if 500
>> applications are presented, this will mean a total intake of US$92.5
>> million!
>
> True, but the majority of this is aimed at paying the cost of processing
> (using the one-fee-fits-all formula). It may also be interesting to question
> whether the 500 is still a "reasonable" number, but I am afraid that my
> crystal ball is in for repairs today.
>
>
>> One possibility is that we do not argue against recovering these sunk costs
>> in general, but that we do recommend that they be waived for whatever group
>> of applicants meets the criteria developed under Objective 1. That will be a
>> moderately small percentage of the overall applicant group and may be
>> palatable.
>> A very good suggestion. But what about the US$60,000 "just in case" risk
>> contribution that is part of the application fee?
>
> When I first heard about the $60k "risk" component, my reaction was exactly
> the same - why assess this for applications that are far less likely to be
> "risky" (that is, risky during the application process, not whether the TLD
> will succeed or fail once deployed)? But on re-reading the description, that
> is not the primary rationale. The component addresses the risk that ICANN
> receives far fewer or far more than 500 applications, that it botched the
> cost estimate for some of the tasks, or that it completely forgot or ignored
> some aspect of the process and thus did not cost it at all. As mentioned, it
> was derived through a complex process that will be difficult or impossible
> to audit, and its validity rests solely on the confidence level in the
> inputs to the process, the theoretical soundness of the process that was
> used, and the correct implementation of that process. I fear we are in a
> position to judge none of that.
>
> One could argue that the process for sponsored or not-for-profit TLDs will
> inherently be smoother and not as subject to all of these worries, but that
> attacks the one-fee-fits-all principle.
>
> Alan
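
As a quick sanity check on the figures quoted above, the sketch below (Python,
using only the numbers mentioned in this thread: the US$185K fee, the US$60K
risk component within it, and the 500-application planning assumption) simply
re-derives the totals.

    # Arithmetic check on the figures quoted in this thread.
    EVALUATION_FEE = 185_000  # USD per application (the risk component is part of this fee)
    RISK_COMPONENT = 60_000   # USD "just in case" portion of the fee
    PROJECTED_APPS = 500      # planning assumption in the ICANN explanation document

    total_intake = PROJECTED_APPS * EVALUATION_FEE  # 92,500,000 -> the US$92.5 million Tony cites
    risk_pool = PROJECTED_APPS * RISK_COMPONENT     # 30,000,000 held against estimation error

    print(f"Projected fee intake at 500 applications: US${total_intake:,}")
    print(f"Of which aggregate risk contribution:     US${risk_pool:,}")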