Benchmarking Methodology Working Group                         C. Davids
Internet-Draft                          Illinois Institute of Technology
Intended status: Informational                                V. Gurbani
Expires: May 16, 2015                                 Bell Laboratories,
                                                          Alcatel-Lucent
                                                             S. Poretsky
                                                    Allot Communications
                                                       November 12, 2014


Terminology for Benchmarking Session Initiation Protocol (SIP)  Devices:
                  Basic session setup and registration
                   draft-ietf-bmwg-sip-bench-term-12

Abstract

   This document provides a terminology for benchmarking the Session
   Initiation Protocol (SIP) performance of devices.  Methodology
   related to benchmarking SIP devices is described in the companion
   methodology document.  Using these two documents, benchmarks can be
   obtained and compared for different types of devices such as SIP
   Proxy Servers, Registrars and Session Border Controllers.  The term
   "performance" in this context means the capacity of the device-under-
   test (DUT) to process SIP messages.  Media streams are used only to
   study how they impact the signaling behavior.  The intent of the two
   documents is to provide a normalized set of tests that will enable an
   objective comparison of the capacity of SIP devices.  Test setup
   parameters and a methodology are necessary because SIP allows a wide
   range of configuration and operational conditions that can influence
   performance benchmark measurements.  A standard terminology and
   methodology will ensure that benchmarks have consistent definitions
   and are obtained following the same procedures.

Status of this Memo

   This Internet-Draft is submitted in full conformance with the
   provisions of BCP 78 and BCP 79.

   Internet-Drafts are working documents of the Internet Engineering
   Task Force (IETF).  Note that other groups may also distribute
   working documents as Internet-Drafts.  The list of current Internet-
   Drafts is at http://datatracker.ietf.org/drafts/current/.

   Internet-Drafts are draft documents valid for a maximum of six months
   and may be updated, replaced, or obsoleted by other documents at any
   time.  It is inappropriate to use Internet-Drafts as reference
   material or to cite them other than as "work in progress."




Davids, et al.            Expires May 16, 2015                  [Page 1]

Internet-Draft        SIP Benchmarking Terminology         November 2014


   This Internet-Draft will expire on May 16, 2015.

Copyright Notice

   Copyright (c) 2014 IETF Trust and the persons identified as the
   document authors.  All rights reserved.

   This document is subject to BCP 78 and the IETF Trust's Legal
   Provisions Relating to IETF Documents
   (http://trustee.ietf.org/license-info) in effect on the date of
   publication of this document.  Please review these documents
   carefully, as they describe your rights and restrictions with respect
   to this document.  Code Components extracted from this document must
   include Simplified BSD License text as described in Section 4.e of
   the Trust Legal Provisions and are provided without warranty as
   described in the Simplified BSD License.



































Davids, et al.            Expires May 16, 2015                  [Page 2]

Internet-Draft        SIP Benchmarking Terminology         November 2014


Table of Contents

   1.  Terminology  . . . . . . . . . . . . . . . . . . . . . . . . .  4
   2.  Introduction . . . . . . . . . . . . . . . . . . . . . . . . .  4
     2.1.  Scope  . . . . . . . . . . . . . . . . . . . . . . . . . .  6
   3.  Term Definitions . . . . . . . . . . . . . . . . . . . . . . .  7
     3.1.  Protocol Components  . . . . . . . . . . . . . . . . . . .  7
       3.1.1.  Session  . . . . . . . . . . . . . . . . . . . . . . .  7
       3.1.2.  Signaling Plane  . . . . . . . . . . . . . . . . . . .  8
       3.1.3.  Media Plane  . . . . . . . . . . . . . . . . . . . . .  8
       3.1.4.  Associated Media . . . . . . . . . . . . . . . . . . .  9
       3.1.5.  Overload . . . . . . . . . . . . . . . . . . . . . . .  9
       3.1.6.  Session Attempt  . . . . . . . . . . . . . . . . . . . 10
       3.1.7.  Established Session  . . . . . . . . . . . . . . . . . 10
       3.1.8.  Session Attempt Failure  . . . . . . . . . . . . . . . 11
     3.2.  Test Components  . . . . . . . . . . . . . . . . . . . . . 11
       3.2.1.  Emulated Agent . . . . . . . . . . . . . . . . . . . . 11
       3.2.2.  Signaling Server . . . . . . . . . . . . . . . . . . . 12
       3.2.3.  SIP Transport Protocol . . . . . . . . . . . . . . . . 12
     3.3.  Test Setup Parameters  . . . . . . . . . . . . . . . . . . 13
       3.3.1.  Session Attempt Rate . . . . . . . . . . . . . . . . . 13
       3.3.2.  Establishment Threshold Time . . . . . . . . . . . . . 13
       3.3.3.  Session Duration . . . . . . . . . . . . . . . . . . . 14
       3.3.4.  Media Packet Size  . . . . . . . . . . . . . . . . . . 14
       3.3.5.  Codec Type . . . . . . . . . . . . . . . . . . . . . . 15
     3.4.  Benchmarks . . . . . . . . . . . . . . . . . . . . . . . . 15
       3.4.1.  Session Establishment Rate . . . . . . . . . . . . . . 16
       3.4.2.  Registration Rate  . . . . . . . . . . . . . . . . . . 16
       3.4.3.  Registration Attempt Rate  . . . . . . . . . . . . . . 17
   4.  IANA Considerations  . . . . . . . . . . . . . . . . . . . . . 17
   5.  Security Considerations  . . . . . . . . . . . . . . . . . . . 17
   6.  Acknowledgments  . . . . . . . . . . . . . . . . . . . . . . . 18
   7.  References . . . . . . . . . . . . . . . . . . . . . . . . . . 18
     7.1.  Normative References . . . . . . . . . . . . . . . . . . . 18
      7.2.  Informative References  . . . . . . . . . . . . . . . . . 19
   Authors' Addresses . . . . . . . . . . . . . . . . . . . . . . . . 19















Davids, et al.            Expires May 16, 2015                  [Page 3]

Internet-Draft        SIP Benchmarking Terminology         November 2014


1.  Terminology

   The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT",
   "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in this
   document are to be interpreted as described in BCP 14, RFC2119
   [RFC2119].  RFC 2119 defines the use of these key words to help make
   the intent of standards track documents as clear as possible.  While
   this document uses these key words, it is not a standards track
   document.  The term Throughput is defined in RFC 2544 [RFC2544].

   For the sake of clarity and continuity, this document adopts the
   template for definitions set out in Section 2 of RFC 1242 [RFC1242].

   The term Device Under Test (DUT) is defined in the following BMWG
   documents:

      Device Under Test (DUT) (cf. Section 3.1.1 of RFC 2285 [RFC2285]).

   Many commonly used SIP terms in this document are defined in RFC 3261
   [RFC3261].  For convenience the most important of these are
   reproduced below.  Use of these terms in this document is consistent
   with their corresponding definition in the base SIP specification
   [RFC3261] as amended by [RFC4320], [RFC5393] and [RFC6026].

   o  Call Stateful: A proxy is call stateful if it retains state for a
      dialog from the initiating INVITE to the terminating BYE request.
      A call stateful proxy is always transaction stateful, but the
      converse is not necessarily true.
   o  Stateful Proxy: A logical entity, as defined by [RFC3261], that
      maintains the client and server transaction state machines during
      the processing of a request.  (Also known as a transaction
      stateful proxy.)  The behavior of a stateful proxy is further
      defined in Section 16 of RFC 3261 [RFC3261] .  A transaction
      stateful proxy is not the same as a call stateful proxy.
   o  Back-to-back User Agent: A back-to-back user agent (B2BUA) is a
      logical entity that receives a request and processes it as a user
      agent server (UAS).  In order to determine how the request should
      be answered, it acts as a user agent client (UAC) and generates
      requests.  Unlike a proxy server, it maintains dialog state and
      must participate in all requests sent on the dialogs it has
      established.  Since it is a concatenation of a UAC and a UAS, no
      explicit definitions are needed for its behavior.


2.  Introduction

   Service Providers and IT Organizations deliver Voice Over IP (VoIP)
   and Multimedia network services based on the IETF Session Initiation



Davids, et al.            Expires May 16, 2015                  [Page 4]

Internet-Draft        SIP Benchmarking Terminology         November 2014


   Protocol (SIP) [RFC3261].  SIP is a signaling protocol originally
   intended to be used to dynamically establish, disconnect, and modify
   streams of media between end users.  As it has evolved, it has been
   adopted for use in a growing number of services and applications.
   Many of these result in the creation of a media session, but some do
   not.  Examples of this latter group include text messaging and
   subscription services.  The set of benchmarking terms provided in
   this document is intended for use with any SIP-enabled device
   performing SIP functions in the interior of the network, whether or
   not these result in the creation of media sessions.  The performance
   of end-user devices is outside the scope of this document.

   A number of networking devices have been developed to support SIP-
   based VoIP services.  These include SIP Servers, Session Border
   Controllers (SBC) and Back-to-back User Agents (B2BUA).  These
   devices contain a mix of voice and IP functions whose performance may
   be reported using metrics defined by the equipment manufacturer or
   vendor.  The Service Provider or IT Organization seeking to compare
   the performance of such devices will not be able to do so using these
   vendor-specific metrics, whose conditions of test and algorithms for
   collection are often unspecified.

   SIP functional elements and the devices that include them can be
   configured many different ways and can be organized into various
   topologies.  These configuration and topological choices impact the
   value of any chosen signaling benchmark.  Unless these conditions-of-
   test are defined, a true comparison of performance metrics across
   multiple vendor implementations will not be possible.

   Some SIP-enabled devices terminate or relay media as well as
   signaling.  The processing of media by the device impacts the
   signaling performance.  As a result, the conditions-of-test must
   include information as to whether or not the device under test
   processes media.  If the device processes media during the test, a
   description of the media must be provided.  This document and its
   companion methodology document [I-D.ietf-bmwg-sip-bench-meth] provide
   a set of black-box benchmarks for describing and comparing the
   performance of devices that incorporate the SIP User Agent Client and
   Server functions and that operate in the network's core.

   The definition of SIP performance benchmarks necessarily includes
   definitions of Test Setup Parameters and a test methodology.  These
   enable the Tester to perform benchmarking tests on different devices
   and to achieve comparable results.  This document provides a common
   set of definitions for Test Components, Test Setup Parameters, and
   Benchmarks.  All the benchmarks defined are black-box measurements of
   the SIP signaling plane.  The Test Setup Parameters and Benchmarks
   defined in this document are intended for use with the companion



Davids, et al.            Expires May 16, 2015                  [Page 5]

Internet-Draft        SIP Benchmarking Terminology         November 2014


   Methodology document.

2.1.  Scope

   The scope of this document is summarized as follows:
   o  This terminology document describes SIP signaling performance
      benchmarks for black-box measurements of SIP networking devices.
      Stress and debug scenarios are not addressed in this document.
   o  The DUT must be RFC 3261-capable network equipment.  This may be
      a Registrar, Redirect Server, or Stateful Proxy.  This document
      does not require the intermediary to assume the role of a
      stateless proxy.  A DUT may also include B2BUA or SBC
      functionality.
   o  The Tester acts as multiple "Emulated Agents" (EA) that initiate
      (or respond to) SIP messages as session endpoints and source (or
      receive) associated media for established connections.
   o  SIP Signaling in the presence of media
      *  The media performance is not benchmarked.
      *  Some tests require media, but the use of media is limited to
         observing the performance of SIP signaling.  Tests that require
         media will annotate the media characteristics as a condition of
         test.
      *  The type of DUT dictates whether the associated media streams
         traverse the DUT.  Both scenarios are within the scope of this
         document.
      *  SIP is frequently used to create media streams; the signaling
         plane and media plane are treated as orthogonal to each other
         in this document.  While many devices support the creation of
         media streams, benchmarks that measure the performance of these
         streams are outside the scope of this document and its
         companion methodology document [I-D.ietf-bmwg-sip-bench-meth].
         Tests may be performed with or without the creation of media
         streams.  The presence or absence of media streams MUST be
         noted as a condition of the test as the performance of SIP
         devices may vary accordingly.  Even if the media is used during
         benchmarking, only the SIP performance will be benchmarked, not
         the media performance or quality.
   o  Both INVITE and non-INVITE scenarios (registrations) are addressed
      in this document.  However, benchmarking SIP presence or
      subscribe-notify extensions is not a part of this document.
   o  Different transports -- such as UDP, TCP, SCTP, or TLS -- may be
      used.  The specific transport mechanism MUST be noted as a
      condition of the test as the performance of SIP devices may vary
      accordingly.
   o  REGISTER and INVITE requests may be challenged or remain
      unchallenged for authentication purposes.  Whether or not the
      REGISTER and INVITE requests are challenged is a condition of test
      that will be recorded along with other such parameters that may
      impact the SIP performance of the device or system under test.



Davids, et al.            Expires May 16, 2015                  [Page 6]

Internet-Draft        SIP Benchmarking Terminology         November 2014


   o  Re-INVITE requests are not considered in scope of this document
      since the benchmarks for INVITEs are based on the dialog created
      by the INVITE and not on the transactions that take place within
      that dialog.
   o  Only session establishment is considered for the performance
      benchmarks.  Session disconnect is not considered in the scope of
      this document.  This is because our goal is to determine the
      maximum capacity of the device or system under test, that is, the
      number of simultaneous SIP sessions that the device or system can
      support.  It is true that there are BYE requests being created
      during the test process.  These transactions do contribute to the
      load on the device or system under test and thus are accounted for
      in the metric we derive.  We do not seek a separate metric for the
      number of BYE transactions a device or system can support.
   o  IMS-specific scenarios are not considered, but test cases can be
      applied with 3GPP-specific SIP signaling and the P-CSCF as a DUT.
   o  The benchmarks described in this document are intended for a
      laboratory environment and are not intended to be used on a
      production network.  Some of the benchmarks send enough traffic
      that a denial of service attack is possible if used in production
      networks.


3.  Term Definitions

3.1.  Protocol Components

3.1.1.  Session

   Definition:
      The combination of signaling and media messages and associated
      processing that enable a single SIP-based audio or video call, or
      SIP registration.

   Discussion:
      The term "session" commonly implies a media session.  In this
      document the term is extended to cover the signaling and any media
      specified and invoked by the corresponding signaling.

   Measurement Units:
      N/A.

   Issues:
      None.







Davids, et al.            Expires May 16, 2015                  [Page 7]

Internet-Draft        SIP Benchmarking Terminology         November 2014



   See Also:
      Media Plane
      Signaling Plane
      Associated Media

3.1.2.  Signaling Plane

   Definition:
      The plane in which SIP messages [RFC3261] are exchanged between
      SIP Agents [RFC3261].

   Discussion:
      SIP messages are used to establish sessions in several ways:
      directly between two User Agents [RFC3261], through a Proxy Server
      [RFC3261], or through a series of Proxy Servers.  The Session
      Description Protocol (SDP) is included in the Signaling Plane.

   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Media Plane
      EAs

3.1.3.  Media Plane

   Definition:
      The data plane in which one or more media streams and their
      associated media control protocols (e.g., RTCP [RFC3550]) are
      exchanged between User Agents after a media connection has been
      created by the exchange of signaling messages in the Signaling
      Plane.

   Discussion:
      Media may also be known as the "bearer channel".  The Media Plane
      MUST include the media control protocol, if one is used, and the
      media stream(s).  Examples of media are audio and video.  The
      media streams are described in the SDP of the Signaling Plane.

   Measurement Units:







Davids, et al.            Expires May 16, 2015                  [Page 8]

Internet-Draft        SIP Benchmarking Terminology         November 2014


      N/A.

   Issues:
      None.

   See Also:
      Signaling Plane

3.1.4.  Associated Media

   Definition:
      Media that corresponds to an 'm' line in the SDP payload of the
      Signaling Plane.

   Discussion:
      The format of the media is determined by the SDP attributes for
      the corresponding 'm' line.

   Measurement Units:
      N/A.

   Issues:
      None.


3.1.5.  Overload

   Definition:
      Overload is defined as the state where a SIP server does not have
      sufficient resources to process all incoming SIP messages
      [RFC6357].

   Discussion:
      The distinction between an overload condition and other failure
      scenarios is outside the scope of black box testing and of this
      document.  Under overload conditions, all or a percentage of
      Session Attempts will fail due to lack of resources.  In black box
      testing the cause of the failure is not explored.  The fact that a
      failure occurred, for whatever reason, will trigger the tester to
      reduce the offered load, as described in the companion methodology
      document [I-D.ietf-bmwg-sip-bench-meth].  SIP server resources
      may include CPU processing capacity, network bandwidth, input/
      output queues, or disk resources.  Any combination of resources
      may be fully utilized when a SIP server (the DUT) is in the
      overload condition.  For proxy-only (or intermediary) devices, it
      is expected that the proxy will be driven into overload based on
      the delivery rate of signaling requests.




Davids, et al.            Expires May 16, 2015                  [Page 9]

Internet-Draft        SIP Benchmarking Terminology         November 2014



   Measurement Units:
      N/A.

3.1.6.  Session Attempt

   Definition:
      A SIP INVITE or REGISTER request sent by the EA that has not
      received a final response.

   Discussion:
      The attempted session may be either an invitation to an audio/
      video communication or a registration attempt.  When counting the
      number of session attempts, we include all requests that are
      rejected for lack of authentication information.  The EA needs to
      record the total number of session attempts, including those
      attempts that are routinely rejected by a proxy that requires the
      UA to authenticate itself.  The EA is provisioned to deliver a
      specific number of session attempts per second, but it must also
      count the actual number of session attempts made in a given time
      interval.

   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Session
      Session Attempt Rate

3.1.7.  Established Session

   Definition:
      A SIP session for which the EA acting as the UE/UA has received a
      200 OK message.

   Discussion:
      An Established Session may be either an invitation to an audio/
      video communication or a registration attempt.  Early dialogs
      for INVITE requests are out of scope for this work.

   Measurement Units:







Davids, et al.            Expires May 16, 2015                 [Page 10]

Internet-Draft        SIP Benchmarking Terminology         November 2014


      N/A.

   Issues:
      None.

   See Also:
      None.

3.1.8.  Session Attempt Failure

   Definition:
      A session attempt that does not result in an Established Session.

   Discussion:
      The session attempt failure may be indicated by the following
      observations at the EA:
      1.  Receipt of a SIP 3xx-, 4xx-, 5xx-, or 6xx-class response to a
          Session Attempt.
      2.  The lack of any received SIP response to a Session Attempt
          within the Establishment Threshold Time (cf. Section 3.3.2).
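
      Purely as an informal illustration (not part of the terminology),
      the sketch below expresses these two conditions in Python; the
      response representation and function name are hypothetical.

         # Informative sketch: classifying a Session Attempt at the EA.
         # 'status' is the final SIP response code received, or None if
         # no response arrived within the Establishment Threshold Time.

         def attempt_failed(status, timed_out):
             """True if the attempt is a Session Attempt Failure."""
             if timed_out:           # condition 2: no response in time
                 return True
             return status is not None and 300 <= status <= 699

         print(attempt_failed(200, False))  # False: session established
         print(attempt_failed(486, False))  # True: 4xx-class response
         print(attempt_failed(None, True))  # True: threshold exceeded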


   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Session Attempt

3.2.  Test Components

3.2.1.  Emulated Agent

   Definition:
      A device in the test topology that initiates/responds to SIP
      messages as one or more session endpoints and, wherever
      applicable, sources/receives Associated Media for Established
      Sessions.

   Discussion:
      The EA functions in the Signaling and Media Planes.  The Tester
      may act as multiple EAs.







Davids, et al.            Expires May 16, 2015                 [Page 11]

Internet-Draft        SIP Benchmarking Terminology         November 2014



   Measurement Units:
      N/A

   Issues:
      None.

   See Also:
      Media Plane
      Signaling Plane
      Established Session
      Associated Media

3.2.2.  Signaling Server

   Definition:
      A device in the test topology that facilitates the creation of
      sessions between EAs.  This device is the DUT.

   Discussion:
      The DUT is an RFC 3261-capable network intermediary such as a
      Registrar, Redirect Server, Stateful Proxy, B2BUA, or SBC.

   Measurement Units:
      N/A.

   Issues:
      None.

   See Also:
      Signaling Plane

3.2.3.  SIP Transport Protocol

   Definition:
      The protocol used for transport of the Signaling Plane messages.

   Discussion:
      Performance benchmarks may vary for the same SIP networking device
      depending upon whether TCP, UDP, TLS, SCTP, WebSocket [RFC7118],
      or any future transport-layer protocol is used.  For this reason,
      it is necessary to measure the SIP Performance Benchmarks using
      these various transport protocols.  Performance Benchmarks MUST
      report the SIP Transport Protocol used to obtain the benchmark
      results.
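
      As an informal aid only, the sketch below shows one way a Tester
      might record the SIP Transport Protocol together with other
      conditions of test mentioned in this document; the field names
      are illustrative only.

         # Informative sketch: conditions of test reported alongside
         # the benchmark results.  Field names are illustrative only.
         from dataclasses import dataclass

         @dataclass
         class TestConditions:
             sip_transport: str      # e.g. "UDP", "TCP", "TLS-over-TCP"
             associated_media: bool  # media streams present or not
             codec_type: str         # codec name, if media is present
             auth_challenged: bool   # REGISTER/INVITE challenged or not

         print(TestConditions("TLS-over-TCP", True, "G.711", False))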






Davids, et al.            Expires May 16, 2015                 [Page 12]

Internet-Draft        SIP Benchmarking Terminology         November 2014



   Measurement Units:
      While these are not units of measure, they are attributes that are
      among the many factors that will contribute to the value of the
      measurements to be taken.  TCP, UDP, SCTP, TLS over TCP, TLS over
      UDP, TLS over SCTP, and WebSocket are among the possible values
      to be recorded as part of the test.

   Issues:
      None.

   See Also:
      None.

3.3.  Test Setup Parameters

3.3.1.  Session Attempt Rate

   Definition:
      Configuration of the EA for the number of sessions per second
      (sps) that the EA attempts to establish using the services of the
      DUT.

   Discussion:
      The Session Attempt Rate is the number of sessions per second that
      the EA sends toward the DUT.  Some of the sessions attempted may
      not result in a session being established.
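
      For illustration only, the following sketch shows how an EA might
      pace attempts at a configured Session Attempt Rate and count the
      attempts actually made; send_attempt() is a hypothetical
      placeholder for sending an INVITE or REGISTER toward the DUT.

         # Informative sketch: pacing session attempts at a configured
         # rate (sps) and counting the attempts actually made.
         import time

         def offer_load(send_attempt, attempt_rate_sps, duration_s):
             """Offer attempts at attempt_rate_sps for duration_s."""
             interval = 1.0 / attempt_rate_sps
             start = time.monotonic()
             attempts = 0
             while time.monotonic() - start < duration_s:
                 send_attempt()          # e.g. send INVITE or REGISTER
                 attempts += 1
                 next_due = start + attempts * interval
                 time.sleep(max(0.0, next_due - time.monotonic()))
             return attempts             # actual attempts in interval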

   Measurement Units:
      Session attempts per second

   Issues:
      None.

   See Also:
      Session
      Session Attempt

3.3.2.  Establishment Threshold Time

   Definition:
      Configuration of the EA that represents the amount of time that an
      EA client will wait for a response from an EA server before
      declaring a Session Attempt Failure.







Davids, et al.            Expires May 16, 2015                 [Page 13]

Internet-Draft        SIP Benchmarking Terminology         November 2014



   Discussion:
      This time duration is test dependent.

      It is RECOMMENDED that the Establishment Threshold Time value be
      set to Timer B or Timer F as specified in RFC 3261, Table 4
      [RFC3261].
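
      As an informal aid, the sketch below computes this RECOMMENDED
      default: Timers B and F are both defined in RFC 3261 as 64*T1,
      and T1 defaults to 500 ms, giving 32 seconds.

         # Informative sketch: default Establishment Threshold Time
         # derived from RFC 3261 (Timer B = Timer F = 64 * T1).

         T1_DEFAULT_S = 0.5   # RFC 3261 default T1, in seconds

         def establishment_threshold_time(t1_s=T1_DEFAULT_S):
             """Timer B/F value, in seconds, for a given T1."""
             return 64 * t1_s

         print(establishment_threshold_time())   # 32.0 seconds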

   Measurement Units:
      Seconds

   Issues:
      None.

   See Also:
      None.

3.3.3.  Session Duration

   Definition:
      Configuration of the EA that represents the amount of time that
      the SIP dialog is intended to exist between the two EAs associated
      with the test.

   Discussion:
      The time at which the BYE is sent will control the Session
      Duration.

   Measurement Units:
      Seconds

   Issues:
      None.

   See Also:
      None.

3.3.4.  Media Packet Size

   Definition:
      Configuration of the EA for a fixed number of frames or samples to
      be sent in each RTP packet of the media stream when the test
      involves Associated Media.








Davids, et al.            Expires May 16, 2015                 [Page 14]

Internet-Draft        SIP Benchmarking Terminology         November 2014


   Discussion:
      This document describes a method to measure SIP performance.  If
      the DUT is processing media as well as SIP messages, the media
      processing will potentially slow down the SIP processing and lower
      the SIP performance metric.  The tests with associated media are
      designed for audio codecs, and the assumption is that larger
      media packets require more processor time.  This document does
      not define parameters applicable to video codecs.

      For a single benchmark test, media sessions use a defined number
      of samples or frames per RTP packet.  If two SBCs, for example,
      use the same codec but one puts more frames into each RTP packet,
      this might cause variation in the performance benchmark results.
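
      For illustration only, the sketch below converts a packetization
      interval into the number of samples or frames carried per RTP
      packet; the codec figures shown (G.711 sampled at 8000 Hz, G.729
      using 10 ms frames) are common published values used as examples.

         # Informative sketch: frames or samples per RTP packet for a
         # given packetization time (ptime).

         def samples_per_packet(sample_rate_hz, ptime_ms):
             """Sample-based codec: samples in one RTP packet."""
             return int(sample_rate_hz * ptime_ms / 1000)

         def frames_per_packet(frame_ms, ptime_ms):
             """Frame-based (hybrid) codec: frames in one RTP packet."""
             return int(ptime_ms / frame_ms)

         print(samples_per_packet(8000, 20))   # G.711, 20 ms: 160
         print(frames_per_packet(10, 20))      # G.729, 20 ms: 2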

   Measurement Units:
      An integer number of frames or samples, depending on whether a
      hybrid- or a sample-based codec is used, respectively.

   Issues:
      None.

   See Also:
      None.

3.3.5.  Codec Type

   Definition:
      The name of the codec used to generate the media session.

   Discussion:
      For a single benchmark test, all sessions use the same size packet
      for media streams.  The size of packets can cause a variation in
      the performance benchmark measurements.

   Measurement Units:
      This is a textual name (alphanumeric) assigned to uniquely
      identify the codec.

   Issues:
      None.

   See Also:
      None.

3.4.  Benchmarks







Davids, et al.            Expires May 16, 2015                 [Page 15]

Internet-Draft        SIP Benchmarking Terminology         November 2014


3.4.1.  Session Establishment Rate

   Definition:
      The maximum value of the Session Attempt Rate that the DUT can
      handle for an extended, pre-defined period with zero failures.

   Discussion:
      This benchmark is obtained with zero failures.  The session
      attempt rate provisioned on the EA is raised and lowered as
      described in the algorithm in the accompanying methodology
      document [I-D.ietf-bmwg-sip-bench-meth], until a traffic load
      over the period of time necessary to attempt N sessions completes
      without failure, where N is a parameter specified in the algorithm
      and recorded in the Test Setup Report.
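
      The authoritative procedure is the algorithm in the methodology
      document; purely as an informal illustration, the sketch below
      shows one way such a raise-and-lower search could be structured.
      run_attempts(rate_sps, n) is a hypothetical hook into the EA that
      attempts n sessions at the given rate and returns the number of
      failures observed.

         # Informative sketch: search for the highest attempt rate (in
         # integer sps) at which N attempts complete with zero failures.

         def session_establishment_rate(run_attempts, n,
                                        low=1, high=2000):
             """Largest integer rate (sps) with zero failures."""
             best = 0
             while low <= high:
                 rate = (low + high) // 2
                 failures = run_attempts(rate, n)
                 if failures == 0:    # DUT kept up; try a higher rate
                     best = rate
                     low = rate + 1
                 else:                # failures seen; lower the rate
                     high = rate - 1
             return best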

   Measurement Units:
      sessions per second (sps)

   Issues:
      None.

   See Also:
      Invite-Initiated Sessions
      Non-Invite-Initiated Sessions
      Session Attempt Rate

3.4.2.  Registration Rate

   Definition:
      The maximum value of the Registration Attempt Rate that the DUT
      can handle for an extended, pre-defined period with zero
      failures.

   Discussion:
      This benchmark is obtained with zero failures.  The registration
      rate provisioned on the Emulated Agent is raised and lowered as
      described in the algorithm in the companion methodology draft
      [I-D.ietf-bmwg-sip-bench-meth], until a traffic load consisting of
      registration attempts at the given attempt rate over the period of
      time necessary to attempt N registrations completes without
      failure, where N is a parameter specified in the algorithm and
      recorded in the Test Setup Report.
      This benchmark is described separately from the Session
      Establishment Rate (Section 3.4.1), although it could be
      considered a special case of that benchmark, since a REGISTER
      request is a request for a Non-Invite-Initiated session.  It is
      defined separately because it is a very important benchmark for
      most SIP installations.  An example demonstrating its use is an



Davids, et al.            Expires May 16, 2015                 [Page 16]

Internet-Draft        SIP Benchmarking Terminology         November 2014


      avalanche restart, where hundreds of thousands of endpoints
      register simultaneously following a power outage.  In such a case,
      an authoritative measurement of the capacity of the device to
      register endpoints is useful to the network designer.
      Additionally, in certain controlled networks, there appears to be
      a difference between the registration rate of new endpoints and
      the re-registration rate of existing endpoints (registration
      refreshes).
      This benchmark can capture these differences as well.
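
      As an informal illustration of how this benchmark might be used,
      the sketch below estimates how long an avalanche restart would
      take to clear at a measured Registration Rate; the endpoint count
      and rate are purely hypothetical examples, and retransmissions
      and retry back-off are ignored.

         # Informative sketch: time for a population of endpoints to
         # re-register after an avalanche restart.

         def avalanche_recovery_s(endpoints, rate_rps):
             """Seconds to register 'endpoints' at rate_rps."""
             return endpoints / rate_rps

         # e.g. 100,000 endpoints at a benchmarked 500 rps -> 200.0 s
         print(avalanche_recovery_s(100_000, 500.0))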

   Measurement Units:
      registrations per second (rps)

   Issues:
      None.

   See Also:
      None.

3.4.3.  Registration Attempt Rate

   Definition:
      Configuration of the EA for the number of registrations per second
      that the EA attempts to send to the DUT.

   Discussion:
      The Registration Attempt Rate is the number of registration
      requests per second that the EA sends toward the DUT.

   Measurement Units:
      Registrations per second (rps)

   Issues:
      None.

   See Also:
      Non-Invite-Initiated Session


4.  IANA Considerations

   This document requires no IANA considerations.


5.  Security Considerations

   Documents of this type do not directly affect the security of
   Internet or corporate networks as long as benchmarking is not
   performed on devices or systems connected to production networks.
   Security threats and how to counter these in SIP and the media layer



Davids, et al.            Expires May 16, 2015                 [Page 17]

Internet-Draft        SIP Benchmarking Terminology         November 2014


   are discussed in RFC 3261 [RFC3261], RFC 3550 [RFC3550], and RFC 3711
   [RFC3711].  This document attempts to formalize a set of common
   terminology for benchmarking SIP networks.  Packets with unintended
   and/or unauthorized DSCP or IP precedence values may present security
   issues.  Determining the security consequences of such packets is out
   of scope for this document.


6.  Acknowledgments

   The authors would like to thank Keith Drage, Cullen Jennings, Daryl
   Malas, Al Morton, and Henning Schulzrinne for invaluable
   contributions to this document.  Dale Worley provided an extensive
   review that led to improvements in the documents.  We are grateful to
   Barry Constantine, William Cerveny, and Robert Sparks for providing
   valuable comments during the document's last calls and expert
   reviews.  Al Morton and Sarah Banks have been exemplary working group
   chairs; we thank them for tracking this work to completion.


7.  References

7.1.  Normative References

   [RFC2119]  Bradner, S., "Key words for use in RFCs to Indicate
              Requirement Levels", BCP 14, RFC 2119, March 1997.

   [RFC2544]  Bradner, S. and J. McQuaid, "Benchmarking Methodology for
              Network Interconnect Devices", RFC 2544, March 1999.

   [RFC3261]  Rosenberg, J., Schulzrinne, H., Camarillo, G., Johnston,
              A., Peterson, J., Sparks, R., Handley, M., and E.
              Schooler, "SIP: Session Initiation Protocol", RFC 3261,
              June 2002.

   [RFC5393]  Sparks, R., Lawrence, S., Hawrylyshen, A., and B. Campen,
              "Addressing an Amplification Vulnerability in Session
              Initiation Protocol (SIP) Forking Proxies", RFC 5393,
              December 2008.

   [RFC4320]  Sparks, R., "Actions Addressing Identified Issues with the
              Session Initiation Protocol's (SIP) Non-INVITE
              Transaction", RFC 4320, January 2006.

   [RFC6026]  Sparks, R. and T. Zourzouvillys, "Correct Transaction
              Handling for 2xx Responses to Session Initiation Protocol
              (SIP) INVITE Requests", RFC 6026, September 2010.




Davids, et al.            Expires May 16, 2015                 [Page 18]

Internet-Draft        SIP Benchmarking Terminology         November 2014


   [I-D.ietf-bmwg-sip-bench-meth]
              Davids, C., Gurbani, V., and S. Poretsky, "SIP Performance
              Benchmarking Methodology",
              draft-ietf-bmwg-sip-bench-meth-10 (work in progress),
              May 2014.

7.2.  Informative References

   [RFC2285]  Mandeville, R., "Benchmarking Terminology for LAN
              Switching Devices", RFC 2285, February 1998.

   [RFC1242]  Bradner, S., "Benchmarking terminology for network
              interconnection devices", RFC 1242, July 1991.

   [RFC3550]  Schulzrinne, H., Casner, S., Frederick, R., and V.
              Jacobson, "RTP: A Transport Protocol for Real-Time
              Applications", STD 64, RFC 3550, July 2003.

   [RFC3711]  Baugher, M., McGrew, D., Naslund, M., Carrara, E., and K.
              Norrman, "The Secure Real-time Transport Protocol (SRTP)",
              RFC 3711, March 2004.

   [RFC6357]  Hilt, V., Noel, E., Shen, C., and A. Abdelal, "Design
              Considerations for Session Initiation Protocol (SIP)
              Overload Control", RFC 6357, August 2011.

   [RFC7118]  Baz Castillo, I., Millan Villegas, J., and V. Pascual,
              "The Websocket Protocol as a Transport for the Session
              Initiation Protocol (SIP)", RFC 7118, January 2014.


Authors' Addresses

   Carol Davids
   Illinois Institute of Technology
   201 East Loop Road
   Wheaton, IL  60187
   USA

   Phone: +1 630 682 6024
   Email: davids@iit.edu










Davids, et al.            Expires May 16, 2015                 [Page 19]

Internet-Draft        SIP Benchmarking Terminology         November 2014


   Vijay K. Gurbani
   Bell Laboratories, Alcatel-Lucent
   1960 Lucent Lane
   Rm 9C-533
   Naperville, IL  60566
   USA

   Phone: +1 630 224 0216
   Email: vkg@bell-labs.com


   Scott Poretsky
   Allot Communications
   300 TradeCenter, Suite 4680
   Woburn, MA  08101
   USA

   Phone: +1 508 309 2179
   Email: sporetsky@allot.com
































Davids, et al.            Expires May 16, 2015                 [Page 20]