
Content for TS 33.117, Word version 17.2.0


4.4  Basic vulnerability testing requirements

4.4.1  Introduction |R16|

Basic Vulnerability Testing activities consist of requirements for running automated Free and Open Source Software (FOSS) and Commercial off-the-shelf (COTS) security testing tools against the external interfaces of a Network Product. These activities cover at least four aspects: port scanning, vulnerability scanning by the use of vulnerability scanners, robustness/fuzz testing, and endpoint scanning. For each of these aspects, test requirements and test results are described in the present clause.

4.4.2  Port Scanning |R16|

Requirement Name:
Port scanning
Requirement Description:
It shall be ensured that on all network interfaces, only documented ports on the transport layer respond to requests from outside the system.
The test for this requirement can be carried out using a suitable tool or manually performed as described below. If a tool is used then the tester needs to provide evidence, e.g. by referring to the documentation of the tool, that the tool actually provides functionality equivalent to the steps described below.
Test Case:
Test Name:
TC_BVT_PORT_SCANNING
Purpose:
To ensure that on all network interfaces, only documented ports on the transport layer respond to requests from outside the system.
Procedure and execution steps:
Pre-Conditions:
A list of all available network services containing at least the following information shall be included in the documentation accompanying the Network Product:
  1. all interfaces providing IP-based protocols;
  2. the available transport layer protocols on these interfaces;
  3. their open ports and associated services per transport layer protocol;
  4. and a free-form description of their purposes.
The port scanning tool that is used shall be capable of detecting open ports on the relevant transport layer protocols.
Execution Steps:
The accredited evaluator's test lab is required to execute the following steps:
  1. Verification of the compliance to the prerequisites:
    1. Verification that the list of available network services is available in the documentation of the Network Product
    2. Validation that all entries in the list of services are meaningful and reasonably necessary for the operation of the Network Product class
  2. Identification of the open ports by means of capable port scanning tools or other suitable testing means
  3. Verification that the list of identified open ports matches the list of available network services in the documentation of the Network Product (an illustrative sketch of steps 2 and 3 is given after these steps)
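The sketch below is a minimal illustration of execution steps 2 and 3: a plain TCP connect probe followed by a comparison against the documented services list. The target address, port range and DOCUMENTED_SERVICES table are placeholder values, only TCP is covered, and a real test would rely on a full-featured scanner (e.g. nmap) covering all transport layer protocols named in the pre-conditions.

```python
#!/usr/bin/env python3
"""Minimal TCP connect scan plus comparison with the documented services list.

Illustrative only: the target address, port range and DOCUMENTED_SERVICES
table are example values, not part of TS 33.117.
"""
import socket

TARGET = "192.0.2.10"        # placeholder address; replace with the Network Product interface
PORT_RANGE = range(1, 1025)  # example range; a complete scan covers all 65535 ports
TIMEOUT = 0.5                # seconds per connection attempt

# Example shape of the documented network services list (pre-condition)
DOCUMENTED_SERVICES = {
    22: "SSH - O&M access",
    443: "HTTPS - management interface",
}

def tcp_connect_scan(host, ports, timeout=TIMEOUT):
    """Return the set of TCP ports that accept a connection (i.e. respond as open)."""
    open_ports = set()
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:
                open_ports.add(port)
    return open_ports

if __name__ == "__main__":
    found = tcp_connect_scan(TARGET, PORT_RANGE)
    undocumented = found - set(DOCUMENTED_SERVICES)    # open but not documented: discrepancy
    not_responding = set(DOCUMENTED_SERVICES) - found  # documented but not found open: discrepancy
    print("Open TCP ports:", sorted(found))
    print("Undocumented open ports:", sorted(undocumented))
    print("Documented ports not found open:", sorted(not_responding))
```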
Expected Results:
The used tool(s) name, their unambiguous version (also for plug-ins if applicable), the settings used, and the relevant output containing all the technically relevant information about the test results constitute the evidence and shall be part of the testing documentation.
All discrepancies between the list of identified open ports and the list of available network services in the documentation shall be highlighted in the testing documentation.
Expected format of evidence:
Output of port scan and list of identified discrepancies.

4.4.3  Vulnerability scanning |R16|

Requirement Name:
Vulnerability scanning
Requirement Description:
The purpose of vulnerability scanning is to ensure that there are no known vulnerabilities (or that relevant vulnerabilities are identified and remediation plans in place to mitigate them) on the Network Product, both in the OS and in the applications installed, that can be detected by means of automatic testing tools via the Internet Protocol enabled network interfaces.
Vulnerability scanning tools may also report false positives; these shall be investigated and documented in the test report.
The test for this requirement can be carried out using a suitable tool or manually performed as described below. If a tool is used then the tester needs to provide evidence, e.g. by referring to the documentation of the tool, that the tool actually provides functionality equivalent to the steps described below.
Test case:
Test Name:
TC_BVT_VULNERABILITY_SCANNING
Purpose:
The purpose of vulnerability scanning is to ensure that there are no known vulnerabilities (or that relevant vulnerabilities are identified and remediation plans in place to mitigate them) on the Network Product that can be detected by means of automatic testing tools via the Internet Protocol enabled network interfaces.
Procedure and execution steps:
Pre-Conditions:
A list of all available network services containing at least the following information shall be included in the documentation accompanying the Network Product:
  • all interfaces providing IP-based protocols;
  • the available transport layer protocols on these interfaces;
  • their open ports and associated services;
  • and a free-form description of their purposes.
The used vulnerability scanning tool shall be capable of detecting known vulnerabilities on common services. The used vulnerability information shall be reasonably recent at the time of testing.
Execution Steps:
The accredited evaluator's test lab is required to execute the following steps:
  1. Execution of the suitable vulnerability scanning tool against all interfaces providing IP-based protocols of the Network Product.
  2. Evaluation of the results based on their severity.
Expected Results:
The used tool(s) name, their unambiguous version (also for plug-ins if applicable), the settings used, and the relevant output constitute the evidence and shall be part of the testing documentation.
The discovered vulnerabilities (including source, example CVE ID), together with a rating of their severity, shall be highlighted in the testing documentation.
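As a sketch of how the discovered vulnerabilities could be collated by severity for the testing documentation, the example below reads a hypothetical JSON export of scanner findings and groups them using the CVSS v3 qualitative severity bands. The file name and field names (cve, cvss, service, description) are assumptions made for the example; actual scanner export formats differ per tool.

```python
#!/usr/bin/env python3
"""Group scanner findings by severity for the testing documentation.

Illustrative only: 'scan_results.json' and its fields (cve, cvss, service,
description) are a hypothetical export format, not a specific tool's output.
"""
import json

# CVSS v3.x qualitative severity bands (score thresholds)
BANDS = [(9.0, "CRITICAL"), (7.0, "HIGH"), (4.0, "MEDIUM"), (0.1, "LOW"), (0.0, "NONE")]

def severity(cvss_score):
    """Map a CVSS base score to its qualitative severity rating."""
    for threshold, label in BANDS:
        if cvss_score >= threshold:
            return label
    return "NONE"

def summarize(path="scan_results.json"):
    """Print the findings grouped from highest to lowest severity."""
    with open(path) as f:
        findings = json.load(f)  # expected: a list of finding objects
    by_severity = {}
    for finding in findings:
        label = severity(float(finding.get("cvss", 0.0)))
        by_severity.setdefault(label, []).append(finding)
    for label in ("CRITICAL", "HIGH", "MEDIUM", "LOW", "NONE"):
        for finding in by_severity.get(label, []):
            print(f"{label:8} {finding.get('cve', 'n/a'):16} "
                  f"{finding.get('service', '?')}: {finding.get('description', '')}")

if __name__ == "__main__":
    summarize()
```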
COTS vulnerability scanners, by their nature (e.g. depending on how they are configured), may result in false findings/positives. The tool's documentation may even mention that a failing test shall be repeated to check whether it is really a recurring problem or not. The tester shall make best effort to determine whether there is an issue with the network element (NE) or the test tool and, if necessary, work with the vendor of the network product to come to a consensus on the test result outcome.
Expected format of evidence:
Output of BVT tool.

4.4.4  Robustness and fuzz testing |R16|

Requirement Name:
Robustness and fuzz testing
Requirement Reference:
Clause 4.2.6.2.2 - Interface Robustness requirements
Requirement Description:
It shall be ensured that externally reachable services are reasonably robust when receiving unexpected input.
Test case:
Test Name:
TC_BVT_ROBUSTNESS AND FUZZ TESTING
Purpose:
To verify that the network product provides externally reachable services which are robust against unexpected input. The targets of this test are the protocol stacks (e.g. the Diameter stack) rather than the applications (e.g. a web application).
Procedure and execution steps:
Pre-Conditions:
  • The tester has the privileges to log in to the network product and to access all system resources (e.g. log files).
  • A list of all available network services containing at least the following information shall be included in the documentation accompanying the Network Product:
    - all interfaces providing IP-based protocols;
    - the available transport layer protocols on these interfaces;
    - their open ports and associated services;
    - and a free-form description of their purposes.
  • The robustness and fuzzing tools that are selected for this test shall utilize state-of-the-art technology to identify input which causes the Network Product to behave in an unspecified, undocumented, or unexpected manner.
  • Fuzz testing tools are a highly sophisticated technology, and adaptation to the individual protocols in question is needed for them to be effective. Therefore, there is a lack of effective fuzz testing tools available, especially for protocols proprietary to the Telco industry. Taking into account note 4 of clause 7.2.4 of TR 33.916, test labs shall acquire fuzz testing tools for those protocols where commercially feasible.
  • It needs to be taken into account that fuzz testing tools might show drastic differences in terms of effectiveness. The accredited test lab is expected to have sufficient expertise to recognize the level of effectiveness of the available tools.
  • A network traffic analyser on the network product (e.g. TCPDUMP) or an external traffic analyser directly connected to the network product and on a tester machine is available. (A minimal capture sketch is given after this list.)
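As a minimal illustration of the traffic-analyser pre-condition, the sketch below wraps an external tcpdump capture around a test run from a tester machine. The interface name, target address and capture file are placeholders; any equivalent analyser on the network product itself can be used instead.

```python
#!/usr/bin/env python3
"""Start and stop an external tcpdump capture around a test run (illustrative only).

The interface name, target address and capture file below are placeholders;
tcpdump typically requires root privileges.
"""
import subprocess

def start_capture(interface="eth0", target="192.0.2.10", outfile="fuzz_session.pcap"):
    """Capture all traffic to/from the Network Product into a pcap file."""
    return subprocess.Popen(["tcpdump", "-i", interface, "-w", outfile, "host", target])

if __name__ == "__main__":
    capture = start_capture()
    try:
        pass  # ... run the robustness/fuzz tools against the Network Product here ...
    finally:
        capture.terminate()  # stop the capture; the pcap is then inspected to verify packet handling
        capture.wait()
```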
Execution Steps:
The accredited evaluator's test lab is required to execute the following steps:
  1. Execution of available effective fuzzing tools against the protocols available via interfaces providing IP-based protocols of the Network Product for an amount of time sufficient to be effective.
  2. Execution of available effective robustness test tools against the protocols available via interfaces providing IP-based protocols of the Network Product for an amount of time sufficient to be effective.
  3. For both step 1 and 2:
    1. Using a network traffic analyser on the network product (e.g. TCPDUMP) or an external traffic analyser directly connected to the network product, the tester verifies that the packets are correctly processed by the network product.
    2. The tester verifies that the network product and any running network service do not crash.
    3. The execution of tests shall be repeated a sufficient number of times. (A minimal fuzz-harness sketch illustrating this procedure is given after these steps.)
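The toy harness below sketches how steps 1 and 3 fit together: it repeatedly sends randomly mutated variants of a known-valid message to one exposed service and checks after each iteration that the service still accepts connections, recording any input that preceded a failure. The target address, port and seed message are placeholders, and such a dumb-mutation loop is no replacement for the protocol-aware, state-of-the-art fuzzing tools required by the pre-conditions.

```python
#!/usr/bin/env python3
"""Toy mutation-based fuzz harness with a liveness check (illustrative only).

TARGET, PORT and SEED_MESSAGE are placeholders; real robustness testing uses
protocol-aware fuzzing tools as required by the pre-conditions.
"""
import random
import socket

TARGET = "192.0.2.10"                       # placeholder address of the interface under test
PORT = 8080                                 # placeholder TCP port exposing the service under test
SEED_MESSAGE = b"EXAMPLE-REQUEST / v1\r\n"  # known-valid input used as the mutation seed
ITERATIONS = 10000
TIMEOUT = 1.0

def mutate(data, max_changes=8):
    """Return a copy of data with a few random byte changes (simple dumb mutation)."""
    buf = bytearray(data)
    for _ in range(random.randint(1, max_changes)):
        buf[random.randrange(len(buf))] = random.randrange(256)
    return bytes(buf)

def send_once(payload):
    """Send one payload; application-level errors are tolerated, only crashes matter here."""
    try:
        with socket.create_connection((TARGET, PORT), timeout=TIMEOUT) as sock:
            sock.sendall(payload)
            sock.recv(4096)  # read any response; its content is not checked
    except OSError:
        pass                 # refused/reset connections are not failures by themselves

def service_alive():
    """Liveness check: the service still accepts a fresh TCP connection."""
    try:
        with socket.create_connection((TARGET, PORT), timeout=TIMEOUT):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for i in range(ITERATIONS):
        payload = mutate(SEED_MESSAGE)
        send_once(payload)
        if not service_alive():
            # Record the offending input for the testing documentation
            print(f"Possible crash after iteration {i}; last payload: {payload!r}")
            break
    else:
        print(f"Service accepted connections after {ITERATIONS} mutated inputs")
```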
Expected Results:
A list of all of the protocols of the network product reachable externally on an IP-based interface, together with an indication whether effective available robustness and fuzz testing tools have been used against them, shall be part of the testing documentation. If no tool can be acquired for a protocol, a free form statement should explain why not.
The used tool(s) name, their unambiguous version (also for plug-ins if applicable), the settings used, and the relevant output constitute the evidence and shall be part of the testing documentation.
Any input causing unspecified, undocumented, or unexpected behaviour, and a description of this behaviour shall be highlighted in the testing documentation.
COTS fuzzing tools, by their nature, may have an acceptable failure rate (e.g. 0.1%) due to different non-deterministic variables in their implementation. At some point the tool's documentation may even mention that a failing test shall be repeated to check whether it is really a recurring problem or not. The tester shall make best effort to determine whether there is an issue with the network element (NE) or the test tool and, if necessary, work with the vendor of the network product to come to a consensus on the test result outcome.
Expected format of evidence:
A testing report provided by the testing agency which will consist of the following information:
  • The used tool(s) name and version information;
  • Settings and configurations used;
  • The output log file of the chosen tool that displays the results (passed/failed);
  • Screenshot;
  • Test result (Passed or not);
  • Log/evidence tracing possible crashes;
  • Any input causing unspecified, undocumented, or unexpected behaviour.
