SPEC

SPEC Fair Use Rules

Updated 28 May 2025 [view the change history]

Introduction

Consistency and fairness are guiding principles for SPEC. To help ensure that these principles are upheld, the following requirements must be met by any organization or individual who makes public use of SPEC benchmark results.

Section I lists general requirements that apply to public use of all SPEC benchmarks. Section II lists additional specific requirements for individual benchmarks.

This document is intended to provide the information needed for compliance with Fair Use; in the event of any inconsistency, it takes precedence over the fair use requirements in individual benchmark run rules.

I. General Requirements For Public Use of All SPEC Benchmark Results

I.A. Requirements List

  1. Compliance. Claimed results must be compliant with that benchmark's rules. See definition: compliant result. (Certain Exceptions may apply.)
  2. Data Sources
    1. Source(s) must be stated for quoted SPEC results.
    2. Such sources must be publicly available, from SPEC or elsewhere.
    3. The licensee (the entity responsible for the result) must be clearly identifiable from the source.
    4. The date that the data was retrieved must be stated.
    5. The SPEC web site (http://www.spec.org) or a suitable sub page must be noted as a resource for additional information about the benchmark.
  3. Clear and correct, as of a specific date

    1. Statements regarding SPEC, its benchmarks, and results published by SPEC, must be clear and correct.

    2. A claim must state a date as of which data was retrieved.

    3. A claim may compare newly announced compliant results vs. data retrieved earlier.

    4. There is no requirement to update a claim when later results are published.

      For example, an Acme web page dated 28 January 2011 announces performance results for the Model A and claims "the best SPECweb® 2009 benchmark performance when compared vs. results published at www.spec.org as of 26 January 2011". If SPEC publishes better results on 1 February, there is no requirement to update the page.

  4. Trademarks

    1. Reference must be made to the SPEC trademark. Such reference may be included in a notes section with other trademark references (SPEC trademarks are listed at http://www.spec.org/spec/trademarks.html).

    2. SPEC's trademarks may not be used to mislabel something that is not a SPEC metric.

      For example, suppose that a Gaming Society compares performance using a composite of a weighted subset of the SPEC CPU 2006 benchmark plus a weighted subset of the SPECviewperf 11 benchmark, and calls its composite "GamePerfMark". The composite, weighting, and subsetting are done by the Society, not by SPEC. The composite may be useful and interesting, but it may not be represented as a SPEC metric. It would be a Fair Use violation to reference it as "SPECgame".

  5. Required Metrics. In the tables below, some benchmarks have Required Metrics. Public statements must include these.

  6. Comparisons. It is fair to compare compliant results to other compliant results. Enabling such comparisons is a core reason why the SPEC benchmarks exist. Each benchmark product has workloads, software tools, run rules, and review processes that are intended to improve the technical credibility and relevance of such comparisons.

    When comparisons are made,

    1. SPEC metrics may be compared only to SPEC metrics.
    2. The basis for comparison must be stated.
    3. Results of one benchmark are not allowed to be compared to a different benchmark (e.g. SPECjAppServer2004 to TPC-C; or SPECvirt_sc2010 to SPECweb 2005).
    4. Results of a benchmark may not be compared to a different major release of the same benchmark (e.g. SPECweb 2005 to SPECweb 2009). Exception: normalized historical comparisons may be made as described under Retired Benchmarks.
    5. Comparisons of non-compliant numbers. The comparison of non-compliant numbers to compliant results is restricted to certain exceptional cases described later in this Fair Use Rule (Academic/Research usage; Estimates, for those benchmarks that allow estimates; Normalized Historical Comparisons). Where allowed, comparisons that include non-compliant numbers must not be misleading or deceptive as to compliance. It must be clear from the context of the comparison which numbers are compliant and which are not.

[Back to top]

I.B. Generic Example

This example for a generic SPEC benchmark illustrates the points above. See also the examples for specific benchmarks below, for additional requirements that may apply.

Example: New York, NY, January 28, 2011: Acme Corporation announces that the Model A achieves 100 for the SPECgeneric 2011 benchmark, a new record among systems running Linux [1].

[1] Comparison based on best performing systems using the Linux operating system published at www.spec.org as of 26 January 2011. SPEC® and the benchmark name SPECgeneric® are registered trademarks of the Standard Performance Evaluation Corporation. For more information about the SPECgeneric 2011 benchmark, see www.spec.org/generic2011/.

[Back to Top]

I.C. Compliance Exceptions

Exceptions regarding the compliance requirement are described in this section.

  1. Academic/research usage. SPEC encourages use of its benchmarks in research and academic contexts, on the grounds that SPEC benchmarks represent important characteristics of real world applications and therefore research innovations measured with SPEC benchmarks may benefit real users. SPEC understands that academic use of the SPEC benchmarks may be seen as enhancing the credibility of both the researcher and SPEC.

Research use of SPEC benchmarks may not be able to meet the compliance requirement.

Examples: (1) Testing is done with a simulator rather than real hardware. (2) The software innovation is not generally available or is not of product quality. (3) The SPEC test harness is modified without approval of SPEC.

SPEC has an interest in protecting the integrity of the SPEC metrics, including consistency of methods of measurement and the meaning of the units of measure that are defined by SPEC benchmarks. It would be unfair to those who do meet the compliance requirements if non-compliant numbers were misrepresented as compliant results. Therefore, SPEC recommends that researchers consider using the SPEC workload, but do not call the measurements by the SPEC metric name.

The requirements for Fair Use in academic/research contexts are:

  1. It is a Fair Use violation to imply, to the reasonable reader, that a non-compliant number is a compliant result.

  2. Non-compliance must be clearly disclosed. If the SPEC metric name is used, it is recommended that (nc), for non-compliant, be added after each mention of the metric name. It is understood that there may be other ways to accomplish this in context, for example adding words such as "experimental" or "simulated" or "estimated" or "non-compliant".

  3. Diagrams, Tables, and Abstracts (which, often, are excerpted and used separately) must have sufficient context on their own so that they are not misleading as to compliance.

  4. If non-compliant numbers are compared to compliant results it must be clear from the context which is which.

    Example: The Acme Corporation Model A achieves SPECint®2006 100 in testing published at www.spec.org. Our Research Compiler improves the same hardware to SPECint®2006 125(nc). The notation (nc), for non-compliant, is used because our compiler does not meet SPEC's requirements for general availability.

Other Fair Use Requirements Still Apply. This section discusses an exception to only the compliance requirement from the Requirements List. Fair Use in academic/research context must still meet the other requirements, including but not limited to making correct use of SPEC results with dated citations of sources.

  2. Estimates. Some SPEC benchmarks allow estimates, as shown in the tables below. Only for those benchmarks is it acceptable to compare estimates to compliant results, provided that:

    1. Estimates must be clearly identified as such.

    2. Each use of a SPEC metric as an estimate must be clearly marked as an estimate.

    3. If estimates are used in graphs, the word "estimated" or "est." must be plainly visible within the graph, for example in the title, the scale, the legend, or next to each individual number that is estimated.

    4. Licensees are encouraged to give a rationale or methodology for any estimates, together with other information that may help the reader assess the accuracy of the estimate.

      Example 1: The Acme Corporation Model A achieves SPECint®2006 100 in testing published at www.spec.org. The Bugle Corporation Model B will nearly double that performance to SPECint®2006 198(est). The notation (est), for estimated, is used because SPECint®2006 was run on pre-production hardware. Customer systems, planned for Q4, are expected to be similar.

      Example 2: Performance estimates are modeled using the cycle simulator GrokSim Mark IV. It is likely that actual hardware, if built, would include significant differences.

[Back to Top]

I.D. Derived Values

It is sometimes useful to define a numeric unit that includes a SPEC metric plus other information, and then use the new number to compare systems. This is called a Derived Value.

Examples:

SPECint®_rate2006 per chip

SPECvirt_sc®2010 per gigabyte

Note: the examples above are not intended to imply that all derived values use ratios of the form above. The definition is intentionally broad, and includes additional examples; see the Derived Value definition in section III.

  1. Derived values are acceptable, provided that they follow this Fair Use rule, including but not limited to using compliant results, listing sources for SPEC result data, and including any required metrics.

  2. A derived value must not be represented as a SPEC metric. The context must not give the appearance that SPEC has created or endorsed the derived value. In particular, it is a Fair Use violation, and may be a Trademark violation, to form a new word that looks like a SPEC metric name when there is no such metric.

    Not Acceptable:

    SPECint®_chiprate2006

    SPECvirt_sc®2010gigs

  3. If a derived value is used as the basis of an estimate, the estimate must be correctly labeled. A derived value may introduce seeming opportunities to extrapolate beyond measured data. For example, if 4 different systems all have the same ratio of SPECwhatever per chip, it can be tempting to estimate that another, unmeasured, system will have the same ratio. This may be a very good estimate; but it is still an estimate, and must be correctly labeled. If used in public, it must be for a benchmark that allows estimates.
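As an illustration of the rule above, a derived value is simple arithmetic applied to a SPEC metric plus other disclosed information. The sketch below uses entirely hypothetical system names and made-up scores (none are real SPEC results) to show a "per chip" ratio:

```python
# Hypothetical published data (illustrative values only, NOT real SPEC results).
# A derived value combines a SPEC metric with other disclosed information;
# here we form "SPECint_rate per chip" as a simple ratio.
systems = [
    {"name": "Model A", "specint_rate": 400.0, "chips": 4},
    {"name": "Model B", "specint_rate": 270.0, "chips": 2},
]

for s in systems:
    per_chip = s["specint_rate"] / s["chips"]  # the derived value
    # The derived value is not itself a SPEC metric and must not be
    # named as if it were one (e.g. not "SPECint_chiprate").
    print(f'{s["name"]}: {per_chip:.1f} (SPECint_rate per chip, derived value)')
```

Note that the result (100.0 and 135.0 for the hypothetical systems above) is a new unit defined by the author of the comparison, which is why the Fair Use rule requires that it not be labeled as a SPEC metric.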

[Back to Top]

I.E. Non-SPEC Information

  1. A basis for comparison or a derived value may use information from both SPEC and non-SPEC sources.

  2. SPEC values truthfulness and clarity at all times:

    When information from SPEC sources is used in public, SPEC requires that such information be reported correctly (per section I.A.3).

    SPEC recommends that non-SPEC material should be accurate, relevant, and not misleading. Data and methods should be explained and substantiated.

  3. Disclaimer. SPEC is not responsible for non-SPEC information. The SPEC Fair Use rule is limited to the information derived from SPEC sources. (Other rules may apply to the non-SPEC information, such as industry business standards, ethics, or Truth in Advertising law.)

  4. SPEC may point out non-SPEC content. SPEC reserves the right to publicly comment to distinguish SPEC information from non-SPEC information.

  5. Integrity of results and trademarks. The non-SPEC information must not be presented in a manner that may reasonably lead the reader to untrue conclusions about SPEC, its results, or its trademarks.

Examples

Example 1 (basis): ACME Corporation claims the best SPECjEnterprise 2010 benchmark performance for systems available as (example 1a) rack mount, or (1b) with more than 8 disk device slots, or (1c) with Art Deco paint. Bugle Corporation asserts that the basis of comparison is irrelevant or confusing or silly. Bugle may be correct. Nevertheless, such irrelevance, confusion, or silliness would not alone be enough to constitute a SPEC Fair Use violation.

Example 2 (derived value): ACME claims that its model A has better SPECint®_rate2006 per unit of cooling requirement than does the Bugle Model B. SPEC is not responsible for judging thermal characteristics.

Example 3: ACME claims the "best SPECmpi®M_2007 performance among industry-leading servers". This claim violates the requirement that the basis must be clear.

Example 4: ACME computes SPECint®_rate2006 per unit of cooling, but inexplicably selects SPECint®_rate_base2006 for some systems and SPECint®_rate2006 for others. The computation violates the requirement that the SPEC information must be accurate, and may also violate the requirement that a claim should not lead the reasonable reader to untrue conclusions about SPEC's results.

[Back to Top]

I.F. Retired Benchmarks

  1. Disclosure. If public claims are made using a retired benchmark, with compliant results that have not been previously reviewed and accepted by SPEC, then the fact that the benchmark has been retired and new results are no longer being accepted for review and publication by SPEC must be plainly disclosed.

    Example: The Acme Corporation Model A achieves a score of 527 SPECjvm 98. Note: SPECjvm 98 has been retired and SPEC is no longer reviewing or publishing results with that benchmark. We are providing this result as a comparison to older hardware that may still be in use at some customer sites.

  2. Benchmarks that require review. Some benchmarks require that SPEC review and accept results prior to public use. For such benchmarks, the review process is not available after benchmark retirement, and therefore no new results may be published.

  3. Normalized historical comparisons. When SPEC releases a new major version of a benchmark, the SPEC metrics are generally not comparable to the previous version, and there is no formula for converting from one to the other. Nevertheless, SPEC recognizes that there is value in historical comparisons. These are typically done by normalizing performance across the current generation and one or more generations of retired benchmarks, using systems that have been measured with both the older and newer benchmarks as the bridges for the normalization. Historical comparisons are inherently approximate, both because picking different 'bridge' systems may yield different ratios and because an older workload exercises different system capabilities than a more modern workload.

Normalized historical comparisons are acceptable only if their inherently approximate nature is not misrepresented. At minimum:

  1. It must not be claimed that SPEC metrics for one benchmark generation are precisely comparable to metrics from another generation.

  2. The approximate nature must be apparent from the context.

    For example, a graph shown briefly in a presentation is labelled "Normalized Historic Trends for SPEC<benchmark>". As another example, in a white paper (where the expectation is for greater detail than presentations), the author explicitly calls out that workloads have differed over time, and explains how numbers are calculated.
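The bridge-system normalization described above can be sketched as follows. All scores here are hypothetical; choosing a different bridge system would yield a different ratio, which is exactly why such comparisons are inherently approximate:

```python
# Hypothetical scores (NOT real SPEC results). A "bridge" system measured on
# both an older and a newer benchmark generation gives a conversion ratio
# that lets older results be expressed, approximately, on the newer scale.
bridge_old = 50.0   # bridge system's score on the retired benchmark
bridge_new = 20.0   # the same system's score on the current benchmark
ratio = bridge_new / bridge_old  # approximate scale factor between generations

historic_score = 35.0            # another system, measured only on the old benchmark
normalized = historic_score * ratio
# The normalized value is an approximation, not a compliant result, and must
# be presented as such.
print(f"Approximate score on the current scale: {normalized:.1f}")
```

With these made-up numbers the normalized value works out to 14.0, but a different bridge system with a ratio of, say, 0.35 would give 12.25 for the same historic system; the spread between such answers is the approximation the rule requires be disclosed.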

[Back to Top]

II. Requirements for Public Use of Individual Benchmark Results

For further detail about the meaning of SPEC metrics, the individual benchmark run rules may be consulted. The benchmark names at the top of each table are links to that benchmark's run rules.

III. Definitions

Basis for Comparison
Information from a compliant result may be used to define a basis for comparing a subset of systems, including but not limited to memory size, number of CPU chips, operating system version, other software versions, or optimizations used. Other information, not derived from SPEC, may also be used to define a basis, for example, cost, size, cooling requirements, or other system characteristics. The basis must be clearly disclosed.
By Location

For benchmarks designated as having a submission requirement "By location", these requirements apply:

Each licensee test location (city, state/province and country) must measure and submit a single compliant result for review, and have that result accepted by the technically relevant subcommittee, before publicly disclosing or representing as compliant any result for the benchmark.

After acceptance of a compliant result from a test location, the licensee may publicly disclose future compliant results produced at that location without prior acceptance by the subcommittee.

The intent of this requirement is that the licensee test location demonstrates the ability to produce a compliant result.

Note that acceptance of a result for one SPEC benchmark does not relieve a licensee of the requirement to complete the procedure for any other SPEC benchmark(s) that also require initial acceptance by location.

Close Proximity
In the same paragraph or an adjacent paragraph for written materials; or visible simultaneously for visual materials. The font must be legible to the intended audience.
Compliant Result
  1. The set of measurements, logs, full disclosure report pages, and other artifacts that are the output of a process that follows the run and reporting rules of a SPEC benchmark. Depending on the benchmark and its rules, the process may have many steps and many ingredients, such as specific software, hardware, tuning, documentation, availability of support, and timeliness of shipment. To find the rules for a specific benchmark, click its name in the tables above.
  2. A number within such set that is labelled as a SPEC metric.

Note that benchmark reporting pages include other types of information, such as the amount of memory on the system. Such other information may not be represented as a SPEC metric, although it may be used to define a Basis for Comparison.

SPEC reviews results prior to publication on its web site, but the accuracy and compliance of the submission remains the responsibility of the benchmark licensee. See the disclaimer.

Derived Value

A unit that is a numerical function of one or more SPEC Metrics, rather than the original metric. The function may be a constant divisor, to normalize performance to a comparison system of interest. The function may bring in quantities that are some other characteristic(s) of the system. Such other characteristics may include information from both SPEC result pages and from non-SPEC sources.

Examples:

"SPECint®_rate2006 per chip" (metric is divided by number of chips reported on SPEC disclosure)

"Cubic feet per SPECint®_rate2006" (a non-SPEC quantity is divided by the metric)

"Normalized SPECsfs®2008_cifs" (metric is divided by result for a comparison system)

"GamePerfMark", from the trademark section above.

This definition is intentionally broad, encompassing any function that includes a SPEC metric as one of the inputs.

Disallowed Comparisons
As mentioned above, results of one benchmark may not be compared to a different benchmark, nor to a different major release of the same benchmark. Individual benchmarks may forbid other comparisons, typically where such comparisons are considered inherently misleading.
Estimate

An estimate is an alleged value for a SPEC metric that was not produced by a run-rule-compliant test measurement.

For purposes of this definition, it does not matter whether the alleged value for the metric was produced by extrapolating from known systems, or by cycle accurate simulation, or by whiteboard or dartboard, or by normal testing with the exception of a single missing mandatory requirement (e.g. the 3 month availability window). If the alleged value is not from a rule-compliant run, then it is an estimate.

The usage of estimates is limited.

Major Release
For purposes of this fair use rule, the term "major release" references a change in the year component of a benchmark product name, for example SPECjvm 98 vs. SPECjvm 2008.
Non-Compliant Number

A value for a SPEC metric that fails to meet all the conditions for a compliant result.

Usage Note: By the definition of Estimate, above, a non-compliant number is also an estimate; and, of course, an estimate does not comply with the run rules. Therefore, the terms are sometimes interchangeable. In practical usage, an estimate may bear no relationship to any measurement activity; whereas a non-compliant number is typically the product of running the SPEC-supplied tools in a manner that does not comply with the run rules. In such cases, the tools may print out numbers that are labelled with SPEC metric units, but the values that are printed are not compliant results. Such values are sometimes informally called "non-compliant results", but for the sake of clarity, this document prefers the term "non-compliant number".

Required Metric
A SPEC metric whose value must be supplied. Individual benchmark sections above list whether they have required metrics. If so, then whenever any data is used from a full disclosure report, the values for the required metric(s) must also be used.
SPEC Metric
  i. A unit of measurement defined by a benchmark, such as response time or throughput for a defined set of operations. The available units for each benchmark are named in the tables above, and are defined within the benchmark run rules (which can be found by clicking the benchmark name in the tables above).
Example: SPECjvm®2008 Peak ops/m.
  ii. A specific value measured by such a unit.
Example: 320.52 SPECjvm®2008 Peak ops/m.

Usage Note: Both senses are used in this document, and it is expected that the sense is clear from context. For example, the prohibition against calling a derived value by a SPEC metric name is sense (i): do not define your own unit of measurement and then apply SPEC's trademarks to that unit. As another example, the rules for SPECpower_ssj® 2008 require disclosure of SPECpower_ssj®2008 overall ssj_ops/watt, which is sense (ii): one is required to supply the value measured for a particular system.

A printed SPEC metric value is not necessarily a Compliant Result: SPEC provides tools that display values for SPEC metrics, such as the above example of "320.52 SPECjvm®2008 Peak ops/m". Although SPEC's tools help to enforce benchmark run rules, they do not and cannot automatically enforce all rules. Prior to public use, the licensee remains responsible to ensure that all requirements for a compliant result are met. If the requirements are not met, then any printed values for the metrics are non-compliant numbers.

[Back to top]

IV. Violations Determination, Penalties, and Remedies

SPEC has a process for determining fair use violations and appropriate penalties and remedies that may be assessed.

[Back to top]

Change history

  • 11 April 2011 - The SPEC Fair Use rule has been re-written to:

    1. Promote greater Fair Use consistency across SPEC benchmarks; and
    2. Where Fair Use rules differ among SPEC benchmarks, make it easier to find differences.
  • 22 June 2011 - Editorial clarifications

    1. Emphasize that comparisons of non-compliant numbers must not be deceptive.
    2. Explain the term "major release" as used in the rule about comparisons.
    3. Clarify example for normalized historical comparisons.
    4. Clarify definition of Close Proximity.
    5. Prefer term "licensee" rather than synonyms.
    6. Minor editorial clarifications.
  • 18 August 2011 - Add SPEC Sip_Infrastructure 2010. Correct the metrics list for SPECweb 2009.

  • 7 February 2013 - Add SPECjbb 2013, SPEC OMP 2012.

  • 25 February 2013 - Add SERT.

  • 13 March 2014 - Add SPEC Accel, Chauffeur-WDK. Note retirement of SPECjAppServer 2004, SPECjbb 2005, SPECmail 2001 and SPECmail 2009, SPEC OMP 2001, SPEC VIRT_SC 2010, SPECweb 2005 and SPECweb 2009.

  • 3 November 2014 - Add SPEC SFS 2014.

  • 9 December 2014 - Note retirement of SPECjbb 2013

  • 2 September 2015 - Note retirement of SPEC SFS 2008, correct SPEC SFS 2014 required metrics listing

  • 23 September 2015 - Add SPECjbb 2015

  • 05 May 2016 - Add SPEC Cloud IaaS 2016

  • 27 September 2016 - Added recent versions of the SPECapc, SPECviewperf, and SPECwpc benchmarks, and marked older versions as retired.

  • 27 March 2017 - Added SERT 2.0, structural rearrangement of document.

  • 20 June 2017 - Added SPEC CPU 2017

  • 27 June 2017 - Updated SPEC ACCEL with new metrics

  • 07 November 2017 - Added SPECapc for Maya 2017, noted retirement of SPEC Sip_Infrastructure

  • 19 December 2017 - Added new EDA workload and metric for SPEC SFS 2014 SP2

  • 23 May 2018 - Added SPECviewperf 13, noted retirement of SPECviewperf 12.0 and SPECviewperf 12.1

  • 15 August 2018 - Added SPECapc for Solidworks 2017

  • 12 September 2018 - Added SPECjEnterprise 2018

  • 19 October 2018 - Added SPECworkstation 3, noted retirement of SPECwpc V2.0/2.1

  • 18 December 2018 - Added SPEC Cloud IaaS 2018

  • 8 March 2019 - Added SPECviewperf 13 Linux Edition, added (retired) SPEC CPU 2000, updated SPEC CPU 2006 to note retirement

  • 9 September 2019 - Updated for SPEC CPU 2017 V1.1

  • 8 December 2020 - Updated for SPECstorage Solution 2020, SPECapc for Solidworks 2020, SPECviewperf 2020

  • 16 March 2021 - Updated for SPECworkstation 3.1

  • 02 September 2021 - Updated for SPECvirt Datacenter 2021

  • 19 October 2021 - Updated for SPEChpc 2021

  • 9 February 2023 - Retired SPECapc for Solidworks 2019, updated for SPECapc for 3dsmax 2020, SPECapc for Maya 2023, SPECapc for Solidworks 2021, SPECapc for Solidworks 2022

  • 4 December 2024 - Updated for SPECworkstation 4.0, SPECapc for Solidworks 2024. Retired SPECapc for Solidworks 2020 and SPECapc for Solidworks 2021. Updated SPEC Cloud IaaS 2016 entry.

  • 29 April 2025 - Updated for SPECviewperf 15

  • 28 May 2025 - Updated for SPECapc for Siemens NX 2025

  • SPECjbb 2015, SPECjbb 2013 (Retired), SPECjbb 2005 (Retired)
  • SPEC OMP 2012, SPEC OMP 2001 (Retired)
  • SPEC SERT Suite 2.0, SPEC SERT Suite 1.0
  • SPEC Sip_Infrastructure 2011 (Retired)
  • SPECstorage® Solution 2020, SPEC SFS 2014 (Retired), SPEC SFS 2008 (Retired)
  • SPEC VIRT_SC 2013, SPEC VIRT_SC 2010 (Retired)
  • SPECapc for 3ds Max 2020, SPECapc for 3ds Max 2015 (Retired), SPECapc for 3ds Max 9 (Retired)
  • SPECjAppServer 2004 (Retired)
  • SPECjEnterprise 2018 Web Profile, SPECjEnterprise 2010
  • SPECmail 2009 (Retired), SPECmail 2001 (Retired)
  • SPECviewperf 15, SPECviewperf 2020, SPECviewperf 13, SPECviewperf 12.1 (Retired), SPECviewperf 12 (Retired), SPECviewperf 11 (Retired)
  • SPECweb 2009 (Retired), SPECweb 2005 (Retired)
  • SPECworkstation 4.0, SPECworkstation 3.1 (Retired), SPECworkstation 3.0 (Retired), SPECwpc 2.x (Retired), SPECwpc 1.0 (Retired)
SPEC ACCEL® benchmark
  SPEC.org Submission Requirements: None. Submission to SPEC is encouraged, but is not required.
  SPEC Metrics: The bottom line metrics: SPECaccel®_ocl_base, SPECaccel®_acc_base, SPECaccel®_omp_base, SPECaccel®_ocl_peak, SPECaccel®_acc_peak, SPECaccel®_omp_peak, SPECaccel®_ocl_energy_base, SPECaccel®_acc_energy_base, SPECaccel®_omp_energy_base, SPECaccel®_ocl_energy_peak, SPECaccel®_acc_energy_peak, SPECaccel®_omp_energy_peak; the median individual benchmark SPECratios; and the median run times of the individual benchmarks.
  Required Metrics: None
  Conditionally Required Metrics: For an individual benchmark, if a result other than the median is mentioned, then the median from the same set must also be mentioned.
  • SPEC Chauffeur WDK tool
  • SPEC Cloud IaaS 2018, SPEC Cloud IaaS 2016 (Retired)
  • SPEC CPU 2017, SPEC CPU 2006 (Retired), SPEC CPU 2000 (Retired)
  • SPECapc® for LightWave 3D v9.6 (Retired)
  • SPECapc for Maya 2024, SPECapc for Maya 2023, SPECapc for Maya 2017 (Retired), SPECapc for Maya 2012 (Retired), SPECapc for Maya 2009 (Retired)
  • SPECapc for Pro/ENGINEER Wildfire 2.0 (Retired)
  • SPECapc for PTC Creo 9, SPECapc for PTC Creo 3 (Retired), SPECapc for PTC Creo 2 (Retired)
  • SPECapc for Siemens NX 2024, SPECapc for Siemens NX 10 (Retired), SPECapc for Siemens NX 9 (Retired)
  • SPECapc for SolidEdge V19 (Retired)
  • SPECapc for Solidworks 2024, SPECapc for Solidworks 2022, SPECapc for Solidworks 2021 (Retired), SPECapc for Solidworks 2020 (Retired), SPECapc for Solidworks 2019 (Retired), SPECapc for Solidworks 2017 (Retired), SPECapc for Solidworks 2015 (Retired), SPECapc for Solidworks 2013 (Retired), SPECapc for Solidworks 2007 (Retired)
  • SPECapc for UGS NX8.5 (Retired), SPECapc for UGS NX6 (Retired), SPECapc for UGS NX4 (Retired)
SPECpower_ssj® 2008 benchmark
  SPEC.org Submission Requirements: By location, as defined below.
  SPEC Metrics: SPECpower_ssj®2008 overall ssj_ops/watt for a specific target load level, its ssj_ops, and Average Active Power (W).
  Required Metrics: SPECpower_ssj®2008 overall ssj_ops/watt. The required metric must be listed in close proximity to any other measured data from the disclosure or any derived value.
  Conditionally Required Metrics:
    Condition: Comparison of performance or power data from the same target load level. Requirement: Both the performance and the power results for that target load must be disclosed in close proximity.
    Condition: Comparison of SUTs with different numbers of nodes. Requirement: The number of nodes for each SUT must be disclosed in close proximity.
SPECstorage® Solution 2020 benchmark
  SPEC.org Submission Requirements: None. Submission to SPEC is encouraged, but is not required.
  SPEC Metrics:
    SPECstorage® Solution 2020_ai_image and corresponding Overall Response Time
    SPECstorage® Solution 2020_genomics and corresponding Overall Response Time
    SPECstorage® Solution 2020_vda and corresponding Overall Response Time
    SPECstorage® Solution 2020_swbuild and corresponding Overall Response Time
    SPECstorage® Solution 2020_eda_blended and corresponding Overall Response Time
  Required Metrics: Peak and ORT for one of:
    SPECstorage® Solution 2020_ai_image JOBS & ORT, or
    SPECstorage® Solution 2020_genomics JOBS & ORT, or
    SPECstorage® Solution 2020_vda STREAMS & ORT, or
    SPECstorage® Solution 2020_swbuild BUILDS & ORT, or
    SPECstorage® Solution 2020_eda_blended JOBS & ORT
  Conditionally Required Metrics: None
  Use of Estimates: Not allowed
  Disallowed Comparisons: No additional requirements beyond the requirements that results not be compared to other benchmarks.
(Retired) SPECviewperf® 11 benchmark
  RETIRED: The SPECviewperf 11 benchmark has been retired. All public use of results for this benchmark must plainly disclose that the benchmark has been retired, as described above. No further submissions will be accepted for publication at www.spec.org. SPEC is no longer reviewing results for this benchmark. Rule-compliant results may be published independently, provided that the fact of retirement is plainly disclosed.
  SPEC.org Submission Requirements: None.
SPECvirt® Datacenter 2021 benchmark
  SPEC.org Submission Requirements: Results must be reviewed and accepted by SPEC prior to public disclosure.
  SPEC Metrics: SPECvirt®_Datacenter-2021
  Required Metrics: SPECvirt®_Datacenter-2021. The required metric must be listed in close proximity to any other measured data from the disclosure or any derived value.
  Conditionally Required Metrics: None
  Use of Estimates: Not allowed
  Disallowed Comparisons: In addition to the requirements that results not be compared to other benchmarks:
(Retired) SPECweb® 2005 benchmark
  RETIRED: The SPECweb 2005 benchmark was retired on January 12, 2012. All public use of results for this benchmark must plainly disclose that the benchmark has been retired, as described above. No further submissions will be accepted for publication at www.spec.org. SPEC is no longer reviewing results for this benchmark. Rule-compliant results may be published independently, provided that the fact of retirement is plainly disclosed.
  SPEC.org Submission Requirements: None.
SPECworkstation® 3 benchmark
  SPEC.org Submission Requirements: None. Submission to SPEC is encouraged, but is not required. Compliant results may be published independently.
  SPEC Metrics: Media & Entertainment, Product Development, Life Sciences, Financial Services, Energy, General Operations; Subsystem: CPU, Storage, Graphics, and GPU Compute; any subtest of SPECworkstation.
  Required Metrics: None
  Conditionally Required Metrics: None
  Use of Estimates: Estimates are allowed if clearly identified.
  Disallowed Comparisons: No additional requirements beyond the requirements that results not be compared to other benchmarks.
(Retired) SPECwpc® V1.0 benchmark
  RETIRED: The SPECwpc V1.0 benchmark has been retired. All public use of results for this benchmark must plainly disclose that the benchmark has been retired, as described above. No further submissions will be accepted for publication at www.spec.org. Rule-compliant results may be published independently, provided that the fact of retirement is plainly disclosed.
  SPEC.org Submission Requirements: None. Submission to SPEC is encouraged, but is not required.