ERC-6734: L2 Token List

Token List that ensures the correct identification of tokens from different Layer 1, Layer 2, or Sidechains.


Metadata
Status: Draft
Type: Standards Track (ERC)
Created: 2023-03-20
Authors
Kelvin Fichter (@smartcontracts), Andreas Freund (@Therecanbeonlyone1969), Pavel Sinelnikov (@psinelnikov)
Requires

Abstract


This document describes a JSON token list that enables two or more Layer 1, Layer 2, or Sidechain systems to correctly identify tokens from a different Layer 1, Layer 2, or Sidechain system.

Motivation


This particular work by the L2 WG of the EEA Communities Projects managed by OASIS, an open-source initiative, is motivated by a significant challenge around the definition and listing of tokens on Layer 1 (L1), Layer 2 (L2), and Sidechain systems. Note that for simplicity, in this document we will collectively refer to L1, L2, and Sidechain systems as chains, since the challenge described below is valid across all such systems:

  • Consensus on the "canonical" token on chain B that corresponds to some token on chain A. When one wants to bridge token X from chain A to chain B, one must create some new representation of the token on chain B. It is worth noting that this problem is not limited to L2s -- every chain connected via bridges must deal with the same issue.

Related to the above challenge is the standardization around lists of bridges and their routes across different chains. This will be addressed in a separate document.

Note that both of these issues are fundamental problems for the current multi-chain world.

Therefore, the goal of this document is to help token users to operationalize and disambiguate the usage of a token in their systems.

For lists of canonical tokens, L2s currently maintain their own customized versions of the Uniswap token list. For example, Arbitrum maintains a token list with various custom extensions. Optimism also maintains a custom token list, but with different extensions. It should be noted that both of these custom extensions refer to the bridge that these tokens can be carried through. However, these are not the only bridges that the tokens can be carried through, which means that bridges and token lists should be separated. Also note that currently, both Optimism and Arbitrum base "canonicity" on the token name + symbol pair.

An example of an Arbitrum token entry is given below:
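A sketch of such an entry is shown below, assuming the Arbitrum list format in which the bridge-specific fields live under extensions; all addresses and the logoURI are placeholders, not real deployments:

```json
{
  "chainId": 42161,
  "address": "0x1111111111111111111111111111111111111111",
  "name": "Example Token",
  "symbol": "EXT",
  "decimals": 18,
  "logoURI": "https://example.com/ext.png",
  "extensions": {
    "l1Address": "0x2222222222222222222222222222222222222222",
    "l2GatewayAddress": "0x3333333333333333333333333333333333333333",
    "l1GatewayAddress": "0x4444444444444444444444444444444444444444"
  }
}
```

Note how the extensions object ties the token entry to one particular bridge's gateway contracts, which is exactly the coupling of token lists and bridges that this standard argues should be separated.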


This standard will build upon the current framework and augment it with concepts from Decentralized Identifiers (DIDs) based on the JSON linked data model (JSON-LD), such as resolvable Uniform Resource Identifiers (URIs) and JSON-LD schemas, which enable easier schema verification using existing tools.

Note that a standard for defining tokens is beyond the scope of this document.

Specification


Keywords:

The keywords "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "NOT RECOMMENDED", "MAY", and "OPTIONAL" in this document are to be interpreted as described in [RFC2119] when, and only when, they appear in all capitals, as shown here.

Typographical Convention: Requirement Ids

A requirement is uniquely identified by an ID composed of its requirement level followed by a requirement number, as per the convention [RequirementLevelRequirementNumber]. There are four requirement levels that are coded in requirement IDs as per the convention below:

[R] - The requirement level for requirements whose IDs start with the letter R is to be interpreted as MUST as described in RFC2119.
[D] - The requirement level for requirements whose IDs start with the letter D is to be interpreted as SHOULD as described in RFC2119.
[O] - The requirement level for requirements whose IDs start with the letter O is to be interpreted as MAY as described in RFC2119.
[CR] - The requirement level for conditional requirements, whose IDs start with the letters CR, is to be interpreted as MUST or SHOULD (as written in the requirement) applying only when the referenced requirement after the > is utilized; for example, [CR1]>[D1] applies only if [D1] is utilized.

Note that requirements are uniquely numbered in ascending order within each requirement level.

Example: It should be read that [R1] is an absolute requirement of the specification, whereas [D1] is a recommendation and [O1] is truly optional.

[R1] The following data elements MUST be present in a canonical token list:

  • type
  • tokenListId
  • name
  • createdAt
  • updatedAt
  • versions
  • tokens

Note that the detailed definitions of the data elements in [R1], along with descriptions and examples, are given in the schema itself below.

[R1] testability: See suggested test fixtures for the data schema below.

[R2] The tokens data element is a composite which MUST minimally contain the following data elements:

  • chainId
  • chainURI
  • tokenId
  • tokenType
  • address
  • name
  • symbol
  • decimals
  • createdAt
  • updatedAt

Note that the detailed definitions of the data elements in [R2], along with descriptions and examples, are given in the schema itself below.

[R2] testability: See suggested test fixtures for this document's data schema below.
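A sketch of a single entry of the tokens element containing the minimal data elements above; all values are illustrative, and the URI values assume a resolvable chain registry in the style of chainlist URLs:

```json
{
  "chainId": 1,
  "chainURI": "https://chainlist.org/chain/1",
  "tokenId": "https://chainlist.org/chain/1/0x1111111111111111111111111111111111111111",
  "tokenType": ["ERC20"],
  "address": "0x1111111111111111111111111111111111111111",
  "name": "Example Token",
  "symbol": "EXT",
  "decimals": 18,
  "createdAt": "2023-03-20",
  "updatedAt": "2023-03-20"
}
```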

[D1] All other data elements of the schema SHOULD be included in a representation of a canonical token list.

[D1] testability: See suggested test fixtures for this document's data schema below.

[CR1]>[D1] If the extension data element is used, the following data elements MUST be present in the schema representation:

  • rootChainId
  • rootChainURI
  • rootAddress

Note that the detailed definitions of the data elements in [D1] and [CR1]>[D1], along with descriptions and examples, are given in the schema itself below.

[CR1]>[D1] testability: See suggested test fixtures for this document's data schema below.
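A sketch of such an extension element on an L2 token entry, pointing back to the root token on L1 (values illustrative):

```json
{
  "extension": {
    "rootChainId": 1,
    "rootChainURI": "https://chainlist.org/chain/1",
    "rootAddress": "0x2222222222222222222222222222222222222222"
  }
}
```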

[R3] All properties in the schema identified in their description as a Uniform Resource Identifier (URI) MUST follow RFC 3986 in their semantics.

[R3] testability: All requirements for RFC 3986 are testable.

[R4] The chainId property utilized MUST allow for the requirements of the EIP-155 standard to be met.

Namely, transaction replay protection on the network that is identified by the chainId property value. Note that for replay protection to be guaranteed, the chainId should be unique. Ensuring a unique chainId is beyond the scope of this document.

[R4] testability: EIP-155 requires that a transaction hash is derived from the keccak256 hash of the following nine RLP encoded elements (nonce, gasprice, startgas, to, value, data, chainid, 0, 0) which can be tested easily with existing cryptographic libraries. EIP-155 further requires that the v value of the secp256k1 signature must be set to {0,1} + CHAIN_ID * 2 + 35 where {0,1} is the parity of the y value of the curve point for which the signature r-value is the x-value in the secp256k1 signing process. This requirement is testable with available open-source secp256k1 digital signature suites. Therefore, [R4] is testable.
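The v-value arithmetic in particular can be sketched directly; a minimal illustration, not tied to any real signature:

```python
def eip155_v(y_parity: int, chain_id: int) -> int:
    """Compute the EIP-155 replay-protected v value from the
    secp256k1 y-parity (0 or 1) and the chain id."""
    assert y_parity in (0, 1)
    return y_parity + chain_id * 2 + 35

def chain_id_from_v(v: int) -> int:
    """Recover the chain id back out of an EIP-155 v value."""
    return (v - 35) // 2

# Ethereum mainnet (chain id 1): v is 37 or 38
assert eip155_v(0, 1) == 37
assert eip155_v(1, 1) == 38
assert chain_id_from_v(37) == 1
```

Because v values below 35 are impossible under this rule, a verifier can also use the v value to distinguish EIP-155 transactions from legacy (v in {27, 28}) transactions.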

[D2] The chainId property SHOULD follow EIP-3220 draft standard.

[D2] testability: The EIP-3220 draft standard can be tested because the crosschain id is specified as a concatenation of well-defined strings; open-source tooling can be used to parse and split a crosschain id, and the obtained string segments can be compared against the expected string lengths and, depending on context, against the values for the strings specified in the standard. Consequently, [D2] is testable.

[O1] The humanReadableTokenSymbol property MAY be used.

[O1] testability: A data property is always implementable in a schema.

[CR2]>[O1] The humanReadableTokenSymbol property MUST be constructed as the hyphenated concatenation of first the tokenSymbol and then the chainId.

An illustrative example, for tokenSymbol UNI on chainId 1, would be UNI-1.
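A minimal sketch of the construction and its inverse (the token symbols and chain ids used are illustrative):

```python
def human_readable_token_symbol(token_symbol: str, chain_id: int) -> str:
    """[CR2]>[O1]: hyphenated concatenation of first the
    tokenSymbol and then the chainId."""
    return f"{token_symbol}-{chain_id}"

def split_human_readable(symbol: str) -> tuple[str, int]:
    """Inverse operation: split on the final hyphen so that
    hyphens inside the token symbol itself are preserved."""
    token_symbol, chain_id = symbol.rsplit("-", 1)
    return token_symbol, int(chain_id)

assert human_readable_token_symbol("UNI", 1) == "UNI-1"
assert split_human_readable("UNI-1") == ("UNI", 1)
```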


[CR2]>[O1] testability: humanReadableTokenSymbol can be parsed and split based on existing open source packages and the result compared to the tokenSymbol and chainId used in the data schema.

The schema for a canonical token list is given below and can be utilized as a JSON-LD schema if a JSON-LD context file is utilized (see [W3C-DID] for a concrete example in the context of a standard):
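An abbreviated sketch of such a schema, restricted to the data elements required by [R1] and [R2]; the $id, type choices, and format annotations are illustrative, not normative:

```json
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "$id": "https://example.com/canonical-token-list.schema.json",
  "title": "CanonicalTokenList",
  "type": "object",
  "required": ["type", "tokenListId", "name", "createdAt", "updatedAt", "versions", "tokens"],
  "properties": {
    "type": { "type": "string" },
    "tokenListId": { "type": "string", "format": "uri" },
    "name": { "type": "string" },
    "createdAt": { "type": "string" },
    "updatedAt": { "type": "string" },
    "versions": { "type": "array" },
    "tokens": {
      "type": "array",
      "items": {
        "type": "object",
        "required": ["chainId", "chainURI", "tokenId", "tokenType", "address",
                     "name", "symbol", "decimals", "createdAt", "updatedAt"],
        "properties": {
          "chainId": { "type": "integer" },
          "chainURI": { "type": "string", "format": "uri" },
          "tokenId": { "type": "string", "format": "uri" },
          "tokenType": { "type": "array", "items": { "type": "string" } },
          "address": { "type": "string" },
          "name": { "type": "string" },
          "symbol": { "type": "string" },
          "decimals": { "type": "integer" },
          "createdAt": { "type": "string" },
          "updatedAt": { "type": "string" }
        }
      }
    }
  }
}
```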


Data Schema Testability: As the above data schema follows a JSON/JSON-LD schema format, and since such formats are known to be testable for schema conformance (see for example the W3C CCG Traceability Work Item), the above data schema is testable.
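As a sketch of such a conformance check using only the standard library, the required-key lists below are taken from [R1] and [R2], and the sample list is illustrative:

```python
import json

# Required top-level elements per [R1] and per-token elements per [R2]
R1_KEYS = {"type", "tokenListId", "name", "createdAt", "updatedAt", "versions", "tokens"}
R2_KEYS = {"chainId", "chainURI", "tokenId", "tokenType", "address",
           "name", "symbol", "decimals", "createdAt", "updatedAt"}

def check_token_list(document: str) -> list[str]:
    """Return the list of missing required keys; empty means conformant."""
    data = json.loads(document)
    missing = [f"token list: {k}" for k in sorted(R1_KEYS - data.keys())]
    for i, token in enumerate(data.get("tokens", [])):
        missing += [f"tokens[{i}]: {k}" for k in sorted(R2_KEYS - token.keys())]
    return missing

sample = json.dumps({
    "type": "CanonicalTokenList",
    "tokenListId": "https://example.com/list",
    "name": "Example List",
    "createdAt": "2023-03-20",
    "updatedAt": "2023-03-20",
    "versions": [],
    "tokens": [{"chainId": 1, "chainURI": "https://chainlist.org/chain/1",
                "tokenId": "https://chainlist.org/chain/1/0x1111111111111111111111111111111111111111",
                "tokenType": ["ERC20"],
                "address": "0x1111111111111111111111111111111111111111",
                "name": "Example Token", "symbol": "EXT", "decimals": 18,
                "createdAt": "2023-03-20", "updatedAt": "2023-03-20"}],
})
assert check_token_list(sample) == []
```

A production implementation would validate against the full schema (for example with a JSON Schema validator) rather than checking key presence alone.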

Conformance

This section describes the conformance clauses and tests required to achieve an implementation that is provably conformant with the requirements in this document.

Conformance Targets

This document does not yet define a standardized set of test-fixtures with test inputs for all MUST, SHOULD, and MAY requirements, including conditional MUST or SHOULD requirements. Such a set is intended to be published with the next version of the standard.

Conformance Levels

This section specifies the conformance levels of this standard. The conformance levels offer implementers several levels of conformance. These can be used to establish competitive differentiation.

This document defines the conformance levels of a canonical token list as follows:

  • Level 1: All MUST requirements are fulfilled by a specific implementation as proven by a test report that proves in an easily understandable manner the implementation's conformance with each requirement based on implementation-specific test-fixtures with implementation-specific test-fixture inputs.
  • Level 2: All MUST and SHOULD requirements are fulfilled by a specific implementation as proven by a test report that proves in an easily understandable manner the implementation's conformance with each requirement based on implementation-specific test-fixtures with implementation-specific test-fixture inputs.
  • Level 3: All MUST, SHOULD, and MAY requirements with conditional MUST or SHOULD requirements are fulfilled by a specific implementation as proven by a test report that proves in an easily understandable manner the implementation's conformance with each requirement based on implementation-specific test-fixtures with implementation-specific test-fixture inputs.

[D3] A claim that a canonical token list implementation conforms to this specification SHOULD describe a testing procedure carried out for each requirement to which conformance is claimed, that justifies the claim with respect to that requirement.

[D3] testability: Since each of the non-conformance-target requirements in this document is testable, the totality of the requirements in this document is also testable. Therefore, conformance tests for all requirements can exist and can be described as required in [D3].

[R5] A claim that a canonical token list implementation conforms to this specification at Level 2 or higher MUST describe the testing procedure carried out for each requirement at Level 2 or higher, that justifies the claim to that requirement.

[R5] testability: Since each of the non-conformance-target requirements in this document is testable, the totality of the requirements in this document is also testable. Therefore, conformance tests for all requirements can exist, be described, be built, and be implemented, and results can be recorded as required in [R5].

Rationale


This specification extends and clarifies current custom lists, such as those from Arbitrum and Optimism referenced in the Motivation section, and the Uniswap Token Lists project, to improve clarity and security and to encourage adoption by non-Web3-native entities.

The specification utilizes the current JSON-LD standard to describe a token list in order to allow for easy integration with Self-Sovereign Identity frameworks such as the W3C DID and W3C Verifiable Credentials standards, which allow for interoperability across L2s, Sidechains, and L1s when identifying token-list-relevant entities such as Token Issuers. In addition, being compatible with W3C-utilized frameworks allows implementers to use existing tooling around JSON-LD, W3C DIDs, and W3C Verifiable Credentials. The choice of referencing known data property definitions from schema.org further disambiguates the meaning and usage of terms.

Security Considerations


There are no additional security requirements apart from the warnings that URIs utilized in implementations of this standard might point to malicious resources such as websites, and that implementers should ensure that the data utilized for a canonical token list is secure and correct. Since this standard is focused on a data schema and its data properties, no additional security considerations arise from, for example, homoglyph attacks (see CVE-2021-42574 (2021-10-25T12:38:28)).

Security Considerations: Data Privacy

The standard does not set any requirements for compliance to jurisdiction legislation/regulations. It is the responsibility of the implementer to comply with applicable data privacy laws.

Security Considerations: Production Readiness

The standard does not set any requirements for the use of specific applications/tools/libraries etc. The implementer should perform due diligence when selecting specific applications/tools/libraries.

Security Considerations: Internationalization and Localization

The standard encourages implementers to follow the W3C "Strings on the Web: Language and Direction Metadata" best practices guide for identifying language and base direction for strings used on the Web wherever appropriate.

Copyright


Copyright and related rights waived via CC0.