16 Feb, 23

Diving into The Future of Ethereum

Nathan Lenga

Innovation Lead

Just as Bitcoin disrupted traditional finance and technology with the peer-to-peer ideology underpinning its blockchain, Ethereum has disrupted blockchains themselves with smart contract functionality. With its own coding language, Solidity, and the Ethereum Virtual Machine (EVM), this blockchain has become known as the “World Computer”. The majority of the innovation in the cryptocurrency space occurs on Ethereum, rendering it fundamental to the industry. As a result, Ethereum must remain effective, efficient and usable through Ethereum Improvement Proposals (EIPs) as well as a robust roadmap which promotes scalability, decentralisation and security.

This article will explore the future of Ethereum in the context of its roadmap and forthcoming EIPs that have the capacity to revolutionise the way users interact with the blockchain – whether they are aware of it or not.

Ethereum’s Roadmap

The Ethereum roadmap has undergone many iterations; some facets have consistently appeared in each version, such as sharding and Proof of Stake (PoS), whilst others have disappeared, including sidechains and earlier sharded designs for Ethereum. Nonetheless, with some of the greatest minds in the blockchain space researching how to improve the network, each revision of the roadmap represents a significant breakthrough.

Source: Tweet by Vitalik Buterin

Ethereum’s most recent roadmap, released at the end of 2021 by co-founder Vitalik Buterin, divided the future of the chain into five separate categories. Thus far, the first major milestone has been achieved – the merge. Vitalik elucidated that following the merging of Ethereum’s execution layer with its PoS consensus layer, the Beacon Chain, Ethereum would be 55% complete. Considering the amount of work that went into the merge, developing the improvements for the final 45% might appear likely to take a long time. However, the Ethereum Foundation is working on each stage in parallel, significantly expediting the process.

In order for Ethereum to remain the dominant layer 1 blockchain, it must become more scalable. Though security and decentralisation, the two other facets of the blockchain trilemma, are foundationally important to Ethereum, most retail consumers focus on ease of usage; they want to transact on a blockchain with low gas fees, efficient finality and a high number of transactions per second. The Surge phase of Ethereum’s roadmap obtained its name from the focus on scaling the blockchain.

The Surge

By leveraging the compression technology of Optimistic and Zero-Knowledge rollups, Ethereum can scale as transactions are settled on layer 2s, where gas fees are paid, and then posted to the layer 1. However, as rollups consume more of Ethereum’s blockspace, they increase calldata storage requirements, raising the barriers to entry for entities looking to validate the network. Hence, The Surge will temporarily see calldata limited in every block and pruned over time. Nonetheless, this only rectifies the storage implications for scalability. Creating a data availability layer and sharding it – giving each shard different layers of security via a committee voting model whilst continuing to post all data to the main Ethereum blockchain – will enable users to transact on layer 2s for a comparatively meagre cost and substantially more efficiently. More specifically, this stage will see the implementation of Proto-Danksharding, also known as EIP-4844, and Data Availability Sampling by network participants; these two concepts are detailed below. Unfortunately, storage issues emerge once again after calldata is replaced by Ethereum’s data availability layer. Cryptographic methods to sample the data are being developed as part of The Surge, enabling nodes to reconstruct the blockchain’s history without downloading and storing all of the data.

Further to these improvements, in order for The Surge to reach its goal of allowing Ethereum to facilitate over 100k transactions per second, the blockchain must become compatible with zero knowledge proofs at the execution level. This will allow protocols to be built on zero knowledge platforms, known as zkEVMs, in the same way they would be developed on Ethereum. As such, projects and code could be transpiled efficiently from Ethereum onto zkEVMs. Although many teams, such as Polygon Hermez, Scroll and zkSync, are building zkEVMs, none yet meets Vitalik’s definition of a Type 1 (fully Ethereum-equivalent) zkEVM. The integration of native zkEVMs with Ethereum is additionally fundamental to milestones in other stages of the blockchain’s roadmap.

The Scourge

As the most recent addition to Ethereum’s roadmap, the improvements within The Scourge strive to ensure Ethereum is censorship-resistant and that the centralisation issues created by Maximal Extractable Value (MEV) are resolved. This is achieved via in-protocol defences against vulnerabilities that are native to Ethereum, primarily those relating to MEV. MEV refers to the additional income validators earn by reordering, inserting or excluding transactions within a block to maximise profit. Zerocap went into more detail on MEV here. Although MEV extraction helps keep on-chain markets efficient, malicious strategies can be used to maximise this value. Moreover, eliminating MEV from certain blockchains is impossible; if a chain has smart contract functionality, like Ethereum, opportunities to capture MEV will emerge. Nonetheless, MEV can be mitigated with strategies such as single slot finality and a committee-based approach to validating blocks. This would see transactions finalised in the slot they are proposed, as opposed to the current process, where an epoch lasts 6.4 minutes and finality takes roughly two epochs. Streamlining the finalisation of blocks removes the ability of validators to reorganise the chain to extract more MEV.

How Bots Capture MEV

Beyond minimising MEV, centralisation risks around who can capture MEV, and how much they extract, are another issue to be overcome in The Scourge stage of Ethereum’s roadmap. MEV smoothing would enable all validators participating in the Ethereum blockchain to share the rewards garnered by the specialised bots capturing MEV with expert strategies. Accordingly, MEV can become democratised and accessible to all entities. Another means to surmount the centralisation of MEV is a protocol-level version of Proposer Builder Separation (PBS); this approach is explained later in the piece. Furthermore, the roadmap alludes to MEV burn, which would see MEV tracked and burnt before it reaches validators’ wallets, reducing the value extracted per block. In this context, MEV burn functions similarly to the burning of base gas fees under EIP-1559.

The Verge

This step of the roadmap concentrates on scaling Ethereum through its nodes and validators. Central to The Verge is the replacement of Patricia Merkle Trees with Verkle Trees – the change that gives this roadmap phase its name. Currently, Ethereum uses a storage structure known as the Patricia Merkle Tree to compress the information in blocks, together with the hashing function Keccak-256. This algorithm takes an input and deterministically produces an output; notably, however, the initial input cannot be recovered from the output. Since Ethereum’s inception, Patricia Merkle Trees have been effective for storing the state of the blockchain, yet as the network scales, the proofs required for these trees grow too large.

Patricia Merkle Tree

The tree structure can be used by nodes to verify whether transactions were part of a block without storing every byte of data within that block. Despite its efficacy, the Patricia Merkle Tree scales poorly – the tree deepens and proofs grow as more data is added – so the storage and proving burden on nodes increases in proportion to the network’s usage. This has the potential to discourage individuals from becoming validators and existing validators from remaining network participants.
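To make the idea concrete, below is a minimal sketch of a binary Merkle tree and an inclusion proof. It is illustrative only – Ethereum’s actual structure is a hexary Patricia trie hashed with Keccak-256, while SHA-256 and the helper names here are stand-ins.

```python
import hashlib

def h(data: bytes) -> bytes:
    """Hash function; Ethereum's tries use Keccak-256, SHA-256 stands in here."""
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Build a simple binary Merkle tree and return its root."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:               # duplicate the last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves: list[bytes], index: int) -> list[bytes]:
    """Collect the sibling hashes needed to prove leaves[index] is in the tree."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        proof.append(level[index ^ 1])   # the sibling next to us at this level
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(root: bytes, leaf: bytes, index: int, proof: list[bytes]) -> bool:
    """Recompute the path to the root using only the leaf and its siblings."""
    node = h(leaf)
    for sibling in proof:
        node = h(node + sibling) if index % 2 == 0 else h(sibling + node)
        index //= 2
    return node == root

txs = [f"tx-{i}".encode() for i in range(8)]
root = merkle_root(txs)
proof = merkle_proof(txs, 5)
assert verify(root, txs[5], 5, proof)    # inclusion proven with only 3 hashes
```

The key point is that the verifier only needs the root and a handful of sibling hashes, not the whole block – but the number of hashes grows with the tree’s depth, which is exactly the pressure Verkle Trees relieve.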

On the other hand, Verkle Trees make use of Kate (KZG) polynomial vector commitments, requiring fewer hashes per tree thanks to a much wider branching factor. Introduced by John Kuszmaul in 2018, Verkle Trees reduce the difficulty of generating proofs that transactions were included in blocks, thereby reducing the burden on nodes storing blockchain data. In this context, although they serve the same purpose, the distinguishing feature between Merkle and Verkle Trees is that the latter is meaningfully more effective at shrinking proofs. Statistically, the proof size of Verkle Trees can be decreased by a factor of 6-8 compared to optimised Merkle trees and by a factor of 20-30 compared to the Patricia Merkle Trees used by Ethereum. Additionally, Verkle Trees pave the path for Ethereum to become stateless, whereby blocks carry compact “witnesses” that let validators verify them without holding the full state.

Verkle Tree
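The back-of-envelope sketch below illustrates why a wider tree helps: proof size grows with depth, and depth shrinks as the branching factor widens. The widths (16 for the hexary Patricia trie, 256 for Verkle) reflect common descriptions, but the byte constants are rough assumptions chosen purely to reproduce the order of magnitude of the savings quoted above.

```python
import math

def tree_depth(num_keys: int, width: int) -> int:
    """Levels needed when every internal node has `width` children."""
    return math.ceil(math.log(num_keys, width))

def merkle_proof_bytes(num_keys: int, width: int = 16, hash_size: int = 32) -> int:
    """A Merkle branch reveals the (width - 1) sibling hashes at every level."""
    return tree_depth(num_keys, width) * (width - 1) * hash_size

def verkle_proof_bytes(num_keys: int, width: int = 256,
                       commitment_size: int = 32, opening_proof: int = 150) -> int:
    """A Verkle branch carries one commitment per level plus a single,
    roughly constant-size opening proof, whatever the width (rough assumption)."""
    return tree_depth(num_keys, width) * commitment_size + opening_proof

n = 10**9                        # ~a billion keys in the state
print(merkle_proof_bytes(n))     # ~3,840 bytes for a hexary Merkle branch
print(verkle_proof_bytes(n))     # ~278 bytes for the equivalent Verkle branch
```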

Furthermore, The Verge seeks to benefit validators by making the verification process for blocks more efficient through Zero Knowledge Succinct Non-Interactive Arguments of Knowledge (zk-SNARKs). This cryptographic primitive allows data to be verified in milliseconds without sharing or publicising any information beyond what is being proved. In the context of verifying blocks, zk-SNARKs mean far less transaction information must be downloaded and verified to convince validators of a block’s validity. Fundamentally, a “fully SNARKed Ethereum” results in only the zk-SNARK needing verification, as opposed to all of the data within each block. To achieve this, Ethereum’s computation mechanism must natively support these zero knowledge proofs, and SNARKs must be created for the newly enshrined Verkle Trees.

The Purge

The penultimate stage of Ethereum’s roadmap is The Purge, wherein old data will be pruned and eliminated. This phase primarily focuses on cleaning up the network to reduce bloat and congestion. Improvement proposals will be designed so that clients no longer store historical data that is over one year old. By purging this block data, the storage and hardware requirements for nodes will fall, concurrently decreasing the network’s bandwidth requirements.

Once historical data can be pruned automatically, the single state Verkle Tree implemented after The Verge can be replaced with a list of trees representing Ethereum’s state; each tree in the list will retain a single year’s worth of transaction data. Such a change would make it easier for an entity to act as a witness, given that they only need to verify the state tree covering blocks proposed within the past year. Therefore, the end goal of this phase is state expiry – state that has not been modified in the past two periods (years) is no longer stored by clients.

The Splurge

Finally, the fifth part of Ethereum’s roadmap, The Splurge, is where the other “miscellaneous but important” features will be actualised. Unlike the improvements under the previous phases, The Splurge looks to better Ethereum holistically, with the majority of its alterations operating independently of one another. Nonetheless, the underlying purpose of Ethereum’s roadmap remains prevalent in this phase: making Ethereum more usable, equitable and efficient.

For Buterin, account abstraction (AA) has been fundamental to making the Ethereum blockchain more user-friendly. Rather than accounts being externally owned addresses controlled by a single private key, AA would see Ethereum accounts become smart contract wallets with programmable validation logic – enabling features such as social recovery, sponsored gas fees and batched transactions. Abstracting users’ accounts would placate fears such as losing a seed phrase or mistyping a wallet address when requesting or sending money, since wallets could enforce their own safety rules. Accordingly, many in the community contend that without this change, decentralised applications and blockchains will fail to compete with the ease and clarity of using web2 sites. Despite the full realisation of AA still being some way off, two improvement proposals have been created. Initially, Buterin and other Ethereum Foundation members wrote EIP-2938, which introduced a new account-abstracted transaction type at the protocol level. Secondly, EIP-4337 was released, implementing AA through a separate mempool of “UserOperations”, requiring no consensus-layer changes. Together, these proposals approach AA at the protocol level and the application level respectively.
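For illustration, the sketch below mirrors the UserOperation object described in the EIP-4337 draft at the time of writing; the field names follow that draft, while the comments are our own summary and the structure is simplified rather than client code.

```python
from dataclasses import dataclass

@dataclass
class UserOperation:
    """The pseudo-transaction object that EIP-4337 wallets submit to the
    alternative mempool; bundlers package these into calls to a global
    EntryPoint contract, so no consensus-layer change is required."""
    sender: str                    # the smart contract wallet being operated
    nonce: int                     # anti-replay value managed by the wallet
    init_code: bytes               # deploys the wallet on first use, else empty
    call_data: bytes               # the call the wallet should execute
    call_gas_limit: int            # gas for the execution step
    verification_gas_limit: int    # gas for the wallet's own validation logic
    pre_verification_gas: int      # covers bundler overhead
    max_fee_per_gas: int           # EIP-1559-style fee fields
    max_priority_fee_per_gas: int
    paymaster_and_data: bytes      # optional sponsor paying gas for the user
    signature: bytes               # validated by the wallet, not the protocol
```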

The Splurge additionally features an Ethereum Virtual Machine (EVM) improvement track. The EVM, akin to other virtual machines like the Java Virtual Machine, executes compiled code; in the context of blockchains, it is what gives smart contracts meaning. Every node runs an instance of the EVM so that the smart contract calls in each block can update the state of Ethereum. Some proposed improvements to the EVM include giving it the ability to assign different objects to smart contracts, similar to what can be done with the Move language, and elevating the EVM’s capacity to execute larger algorithmic functions efficiently by importing mathematical modules, among other changes.

The aforementioned improvements to Ethereum do not directly increase the blockchain’s security; instead, they strive to advance Ethereum’s decentralisation and scalability. Ensuring that the blockchain remains safeguarded from a plethora of exploits and attacks, without simultaneously sparking higher gas fees and lower transactions per second, has long been a struggle for blockchains. However, with the implementation of verifiable delay functions (VDFs), committee leaders and validators can be randomly selected in Ethereum’s PoS consensus layer without the threat of an entity predicting when they will be chosen to propose a block. VDFs are functions which require a significant amount of sequential computation to find a solution, but once found, the solution can be efficiently verified. In this context, these functions create a timelock before a problem can be solved. The delay element of VDFs makes them vastly superior to verifiable random functions, as malicious validators cannot influence or predict the output. Integrating this technology into Ethereum will keep the chain secure as long as there is a single honest validator who works to solve the VDF or proves that malicious actors’ solutions are inaccurate.
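The toy below shows only the delay property of a VDF – a chain of hashes that cannot be parallelised. The function names are hypothetical, and unlike production constructions (such as Pietrzak’s or Wesolowski’s), it has no succinct proof, so its check simply recomputes the chain.

```python
import hashlib

def vdf_eval(seed: bytes, iterations: int) -> bytes:
    """Iterated hashing: each step depends on the previous one, so the work
    cannot be parallelised away -- this is the 'delay' in a VDF."""
    out = seed
    for _ in range(iterations):
        out = hashlib.sha256(out).digest()
    return out

def vdf_check(seed: bytes, iterations: int, claimed: bytes) -> bool:
    """Toy check that simply recomputes the chain. Production VDFs
    (e.g. Pietrzak or Wesolowski, built on repeated squaring) ship a
    succinct proof so verification is far cheaper than evaluation."""
    return vdf_eval(seed, iterations) == claimed

# A proposer-selection seed that nobody can grind ahead of time:
output = vdf_eval(b"randao-revealed-seed", 1_000_000)
assert vdf_check(b"randao-revealed-seed", 1_000_000, output)
```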

Danksharding

Danksharding is widely believed to be the panacea for Ethereum’s scalability issues. Named after Ethereum researcher Dankrad Feist, Danksharding is fundamentally important to Ethereum’s roadmap as it will elevate the blockchain’s ability to offer sharding. Unlike previously proposed approaches to sharding, this model does not strive to add separate transaction shards, but instead reduces the amount of data that needs to be stored on-chain, allowing more space for data blobs (binary large objects).

Danksharding is premised on the concept that different shards do not need their own blocks proposed by separate validators. Rather, all of the shard data can be bundled together, enabling a single participant to propose one block containing all of the transaction data. This idea has the capacity to resolve the scaling detriments of layer 2s; rollups compress and post sizable amounts of data to the Ethereum mainnet, forcing all nodes to download and verify it. Danksharding will separate Ethereum’s settlement and data availability layers, sharding data availability so that more transactional data can be allocated to each block.

EIP-4844: Proto-Danksharding

Although Danksharding is seen as the endgame for Ethereum with respect to scalability, coding the requirements for this type of sharding and data sampling is extremely arduous. Indeed, experts at the Ethereum Foundation have speculated that Danksharding might only be implemented in 2-5 years. Consequently, an interim solution known as Proto-Danksharding has been proposed by Diederik Loerakker, also known as Proto Lambda.

This proposal is more concrete than Danksharding; Proto-Danksharding already has an Ethereum Improvement Proposal (EIP), EIP-4844, more formally known as Shard Blob Transactions. Despite the reduced implementation difficulty, the Ethereum Foundation recently announced that it would no longer aim for EIP-4844 to go live during the Shanghai fork. Distinguishing itself from full Danksharding, Proto-Danksharding introduces a new transaction format for blobs which will be utilised by layer 2s. These rollup networks currently make use of storage space on the Ethereum mainnet – the calldata they post remains on the layer 1 permanently. Yet this data is only necessary for a short period of time, namely the window in which an individual may look for faults in the posted data in order to submit a fraud proof and earn rewards.

Under EIP-4844, blob transactions will increase Ethereum’s capability to act as a data availability layer, thereby reducing the cost for rollups of posting data that today lives in calldata. Notably, these data blobs are deleted after a month to limit storage requirements for validators while still providing data availability. As such, Proto-Danksharding will increase block capacity significantly; at the moment, the average Ethereum block size is just under 90 KB, whereas EIP-4844 will raise it to roughly 1 MB – more than a 10x jump in block capacity. Further, given the compression feature of rollups, Proto-Danksharding is expected to reduce transaction fees on layer 2s by up to 100x.
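A rough calculation reproduces those headline figures, assuming the blob parameters discussed in the EIP-4844 draft at the time (4,096 field elements of 32 bytes per blob and a target of 8 blobs per block – figures that remained subject to change):

```python
FIELD_ELEMENTS_PER_BLOB = 4096          # per the EIP-4844 draft
BYTES_PER_FIELD_ELEMENT = 32
TARGET_BLOBS_PER_BLOCK = 8              # figure discussed at the time; later revised

blob_size = FIELD_ELEMENTS_PER_BLOB * BYTES_PER_FIELD_ELEMENT   # 131,072 B = 128 KiB
blob_data_per_block = TARGET_BLOBS_PER_BLOCK * blob_size        # ~1 MiB of blob space

avg_block_size_today = 90 * 1024        # ~90 KB average block payload
print(blob_data_per_block / 1024)                  # ~1024 KiB per block
print(blob_data_per_block / avg_block_size_today)  # ~11x -- the ">10x jump"
```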

EIP-4488: Calldata Gas Cost Reduction With Total Calldata Limit

The implementation of Proto-Danksharding puts Ethereum on the course to full Danksharding; however, the plight of exorbitant gas fees needs to be ameliorated immediately. High gas fees often form a barrier that discourages new users from using Ethereum and concurrently lowers the volume transacted on the blockchain. Indeed, Ethereum gas fees have soared to hundreds of dollars. In October 2021, Vitalik Buterin and Ansgar Dietrichs published EIP-4488, titled “Transaction calldata gas cost reduction with total calldata limit”. According to Buterin, the proposal has the capacity to decrease gas fees on rollups by around 5x.

The purpose of EIP-4488 is to decrease transaction calldata gas costs while adding a limit to the total calldata per block. Evidently, EIP-4488 is positioned as a predecessor to Proto-Danksharding. Transaction calldata is the data supplied with a call into a smart contract executed by the Ethereum Virtual Machine (EVM); carrying this data incurs a gas fee. Further, when a contract function is called externally, the arguments of that call are carried in calldata. As a block becomes saturated, storing additional calldata gets more expensive; the price users are willing to pay is determined by the demand for blockspace, eventually reaching an equilibrium for a single block.

EIP-4488 achieves its purpose by reducing the gas charged for calldata from 16 gas per byte to 3 gas per byte. Clearly, this will not improve Ethereum’s scalability at the base layer, but rather will allow more users to transact affordably on the blockchain. Furthermore, the proposal adds a calldata limit of 1,048,576 bytes per block. Notably, this second component protects Ethereum against blocks becoming overly replete with calldata once its gas cost is reduced.
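A worked example makes the saving tangible. Note that current pricing actually charges 16 gas per non-zero calldata byte and 4 per zero byte; the sketch follows the article’s simplified 16-to-3 comparison and assumes a fully non-zero 100 KB rollup batch:

```python
# Calldata cost for a rollup batch posting 100 KB of (non-zero) calldata.
OLD_CALLDATA_GAS_PER_BYTE = 16              # current cost for a non-zero byte
NEW_CALLDATA_GAS_PER_BYTE = 3               # proposed flat cost under EIP-4488
BASE_CALLDATA_LIMIT_PER_BLOCK = 1_048_576   # the new per-block cap, in bytes

batch_bytes = 100 * 1024
old_gas = batch_bytes * OLD_CALLDATA_GAS_PER_BYTE   # 1,638,400 gas
new_gas = batch_bytes * NEW_CALLDATA_GAS_PER_BYTE   #   307,200 gas
print(old_gas / new_gas)                            # ~5.3x cheaper calldata

# The cap guards against blocks stuffed with now-cheap calldata:
assert batch_bytes <= BASE_CALLDATA_LIMIT_PER_BLOCK
```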

Despite the ease of putting EIP-4488 into effect, given that it necessitates only minimal changes to the Ethereum code, a commonly raised criticism is that calldata cannot be pruned. Whereas data blobs can be trimmed after a month, calldata remains in Ethereum’s storage in perpetuity. Accordingly, those running nodes would face increased workloads with regard to the data they must store.

EIP-4444: Bounding Historical Data in Execution Clients

As a solution to some of the negative implications of EIP-4488, another improvement proposal, EIP-4444, authored by George Kadianakis, lightclient and Alex Stokes, will enable clients to locally prune their historical data, thereby mitigating disk space requirements for validators. Fundamentally, each node must store Ethereum’s history and state; currently, this equates to over 966 GB of data. History refers to every transaction in each block since Ethereum’s genesis block; it grows with the passage of time, placing greater strain on validators with respect to storage requirements. State, on the other hand, refers to all existing accounts, their balances, deployed smart contracts and more. Full node operators, which serve the network by providing this data upon request, must store and update this information after every block.

Whereas EIP-4844 sees transaction blobs pruned over time, EIP-4444, named “Bound Historical Data in Execution Clients”, gives clients the capacity to delete historical data that is more than a year old. This would mean nodes are not required to hold the entire history, as only state is needed to validate new blocks. Accordingly, this solves the storage issues created by reducing calldata gas costs, yet it raises the question of who stores the historical data that nodes have pruned. Buterin suggested that block explorers such as Etherscan and Etherchain, individual or institutional volunteers, data indexers such as The Graph or Covalent, and decentralised data storage protocols like Arweave and Filecoin could fill this role. Regardless, reducing storage requirements for nodes lowers the barriers to entry for becoming an Ethereum validator, consequently making the chain more decentralised.
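Conceptually, the pruning rule is simple; the sketch below is a hypothetical illustration (the block structure and function name are ours, not client code):

```python
import time

ONE_YEAR_SECONDS = 365 * 24 * 60 * 60

def prune_history(blocks, now=None):
    """Keep only blocks whose timestamp is within the last year; older bodies
    and receipts are dropped locally and left to explorers, indexers or
    archival networks. `blocks` is a hypothetical list of dicts with a
    "timestamp" field -- an illustration, not client code."""
    now = time.time() if now is None else now
    cutoff = now - ONE_YEAR_SECONDS
    return [block for block in blocks if block["timestamp"] >= cutoff]

recent = prune_history([
    {"number": 1, "timestamp": time.time() - 2 * ONE_YEAR_SECONDS},  # pruned
    {"number": 2, "timestamp": time.time() - 3600},                  # kept
])
assert [b["number"] for b in recent] == [2]
```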

Combating Censorship Within MEV Through Proposer Builder Separation

Underlying EIP-4488 and EIP-4444 is a push to increase Ethereum’s level of decentralisation by enabling more individuals to participate as network validators. However, as we discussed in our article on MEV, as specialised searcher bots leverage optimised strategies to capture more value, the network shifts towards being more centralised. Clearly, more than one facet of Ethereum must be improved to promote decentralisation. Fortunately, a temporary solution to this centralisation risk has been created: MEV-Boost. Created by Flashbots, MEV-Boost gives all validators, irrespective of their coding knowledge, the capacity to capture MEV through a custom client. With the potential to more than double their staking yield, most validators opted to use MEV-Boost; since the merge, over 75% of blocks have been relayed by this client.

MEV-Boost

The revolutionary facet of MEV-Boost is what is known as Proposer Builder Separation (PBS). This concept creates three distinct roles: block builders, relayers and proposers. Block builders are the bots that utilise optimised searching algorithms to construct blocks that maximise the MEV captured. Many entities can take on this role, resulting in a competitive market for profitable blocks. The relayer is tasked with selecting the most valuable block, which is built off-chain, and passing it to the proposer. Finally, the proposer is the specific validator chosen by Ethereum’s pseudo-random algorithm to propose the next block to the chain. The proposed block is the one relayed to them and created by the builder; notably, the block builder is incentivised by rewards based on the value of the block. Together, these roles allow any entity to capture MEV, thereby democratising it.
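The sketch below is a simplified, hypothetical rendering of that auction flow – builders bid, the relay forwards the best bid, and the proposer signs it – with names and values invented purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class BuilderBid:
    builder: str
    block_payload: bytes   # the full, MEV-optimised block built off-chain
    bid_wei: int           # value promised to the proposer

def relay_select(bids: list[BuilderBid]) -> BuilderBid:
    """The relay forwards only the most valuable block to the proposer."""
    return max(bids, key=lambda b: b.bid_wei)

def proposer_sign(best: BuilderBid) -> str:
    """The chosen validator signs the winning block's header without building
    it; the relay then releases the payload, so the proposer never needs an
    MEV strategy of its own."""
    return f"signed header from {best.builder} worth {best.bid_wei} wei"

bids = [
    BuilderBid("builder-a", b"...block-a...", 120_000_000_000_000_000),
    BuilderBid("builder-b", b"...block-b...", 145_000_000_000_000_000),
]
print(proposer_sign(relay_select(bids)))   # builder-b wins the block auction
```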

However, Flashbots is a US-regulated entity and accordingly remains compliant with the Office of Foreign Assets Control (OFAC). Upon OFAC sanctioning the cryptocurrency mixer Tornado Cash, its smart contracts and addresses that interacted with the protocol, Flashbots began censoring transactions relating to these wallets. As a consequence, Ethereum has faced a form of de facto censorship at the block-production level; transactions from and to addresses on the OFAC’s Specially Designated Nationals list were not being included in blocks proposed through Flashbots’ MEV-Boost. Since the merge, over 75% of blocks have been OFAC compliant.

The solution to this issue is enshrining PBS into Ethereum itself, as opposed to it being accessed via external clients. Many design considerations must be addressed to build this effectively. Firstly, block builders must be constrained in their ability to censor transactions without pushing proposers back towards ordering blocks to capture MEV themselves. Moreover, developers must determine whether the block builder market becomes public enough that validators can choose whether to propose blocks containing censored transactions. Evidently, many limitations and edge cases need to be documented and resolved before altering the base layer of Ethereum. On a recent Bankless podcast, Stephane Gosselin, co-founder of Flashbots, elucidated that because Flashbots and the Ethereum Foundation cannot yet optimise MEV-Boost without observing its performance, enshrining PBS in Ethereum will be “impossible [for] the next year or two”.

Data Availability Sampling

As explained throughout the article, data availability is crucial to Ethereum as the blockchain moves towards modularising its stack. Subsequent to the merge, Ethereum has two distinct yet coupled layers: the execution layer and the consensus layer. In the future, a data availability layer will be introduced to Ethereum, expediting the implementation of Danksharding, EIP-4844 and enshrined PBS. Undoubtedly, with Optimistic and Zero Knowledge rollups compressing millions of transactions, this foundationally important layer could substantially increase storage requirements.

To rectify this, various cryptographic and mathematical schemes are being tested that allow nodes to sample the data availability layer, reconstructing blocks without the need to download all of Ethereum’s historical data. These solutions primarily rely on polynomial erasure coding of the layer’s data – fragmenting the data into parity blocks and encoding it so that the original data can be recovered even when parts of the distributed information are unavailable. By leveraging Reed-Solomon codes, which detect and recover lost or corrupted shards of data, nodes can ascertain the transactions in a block. As Ethereum edges closer to actualising the data availability layer, methods for data availability sampling will improve and be finalised, heralding substantial benefits to the entire network.
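As a toy illustration of the erasure-coding idea, the sketch below works over a small prime field: four data chunks define a polynomial, extra evaluations act as parity, and any four of the eight shares recover the original data. Production designs operate over the BLS12-381 scalar field and pair the encoding with KZG commitments; everything else here is simplified for clarity.

```python
P = 65537  # toy prime field; real systems use the BLS12-381 scalar field

def lagrange_at(shares, x):
    """Evaluate, at x, the unique polynomial through the given (xi, yi) shares (mod P)."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

def encode(data, n):
    """Treat k data chunks as evaluations at x = 0..k-1 and extend to n shares;
    the extra evaluations are the parity that makes the data recoverable."""
    k = len(data)
    base = list(enumerate(data))                        # systematic shares
    parity = [(x, lagrange_at(base, x)) for x in range(k, n)]
    return base + parity

def recover(any_k_shares, k):
    """Any k of the n shares reconstruct the original data chunks."""
    return [lagrange_at(any_k_shares, x) for x in range(k)]

data = [11, 22, 33, 44]                 # 4 "chunks" of block data
shares = encode(data, 8)                # 8 shares: lose any 4, the data survives
surviving = [shares[1], shares[4], shares[6], shares[7]]
assert recover(surviving, k=4) == data
```

Data availability sampling builds on exactly this property: light nodes randomly sample a few shares, and if enough samples succeed across the network, the full data is guaranteed to be recoverable.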

Conclusion

Most economic structures are in a perennial state of improvement; Ethereum is no different. In order to retain the title of “The World Computer”, the blockchain requires improvements in its scalability to increase network usability. Indeed, Metcalfe’s Law, which stipulates that the value of a network grows with the square of its number of users, suggests these upgrades will see Ethereum appreciate in value as it cements its position as the forum for blockchain-related innovation. Nonetheless, it would be quixotic to assume that all of these improvements will occur without obstacles; however, with communication and coordination, the long and winding road to scaling Ethereum can be conquered.

DISCLAIMER

Zerocap Pty Ltd carries out regulated and unregulated activities.

Spot crypto-asset services and products offered by Zerocap are not regulated by ASIC. Zerocap Pty Ltd is registered with AUSTRAC as a DCE (digital currency exchange) service provider (DCE100635539-001).

Regulated services and products include structured products (derivatives) and funds (managed investment schemes), which are available to Wholesale Clients only as per Sections 761GA and 708(10) of the Corporations Act 2001 (Cth) (Sophisticated/Wholesale Client). To serve these products, Zerocap Pty Ltd is a Corporate Authorised Representative (CAR: 001289130) of AFSL 340799.

All material in this website is intended for illustrative purposes and general information only. It does not constitute financial advice nor does it take into account your investment objectives, financial situation or particular needs. You should consider the information in light of your objectives, financial situation and needs before making any decision about whether to acquire or dispose of any digital asset. Investments in digital assets can be risky and you may lose your investment. Past performance is no indication of future performance.

FAQs

What is the focus of Ethereum’s roadmap?

Ethereum’s roadmap is divided into five categories: The Surge, The Scourge, The Verge, The Purge, and The Splurge. Each phase focuses on different aspects such as scalability, censorship resistance, node and validator scaling, data purification, and user experience improvements.

What is Proto-Danksharding in the context of Ethereum?

Proto-Danksharding is an interim solution proposed for Ethereum’s scalability issues. It introduces a new transaction format for blobs which will be utilized by Layer 2s. These data transaction blobs are deleted after a month to mitigate storage requirements for validators and increase data availability.

What is the purpose of EIP-4488 in Ethereum’s development?

EIP-4488 aims to decrease transaction calldata gas cost by adding limitations to the total calldata per block. This proposal will not directly benefit Ethereum’s scalability at a base layer, but it will allow more users to transact on the blockchain by reducing gas costs.

What is the Proposer Builder Separation (PBS) model?

The Proposer Builder Separation (PBS) model creates three unique roles: block builders, relayers, and proposers. Block builders are bots that utilize optimized searching algorithms to find and create strategies to maximize MEV captured. The relayer selects the most optimized block, which is then passed to the proposer, the specific validator chosen by Ethereum’s pseudo-random algorithm to propose the next block to the chain.

What is the goal of The Purge phase in Ethereum’s roadmap?

The Purge is the penultimate stage of Ethereum’s roadmap where old data will be purified and eliminated. This phase primarily focuses on cleaning up the network to reduce inundation and congestion. Improvement proposals will be designed that result in clients not storing historical data that is over 1 year old. By purging this block data, the storage and hardware requirements for nodes will reduce, concurrently decreasing the bandwidth of the network.
