An implementation and benchmarking of various cache policies in the go-ethereum (geth) blockchain node, with the aim of potentially improving read call (eth_call) performance.
- Go-ethereum (geth): The official Go implementation of the Ethereum protocol, used to run a node and interact with the Ethereum network.
- Solidity: Programming language for writing smart contracts.
- Sepolia: Ethereum Testnet.
- Remix: Integrated Development Environment for writing, testing, and deploying Ethereum smart contracts.
- Etherscan: Used to access Ethereum network information such as transaction history, smart contract details, and blockchain statistics.
- Faucets: Web apps that distribute small amounts of test cryptocurrency for free.
- Metamask: Browser extension enabling interaction with the Ethereum blockchain and access to dApps.
- Set up a Go Ethereum node on the local machine, synced to the Sepolia testnet
- Develop smart contracts and deploy them to Sepolia for testing
- Implement various caching algorithms in the local copy of go-ethereum
- Develop a script to call the smart contracts and log the latency (a sketch of such a script follows this list)
- Run the script against each caching policy implementation and benchmark the results
- Compare and contrast the caching policies in terms of their performance
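As a rough illustration, the latency-logging script could look like the sketch below, built on go-ethereum's ethclient package. The RPC endpoint, contract address, and the getMessage() selector are placeholders for illustration, not the actual values used in our runs.

```go
package main

import (
	"context"
	"log"
	"time"

	"github.com/ethereum/go-ethereum"
	"github.com/ethereum/go-ethereum/common"
	"github.com/ethereum/go-ethereum/crypto"
	"github.com/ethereum/go-ethereum/ethclient"
)

func main() {
	// Connect to the local geth node synced to Sepolia (placeholder endpoint).
	client, err := ethclient.Dial("http://localhost:8545")
	if err != nil {
		log.Fatal(err)
	}

	// Placeholder contract address and function selector for a read-only getter.
	contract := common.HexToAddress("0x0000000000000000000000000000000000000000")
	selector := crypto.Keccak256([]byte("getMessage()"))[:4]

	msg := ethereum.CallMsg{To: &contract, Data: selector}

	// Issue eth_call repeatedly and log the per-call latency.
	for i := 0; i < 100; i++ {
		start := time.Now()
		if _, err := client.CallContract(context.Background(), msg, nil); err != nil {
			log.Printf("call %d failed: %v", i, err)
			continue
		}
		log.Printf("call %d latency: %v", i, time.Since(start))
	}
}
```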
We implemented 6 different caching policies (a minimal LRU sketch follows the list):
- First In First Out (FIFO)
- Last In First Out (LIFO)
- Least Recently Used (LRU)
- Most Recently Used (MRU)
- Random Replacement (RR)
- Least Frequently Used (LFU)
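As an illustration of one of these policies, below is a minimal LRU cache sketch in Go. This is not the exact code patched into geth; the other policies differ only in which entry is evicted when the cache is full.

```go
package cache

import "container/list"

// LRUCache evicts the least recently used entry when capacity is exceeded.
type LRUCache struct {
	capacity int
	order    *list.List               // front = most recently used
	items    map[string]*list.Element // key -> element in order
}

type entry struct {
	key   string
	value []byte
}

func NewLRUCache(capacity int) *LRUCache {
	return &LRUCache{
		capacity: capacity,
		order:    list.New(),
		items:    make(map[string]*list.Element),
	}
}

// Get returns the cached value and marks the key as recently used.
func (c *LRUCache) Get(key string) ([]byte, bool) {
	if el, ok := c.items[key]; ok {
		c.order.MoveToFront(el)
		return el.Value.(*entry).value, true
	}
	return nil, false
}

// Put inserts or updates a value, evicting the LRU entry if needed.
func (c *LRUCache) Put(key string, value []byte) {
	if el, ok := c.items[key]; ok {
		el.Value.(*entry).value = value
		c.order.MoveToFront(el)
		return
	}
	if c.order.Len() >= c.capacity {
		oldest := c.order.Back()
		c.order.Remove(oldest)
		delete(c.items, oldest.Value.(*entry).key)
	}
	c.items[key] = c.order.PushFront(&entry{key: key, value: value})
}
```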
Each cache is refreshed on a 10-second interval.
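One possible way to wire up that refresh interval is a background goroutine driven by a time.Ticker; the Purge method below is a hypothetical placeholder for whatever refresh or invalidation the actual implementation performs.

```go
package cache

import "time"

// Purger is any cache that can drop or refresh its contents.
type Purger interface {
	Purge()
}

// StartRefresh purges the cache every interval (10 seconds in our runs)
// until stop is closed. Purge is a placeholder for the real refresh logic.
func StartRefresh(c Purger, interval time.Duration, stop <-chan struct{}) {
	ticker := time.NewTicker(interval)
	go func() {
		defer ticker.Stop()
		for {
			select {
			case <-ticker.C:
				c.Purge()
			case <-stop:
				return
			}
		}
	}()
}
```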
The following smart contracts were implemented to exercise eth calls:
- Simple Set and Get Message
- Simple counter
- Voting contract
- Auction
- Banking
- Marketplace
- To-do list
- Identity Verification
- Personal Information Storage
- Reward Mechanism Contract
Latency for caching policies of different sizes.
Hit/Miss Ratio (for 100 calls)
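Assuming the ratio is derived from simple hit and miss counters kept by the cache layer (an assumption; the original bookkeeping is not shown here), the computation amounts to:

```go
// Stats tracks cache hits and misses during a benchmark run.
type Stats struct {
	Hits, Misses int
}

// RecordHit and RecordMiss are called from the cache's Get path.
func (s *Stats) RecordHit()  { s.Hits++ }
func (s *Stats) RecordMiss() { s.Misses++ }

// Ratio returns hits divided by misses; if there are no misses,
// it returns the hit count to avoid dividing by zero.
func (s Stats) Ratio() float64 {
	if s.Misses == 0 {
		return float64(s.Hits)
	}
	return float64(s.Hits) / float64(s.Misses)
}
```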
- All caching policies lead to broadly similar average latency.
- Running without a caching layer shows latency similar to running with a caching layer at the RPC level.
- Different contracts show only slightly different results: latency does not depend significantly on the contract.
- The hit/miss ratio generally increases with cache size.
- LFU achieved a better hit/miss ratio than the other caching policies.