# Release notes

Source: https://docs.chainstack.com/changelog

* **Developer Portal** — Chainstack Developer Portal (this very site) migrated to [Mintlify](https://mintlify.com/) for AI-ready features, an open-source repository, and snappier performance overall.
* **Nodes** and **Add-ons**. You can now enable [Add-ons](/docs/add-ons): [Yellowstone gRPC Geyser plugin](/docs/yellowstone-grpc-geyser-plugin) on Solana and [Unlimited Node](/docs/unlimited-node) on any node.
* **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) and [Dedicated Nodes](/docs/dedicated-node) in archive mode with Debug & Trace APIs for the Sonic Blaze testnet.
* **Protocols**. Now, you can deploy full [Global Nodes](/docs/global-elastic-node) for the Aptos testnet.
* **Protocols**. Now, you can deploy full [Global Nodes](/docs/global-elastic-node) and [Dedicated Nodes](/docs/dedicated-node) for the TRON Nile testnet.
* **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) for the Aptos mainnet.
* **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) for the Polygon zkEVM mainnet.
* **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) in full mode for the [TRON](/docs/tron-tooling) mainnet.
* **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) for the Gnosis Chain Chiado testnet.
* **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) in archive mode with debug & trace APIs for the Gnosis Chain mainnet.
* **Nodes**. Now you can toggle MEV protection for your transactions on the Binance Smart Chain mainnet nodes. See [MEV protection](/docs/mev-protection).
* **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) in archive mode with debug & trace APIs for the Sonic mainnet. See also a fun tutorial: [Sonic: Swap farming for points walkthrough in Python](/docs/sonic-swap-farming-for-points-walkthrough-in-python).
* **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) in archive mode for the Cronos mainnet.
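Several entries above mention debug & trace APIs. As a minimal sketch, tracing a transaction with the Geth-style `debug_traceTransaction` method over JSON-RPC could look like the following; the endpoint URL and transaction hash are placeholders you would replace with your own node credentials, and `callTracer` is one of the standard Geth tracers:

```python
import json
import urllib.request

# Placeholder — substitute your node's HTTPS endpoint from the Chainstack console.
CHAINSTACK_ENDPOINT = "https://your-node.example.com/your-key"


def build_trace_request(tx_hash, tracer="callTracer", request_id=1):
    """Build a Geth-style debug_traceTransaction JSON-RPC payload."""
    return {
        "jsonrpc": "2.0",
        "method": "debug_traceTransaction",
        "params": [tx_hash, {"tracer": tracer}],
        "id": request_id,
    }


def trace_transaction(endpoint, tx_hash):
    """POST the trace request to the node and return the decoded JSON response."""
    data = json.dumps(build_trace_request(tx_hash)).encode()
    req = urllib.request.Request(
        endpoint, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The same payload shape works for the other `debug_*` and `trace_*` methods listed in the per-protocol API references, with only `method` and `params` changing.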
* **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) for Bitcoin testnet.
* **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) for Bitcoin mainnet.
* **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) in archive mode with debug & trace APIs for ZKsync Era mainnet.
* **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) in archive mode with debug & trace APIs for Ronin Saigon testnet.
* **Pricing**. New Pro plan introduced. See [Pricing](https://chainstack.com/pricing).
* **Protocols**. All Tezos networks are now deprecated.
* **Platform**. Improved [node requests metrics](/docs/manage-your-node#view-node-requests-metrics)—the data is now granular to 1 minute; the timeframes are now 1 hour, 6 hours, 12 hours, 24 hours, and 7 days.
* **Nodes**. Now you can toggle MEV protection for your transactions on the Ethereum mainnet nodes. See [MEV protection](/docs/mev-protection).
* **Nodes**. The Fantom nodes are now running on the Sonic client.
* **Nodes**. The Solana nodes on the Devnet are now running on the Agave client. See also [Solana Agave 2.0 upgrade reference](/docs/solana-agave-20-upgrade-reference).
* **Faucet**. The [Chainstack faucet](https://faucet.chainstack.com/) has a brand new and highly improved interface.
* **Faucet**. The [Chainstack faucet](https://faucet.chainstack.com/) can now disperse testnet TON.
* **Platform**. You can now access your requests stats for nodes and subgraphs. See [Statistics](/docs/see-statistics).
* **Protocols**. Now, you can deploy global elastic nodes for TON Mainnet & Testnet. See also [TON tooling](/docs/ton-tooling), [TON tutorials](/docs/protocols-tutorials), and the [TON API reference](/reference/getting-started-ton).
* **Protocols**. Now, Polygon zkEVM supports connecting over WebSocket.
* Trader nodes on Solana and 150+ cryptocurrencies to top up your balance.
  * **Trader nodes**. You can now send high-speed transactions on Solana. See [Trader nodes](/docs/warp-transactions).
  * **Billing**. You can now top up your balance with 150+ cryptocurrencies using NOWPayments as the payment provider. See [Manage your billing](/docs/manage-your-billing).
* **Global elastic nodes**. Debug & trace on Fantom and `blockSubscribe` on Solana.
  * Fantom — you can now deploy global elastic nodes with debug & trace APIs for Fantom Mainnet. See the [debug & trace API reference](/reference/debug_traceblockbyhash-fantom-chain).
  * Solana — you can now use the `blockSubscribe` method on the Solana Mainnet & Devnet. See the [blockSubscribe API reference](/reference/blocksubscribe-solana).
* **Protocols**. Now, you can deploy global elastic nodes for opBNB Mainnet.
* **Global elastic nodes**. Now, you can deploy global elastic nodes for Oasis Sapphire Mainnet and Testnet.
* **Protocols**. Now, you can deploy global elastic nodes for Blast Mainnet.
* **Protocols**. Now, you can deploy global elastic nodes for Celo Mainnet and Moonbeam Mainnet.
* **Protocols**. Now, you can deploy global elastic nodes for Klaytn Mainnet.
* **Global elastic nodes**. Now, you can deploy global elastic nodes in archive mode with debug & trace APIs for Avalanche Mainnet.
* **Global elastic nodes**. Now, Scroll mainnet nodes are deployed with [debug and trace APIs](/docs/debug-and-trace-apis).
* **Global elastic nodes**. Now, you can deploy global elastic nodes in archive mode for Solana Mainnet.
* **Global elastic nodes**. Now, you can deploy global elastic nodes in archive mode with debug & trace APIs for Arbitrum One Mainnet.
* **Global elastic nodes** for Avalanche Fuji Testnet and Polygon Amoy Testnet.
  * Avalanche Fuji Testnet — you can now deploy global elastic nodes in archive mode with debug & trace APIs.
  * Polygon Amoy Testnet — you can now deploy global elastic nodes in archive mode with debug & trace APIs.
* **Global elastic nodes**. Now, you can deploy global elastic nodes in archive mode with debug & trace APIs for BNB Smart Chain Testnet.
* **Protocols**. All NEAR nodes are now deprecated. Note that Aurora remains supported.
* **Global elastic nodes**. Now, you can deploy global elastic nodes in archive mode with debug & trace APIs for Arbitrum Sepolia Testnet.
* **Global elastic nodes** for Ethereum Holešky & Sepolia.
  * Ethereum Holešky Testnet — you can now deploy [global elastic nodes](/docs/global-elastic-node) in archive mode with debug & trace APIs. The node client is running Erigon.
  * Ethereum Sepolia Testnet — you can now deploy [global elastic nodes](/docs/global-elastic-node) in archive mode with debug & trace APIs. The node client is running Erigon.
* **Global elastic nodes**. Now, you can deploy global elastic nodes in archive mode with debug & trace APIs for BNB Smart Chain Mainnet.
* **Global elastic nodes**. Now, you can deploy global elastic nodes in archive mode with debug & trace APIs for Polygon Mainnet.
* **Global elastic nodes**. Now, you can deploy global elastic nodes in archive mode with debug & trace APIs for Ethereum Mainnet.
* **Global elastic nodes**. Now, you can deploy global elastic nodes in archive mode with debug & trace APIs for Ronin Mainnet. The node client is Geth-based.
* **Subgraphs**. Preconfigured and deployed subgraphs called Data APIs released.
* **Global elastic nodes**. Now, you can deploy global elastic nodes for Optimism Mainnet. The node client is op-erigon. You can now enjoy the debug & trace APIs out of the box. For the method run-down, see the [Optimism API reference](//optimism-api-reference).
* **Global elastic nodes**. Now, you can deploy global elastic nodes for Base Mainnet.
* **Global elastic nodes**. Now, you can deploy global elastic nodes for Starknet Sepolia Testnet.
* **Global elastic nodes**. Now you can deploy global elastic nodes for Base Sepolia Testnet, Optimism Sepolia Testnet, and zkSync Era Sepolia Testnet.
* **Billing**. No credit card required on sign-up. Sign up and get the free [Developer plan](https://chainstack.com/pricing/) with 3 million requests monthly.
* **Billing**. Pay-as-you-go released. You can now enable it in your [Billing](https://console.chainstack.com/user/settings/billing) to keep the Chainstack services operational when you reach your plan's quota limit, or disable it to have a hard limit that stops the services on reaching the quota.
* **Starknet**. Starknet mainnet and testnet nodes now support WebSocket.
* **Global elastic node**. Now you can deploy global elastic nodes for Starknet Mainnet.
* **Protocols**. You can now deploy global elastic nodes and dedicated nodes for Ronin Mainnet and Ronin Saigon Testnet.
* **Protocols**. You can now deploy global elastic nodes and dedicated nodes for Arbitrum Sepolia Testnet.
* **Console**. The Developer subscription plan now has a rate limit of 30 requests per second. Note that this does not apply to other subscription plans.
* **Global elastic node**. Now you can deploy global elastic nodes for Ethereum Holešky Testnet.
* **Global elastic node**. Now you can deploy global elastic nodes for Scroll Mainnet and Aurora Testnet.
* **Protocols**. Starknet Testnet2 is now deprecated. For development purposes, use Starknet Testnet.
* **Global elastic node**. Now you can deploy global elastic nodes for Aurora Mainnet.
* **Protocols**. All Fuse networks are now deprecated.
* **Chainstack Subgraphs**. Elastic indexer is now available for Ethereum Sepolia Testnet and Base Mainnet and Testnet.
* **Protocols**. Now you can deploy elastic zkSync Era archive nodes with the debug and trace option enabled.
* **Protocols**. Scroll Sepolia Testnet support.
* **Global elastic node**.
Now you can deploy global elastic nodes for Avalanche and Scroll Sepolia Testnet.
* **Protocols**. Base Mainnet support.
* **Global elastic node**. Now you can deploy global elastic nodes for Solana.
* **Protocols**. Base Goerli Testnet support for elastic nodes.
* **Protocols**. zkSync Era Mainnet support.
* **Billing**. When changing your subscription plan, you can now use promo codes with discounts on a plan's regular price.
* **Global elastic node**. Now you can deploy global elastic nodes for Arbitrum and Fantom.
* **Chainstack Subgraphs**. Elastic indexers are now available for Avalanche, Fantom, and Gnosis Chain mainnets.
* **Protocols**. zkSync Era Goerli Testnet support for elastic and dedicated nodes.
* **Global elastic node**. Now you can deploy global elastic nodes for Polygon and BNB Smart Chain.
* **Protocols**. Archive nodes are now available for Optimism Mainnet.
* **Pricing**
  * Chainstack pricing was rehauled. Check out the changes on our [pricing page](https://chainstack.com/pricing/). And learn more about [how our pricing works](/docs/pricing-introduction).
  * With the pricing rehaul, we're introducing [request units](/docs/pricing-introduction#what-are-request-units). This allows us to offer a fairer and more flexible pricing structure.
* **Platform**. [Global elastic nodes](/docs/global-elastic-node) are available for Ethereum Mainnet. Enjoy geo-balanced nodes with enhanced performance and reduced latency.
* **Networks**. [Optimism Mainnet](/docs/protocols-networks) support.
* **Networks**. Filecoin Hyperspace Testnet is deprecated. For development purposes, use Filecoin Calibration Testnet.
* **Accounts**. Email verification is now mandatory for newly created accounts.
* **IPFS Storage**. You can now create and manage IPFS [dedicated gateways](/docs/ipfs-storage-introduction#what-are-different-types-of-gateways) to have more control over your files.
* **Debug and trace APIs** are now available for elastic Arbitrum archive nodes.
* **Endpoints**. WSS endpoints are now available for nodes with Warp transactions enabled.
* **Services**. A revamped [Chainstack Marketplace](https://console.chainstack.com/marketplace) is now live with the Covalent and Valha applications already available for installation—boost up your DApps!
* **Documentation**. The following article is added in Web3 \[De]Coded:
  * [Mastering JSON web tokens: How to implement secure user authentication](/docs/tutorial-mastering-jwt-how-to-implement-secure-user-authentication)
* **Protocols**. All Oasis Sapphire nodes are now running on Chainstack Cloud. This delivers the lowest latency in the Europe region and handles any workload with zero throttling.
* **Protocols**. Ethereum MEV API is now deprecated since it hasn't been functional since the Merge.
* **Services**. [Chainstack Subgraphs](/docs/subgraphs-introduction) has finished the closed beta and is now open to all our customers.
* **Documentation**. We added a [series of developer tutorials](/docs/chainstack-subgraphs-tutorials) for Chainstack Subgraphs users: from a newbie to an expert. Try them out and let us know what you think.
* **Clouds and regions**. You can now deploy your elastic Arbitrum One Mainnet nodes in the following USA regions:
  * Full nodes — Amazon Web Services Oregon
  * Archive nodes — Virtuozzo Dallas
* **Protocols**. [Polygon zkEVM](/docs/protocols-networks) testnet support for elastic and dedicated nodes.
* **IPFS Storage**. Chainstack launches the closed beta for the decentralized storage solution—IPFS Storage. [Reach out to us](/docs/ipfs-storage-introduction) to participate.
* **Documentation**. [IPFS Storage](/docs/work-with-ipfs-storage) and [IPFS Storage API reference](/reference/chainstack-platform-api-get-pin-list).
* **Protocols**. Ethereum Rinkeby and Ropsten testnets are now deprecated. For development purposes, use [Sepolia and Goerli testnets](/docs/protocols-networks).
* **Protocols**.
Harmony migrated from devnet to [testnet](/docs/protocols-networks) for a better experience.
* **Billing**. You can now settle your failed payment by manually retrying, either via topping up your crypto balance or via paying directly from your credit card.
* **Clouds**. You can now deploy your elastic BNB Smart Chain full nodes in the Virtuozzo Amsterdam region.
* **Tools**. You can now add the node endpoint with the **Add to MetaMask** button on the [node access page](/docs/manage-your-node#view-node-access-and-credentials).
* **Billing**. The refined billing page now shows:
  * Failed payment notification
  * Past due invoices awaiting payments
* **Protocols**. Gnosis Chain Sokol testnet is now deprecated. For development purposes, use the [Gnosis Chain Chiado](/docs/protocols-networks) testnet.
* **Protocols**. [Optimism](/docs/protocols-networks) Goerli testnet support for elastic and dedicated nodes.
* **Billing**. The refined billing page now shows:
  * An overview of your subscription plan and support level
  * A detailed **Usage** table
  * A list of crypto payments and downloadable invoices
* **Protocols**. [Filecoin](/docs/protocols-networks) Hyperspace testnet support for elastic and dedicated nodes.
* **APIs**. 5 [Solana APIs](/reference/solana-getting-started) enabled on devnet.
* **Documentation**. Solana API reference updates.
* **Protocols**. [Aptos](/docs/protocols-networks) support.
* **Documentation**. Aptos.
* **Crypto payments**. In addition to the previously available cryptocurrencies, you can now [top up your balance](/docs/manage-your-billing) with Polygon MATIC.
* **Protocols**. [Gnosis Chain Chiado](/docs/protocols-networks) testnet support for elastic and dedicated nodes.
* **Protocols**. [Gnosis Chain clients](/docs/protocols-clients) updated for The Merge.
* **Documentation**
  * Gnosis Chain API [reference](/reference/gnosis-getting-started) update.
  * Debug and trace methods added to the Ethereum API [reference](/reference/ethereum-debug-trace-rpc-methods).
  * Chainstack [Subgraphs](/docs/deploy-a-subgraph).
* **Documentation**. Solana API [reference](/reference/solana-getting-started) update.
* **Protocols**. [StarkNet](/docs/protocols-networks) testnet2 (Goerli2) support for elastic and dedicated nodes.
* **Support levels management**. You can now change a support level for your organization on the **Settings** > **Billing** page.
* **Documentation**. Learn how to [change your support level or subscription plan](/docs/manage-your-billing#manage-your-organization-subscription-plan-and-support-level).
* **Protocols**. [Aurora](/docs/protocols-networks) support for dedicated nodes.
* **Documentation**. A simple on-chain governance tutorial for [Aurora](/docs/aurora-tutorial-simple-on-chain-governance-contract-with-remix-and-openzeppelin-wizard).
* **Protocols**. [Solana](/docs/protocols-networks) elastic and dedicated full nodes now supported on both mainnet and devnet when using Chainstack Cloud in the Ashburn, USA region.
* **Protocols**. [Cronos](/docs/protocols-networks) support.
* **Documentation**. Cronos.
* **Clouds**. You can now deploy your elastic Solana nodes using the new hosting option, Chainstack Cloud, in the Netherlands region.
* **Protocols**. Ethereum [Sepolia testnet support](/docs/protocols-networks) for full nodes.
* **Documentation**. Solana API [reference](/reference/solana-getting-started).
* **APIs**
  * [Debug and trace APIs](/reference/polygon-debug-trace-rpc-methods) for elastic Polygon archive nodes.
  * [Debug and trace APIs](/reference/avalanche-debug-trace-rpc-methods) for elastic Avalanche archive nodes.
* **Protocols**. [Ethereum](/docs/protocols-networks) consensus layer Beacon Chain support due to the Merge.
* **APIs**. [Ethereum consensus layer Beacon Chain API](/reference/ethereum-getting-started).
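The Beacon Chain entries above refer to the standard Eth Beacon Node REST API rather than JSON-RPC. A minimal sketch of fetching the latest beacon block header; the endpoint URL is a placeholder for your own node's consensus-layer URL:

```python
import json
import urllib.request

# Placeholder — substitute your Chainstack node's consensus-layer HTTPS endpoint.
BEACON_ENDPOINT = "https://your-node.example.com/your-key"


def beacon_header_url(endpoint, block_id="head"):
    """Build the standard Beacon API URL for a block header.

    block_id can be "head", "finalized", a slot number, or a block root.
    """
    return f"{endpoint}/eth/v1/beacon/headers/{block_id}"


def get_beacon_header(endpoint, block_id="head"):
    """GET the header and return the 'data' object from the response."""
    with urllib.request.urlopen(beacon_header_url(endpoint, block_id)) as resp:
        return json.load(resp)["data"]


# Example usage (requires a live endpoint):
# header = get_beacon_header(BEACON_ENDPOINT)
# print(header["header"]["message"]["slot"])
```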
* **Documentation**. Beacon Chain API [reference](/reference/ethereum-getting-started) added to the Ethereum API reference.
* **Protocols**. [Aurora](/docs/protocols-networks) support.
* **Documentation**. Aurora operations.
* **Protocols**. [Arbitrum](/docs/protocols-networks) support.
* **Documentation**. Arbitrum operations, [simple L1 to L2 messaging tutorial](/docs/arbitrum-tutorial-l1-to-l2-messaging-smart-contract).
* **Crypto payments**. In addition to the previously available BTC, ETH, and USDC, you can now [top up your balance](/docs/manage-your-billing) with Dogecoin, Litecoin, Dai, Bitcoin Cash, ApeCoin, SHIBA INU, and USDT.
* **Protocols**. [Fuse](/docs/protocols-networks) support.
* **Documentation**. Fuse operations, [simple multi-signature contract tutorial with Hardhat](/docs/fuse-tutorial-simple-multisig-contract-with-hardhat).
* **Protocols**. [Gnosis Chain](/docs/protocols-networks) support.
* **Documentation**. Gnosis Chain operations, [simple soulbound token tutorial](/docs/gnosis-tutorial-simple-soulbound-token-with-remix-and-openzeppelin).
* **Protocols**. [NEAR](/docs/protocols-networks) support.
* **Documentation**. NEAR operations, [simple metamorphic contract tutorial](/docs/near-tutorial-creating-and-upgrading-a-simple-message-contract).
* **Protocols**
  * Harmony migrated from testnet to [devnet](/docs/protocols-networks) for a better experience.
  * Tezos migrated from Hangzhounet to [Ithacanet](/docs/protocols-networks).
* **Pricing**. An update to the pricing plans with new features and increased included requests. See [the blog post with an overview](https://chainstack.com/pricing-update-2022/).
* **Node naming**. Shared nodes are now elastic nodes to reflect the underlying architecture and scalable infrastructure.
* **APIs**
  * Fast transaction propagation with the [Warp transactions](/docs/warp-transactions) feature on Ethereum, Polygon, and BNB Smart Chain.
  * [Miner Extractable Value (MEV) API](/changelog/chainstack-updates-april-12-2023-1) for elastic Ethereum nodes for mainnet and Goerli testnet.
  * [Debug and trace APIs](/docs/debug-and-trace-apis) for elastic Ethereum archive nodes.
* **Protocols**
  * Ethereum [Erigon support](/docs/protocols-clients) for dedicated archive nodes.
  * StarkNet [testnet](/docs/protocols-networks) support.
* **Protocols**. StarkNet support.
* **Documentation**. StarkNet general description, operations, [simple L1-to-L2 messaging tutorial](/docs/starknet-tutorial-an-nft-contract-with-nile-and-l1-l2-reputation-messaging).
* **Clouds**. You can now deploy your nodes in the [Amazon Web Services Tokyo](https://support.chainstack.com/hc/en-us/articles/360024804711-Data-center-locations) region.
* **Protocols**. Ethereum Goerli support.
* **Role management**
  * You can now [assign roles](/docs/manage-your-organization#invite-a-user-to-the-organization) to your organization's users and [change roles](/docs/manage-your-organization#change-a-user-role-in-the-organization) of the existing users.
  * There are now three roles: [Admin, Editor, Viewer](/docs/manage-your-organization#users-and-their-roles).
* **Protocols**. Solana support. You can now use Solana nodes on mainnet and devnet.
* **Documentation**. Solana general description, operations, token vesting tutorial.
* **Public API**. You can now use the Chainstack API to manage your Solana nodes.
* **Crypto payments**. You can now top up your balance with crypto and Chainstack will automatically deduct all costs from your balance.
* **Protocols**
  * Avalanche support. You can now deploy full and archive Avalanche nodes with X-Chain and C-Chain endpoints on mainnet and Fuji testnet.
  * Tezos Hangzhounet testnet support. The Florencenet testnet support is deprecated.
* **Documentation**. Avalanche general description, operations, flash loan tutorial.
* **Public API**. You can now use the Chainstack API to manage your Avalanche, Fantom, and Tezos nodes.
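The **Public API** entries above refer to the Chainstack platform API for managing resources such as nodes. As a rough sketch, listing your nodes with an API key might look like the following; the base URL, `/nodes/` path, and `results` response field are assumptions here — check the platform API reference for the exact shapes:

```python
import json
import urllib.request

# Assumed platform API base URL — verify against the Chainstack API reference.
API_BASE = "https://api.chainstack.com/v1"


def build_list_nodes_request(api_key, base=API_BASE):
    """Build an authenticated GET request for the node list endpoint."""
    return urllib.request.Request(
        f"{base}/nodes/",
        headers={"Authorization": f"Bearer {api_key}"},
    )


def list_nodes(api_key):
    """Fetch and return the node list (assumes a paginated 'results' field)."""
    with urllib.request.urlopen(build_list_nodes_request(api_key)) as resp:
        return json.load(resp)["results"]
```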
* **Billing**. Removed the free usage period on the Developer plan. The Developer plan keeps the other usage features, including the free 3M requests.
* **Protocols**
  * Fantom support. You can now deploy full and archive Fantom nodes on mainnet and testnet.
  * Tezos dedicated node support. You can now deploy dedicated full and archive Tezos nodes.
* **Documentation**. Fantom general description, operations, ERC-721 contract tutorial.
* **Node metrics**. Added node requests metrics for public chain protocols.
* **Protocols**. Tezos support. You can now deploy full and archive Tezos nodes on mainnet and the Florence testnet.
* **Node details**. Node access and credentials for public chain and Quorum nodes now include key-protected endpoints.
* **Clouds**. You can now send requests for dedicated public chain node deployment in Amazon Web Services.
* **Billing**. You can now upgrade to the Enterprise plan from your billing settings.
* **Pricing**. Free usage period on the Developer plan reduced from 14 to 7 days.
* **Documentation**. Tezos general description, operations, fund contract tutorial.
* **Protocols**. Binance Smart Chain support. You can now deploy full Binance Smart Chain nodes on mainnet and testnet.
* **2FA**. You can now add an extra layer of security to your Chainstack account with two-factor authentication (2FA). Once enabled, you will be prompted to enter a code generated by your mobile device each time you log in.
* **Documentation**. Binance Smart Chain general description, operations, BEP-1155 contract tutorial.
* **Protocols**. Polygon support. You can now deploy full and archive Polygon nodes on mainnet and testnet.
* **Documentation**. Polygon general description, operations, smart contract bridging tutorial.
* **Pricing**. Flexible pay-per-request pricing for shared nodes on all [subscription plans](https://chainstack.com/pricing/).
* **Billing**. Payments in cryptocurrency for customers on the [Enterprise plan](https://chainstack.com/pricing/).
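The 2FA entry above describes codes generated by a mobile authenticator app. Such apps typically implement standard RFC 6238 TOTP; the sketch below is purely illustrative of how those codes are derived, not Chainstack's server-side implementation:

```python
import base64
import hmac
import struct
import time


def totp(secret_b32, for_time=None, digits=6, period=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter, dynamically truncated."""
    key = base64.b32decode(secret_b32, casefold=True)
    now = time.time() if for_time is None else for_time
    counter = int(now // period)
    # HOTP (RFC 4226) over the 8-byte big-endian counter.
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)
```

With the RFC 6238 test secret (ASCII `12345678901234567890`, base32-encoded) and time 59, this yields the reference value `287082`.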
* **Public API**. You can now use the Chainstack API to manage your Corda applications. See also [Chainstack 2.3: Faster and better public blockchain APIs](https://chainstack.com/chainstack-2-3-faster-and-better-public-blockchain-apis/).
* **Protocols**. Harmony support. You can now use shared and dedicated Harmony Shard 0 nodes on mainnet and devnet.
* **Documentation**. Harmony general description, operations, simple metaverse tutorial.
* **Subscription downgrading**. You can now downgrade your plan on the **Settings** > **Billing** page.
* **Private hosting**. You can now use your own Amazon Elastic Kubernetes Service (EKS) infrastructure to deploy blockchain nodes and networks on Chainstack.
* **Clouds**. You can now deploy your nodes and networks in the [Amazon Web Services US East](https://support.chainstack.com/hc/en-us/articles/360024804711-Data-center-locations) region.
* **Identity management**. You can now securely manage your organization's identity for Hyperledger Fabric networks.
* **Billing**. You can now reattempt past due payments through a billing card change or directly in the activity log. The activity log also provides more information on your billing events.
* **Protocols**
  * Ethereum Rinkeby support.
  * Quorum 2.7.0 support.
* **GraphQL**. You can now query dedicated Ethereum full nodes and archive nodes with GraphQL.
* **Public API**. You can now use the Chainstack API to manage your resources.
* **Protocols**
  * Corda 4.5 support.
  * Hyperledger Fabric 2.2 support.
* **Node management**. You can now stop and start your nodes to save on your usage costs.
* **Service nodes**. You can now access in the UI the service nodes deployed with your consortium networks.
* **Clouds**. You can now deploy your nodes and networks in the [Microsoft Azure UK South](https://support.chainstack.com/hc/en-us/articles/360024804711-Data-center-locations) region.
* **Pricing**. No more user limit on all [pricing plans](https://chainstack.com/pricing/).
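The GraphQL entry above uses Geth's GraphQL schema. A minimal sketch of querying the latest block number and hash; the endpoint URL (and its `/graphql` path) is a placeholder assumption — use the GraphQL URL shown in your node's credentials:

```python
import json
import urllib.request

# Placeholder — a dedicated Ethereum node's GraphQL endpoint (path assumed).
GRAPHQL_ENDPOINT = "https://your-node.example.com/your-key/graphql"

# Minimal query against Geth's GraphQL schema: latest block number and hash.
BLOCK_QUERY = "{ block { number hash } }"


def build_graphql_request(endpoint, query):
    """Wrap a GraphQL query in the standard {'query': ...} JSON envelope."""
    return urllib.request.Request(
        endpoint,
        data=json.dumps({"query": query}).encode(),
        headers={"Content-Type": "application/json"},
    )


def latest_block(endpoint=GRAPHQL_ENDPOINT):
    """POST the query and return the 'block' object from the response."""
    with urllib.request.urlopen(build_graphql_request(endpoint, BLOCK_QUERY)) as resp:
        return json.load(resp)["data"]["block"]
```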
* **Protocols**. Bitcoin testnet support.
* **Pricing**
  * New Growth pricing plan.
  * Reduced fixed plan costs.
  * Reduced usage costs.
  * Uniform usage rates for all cloud providers. See the [new pricing plans](https://chainstack.com/pricing/) and a [blog post introducing the new pricing](https://chainstack.com/new-growth-tier-and-reduced-costs-for-all-plans/).
* **Protocols**
  * Corda 4.4 support.
  * Quorum 2.6.0 support. [Tessera](https://docs.tessera.consensys.net/) 0.10.5 support.
* **Node logs**. You can now access the logs for your dedicated nodes in the new node log viewer, with the ability to browse by date and container.
* **Node resources**. You can now view the resources that have been dynamically allocated to each of your dedicated nodes.
* **CorDapp notifications**. You will now receive notifications based on the successful or failed result of CorDapp installations or removals initiated through Chainstack's CorDapp management interface.
* **UI**. You can now view hosting information directly on the node list for each network.
* **Protocols**
  * Corda Network and Corda Pre-Production Network support. You can now join the Corda production and pre-production networks. See also Join a public network.
  * Hyperledger Fabric 2.1.0 support.
* **Clouds**. You can now deploy your nodes and networks in the Amazon Web Services US West region.
* **Vault**. You can now store your Corda identity key material in a secure vault.
* **Identity management**. You can now securely manage your organization identity for the Corda networks.
* **Protocols**. Hyperledger Fabric support. You can deploy a multi-org Hyperledger Fabric v2.0.1 network, complete with a Raft ordering service, new chaincode lifecycle capabilities, and a network explorer.
* **UI**. Updated cloud provider selector to provide easier visibility of available hosting options.
* **Documentation**. Hyperledger Fabric general description, operations, chaincode tutorial.
* **Protocols**
  * Ethereum archive node support. Deploy a dedicated Ethereum archive node. Connect to an elastic archive node for free on the Business and Enterprise tiers. Geth 1.9.10 support.
  * Quorum transaction manager Tessera 0.10.3 support.
* **UI**. In-line edit button for all resources to improve access to quick actions.
* **Documentation**. Ethereum modes section to help in selecting whether to deploy a full or archive Ethereum node.
* **Protocols**
  * Corda 4.3 support. You can also now install CorDapps containing workflows and contracts in a single JAR file.
  * Quorum 2.4.0 support. Learner node role support. [Tessera](https://docs.tessera.consensys.net/) 0.10.2 support.
  * MultiChain 2.0.3 support.
  * Bitcoin 0.19.0.1 support. Bitcoin nodes now run with the `txindex` flag, allowing you to query any transaction on the blockchain.
* **Documentation**. Context-sensitive documentation links in the platform UI.
* **Protocols**. Bitcoin 0.18 support. You can now deploy dedicated nodes with Bolt synchronization and get free access to elastic nodes on the mainnet.
* **Network and node statuses**. You can now see network and node statuses to know when operations are occurring on any network resource.
* **Node details**. Node access and credentials are now grouped into sections, so they're easier to read.
* **Billing**. View individual metered usage cost line items for the current billing period, updated hourly.
* **Documentation**. Bitcoin general description and Bitcoin operations.
* **Protocols**
  * Corda 4.1 support. You can now deploy a consortium network with a network map service, doorman, and notary. Dynamically add and remove participant nodes. Install and remove CorDapps.
  * Quorum 2.3.0 support. Replaced the Constellation transaction manager with Tessera.
* **User management**. Invite users into the organization.
* **Billing**. View metered usage cost for the current billing period, updated hourly.
* **Documentation**
  * Corda general description and Corda operations.
  * New tutorials: [No ticket scalping CorDapp](/docs/corda-tutorial-no-ticket-scalping-cordapp) and [Trust fund account with Remix](/docs/ethereum-tutorial-trust-fund-account-with-remix).
* **Security**. [Protected endpoints](https://chainstack.com/protected-endpoints-for-ethereum-and-quorum-nodes-on-chainstack/) added for Quorum and Ethereum nodes.
* **Deployment**. Bolt snapshots are now updated hourly so that nodes are deployed and synchronized with fresher snapshots.
* **Protocols**. Numerous stability improvements for Quorum networks and nodes.
* **Activity and events**. In-platform notifications and an activity log introduced to provide visibility into account activity.
* **Documentation**. Complete content and structure overhaul for improved access, browsing, and discovery.
* **Documentation**
  * New guide: Deploying a hybrid MultiChain network.
  * New tutorial: [Loyalty program with Truffle](/docs/quorum-tutorial-loyalty-program-with-truffle).
* **Protocols**
  * Added Ethereum Ropsten testnet support with Bolt.
  * Updated Quorum explorer from [blk-explorer-free](https://github.com/blk-io/blk-explorer-free) to [epirus-free](https://github.com/blk-io/epirus-free).
* **Node details**. Added complete details for Quorum, including default wallet private/public keys. Standardized fields for all protocols.
* **Documentation**. New tutorial: [Academic certificates with Truffle](/docs/ethereum-tutorial-academic-certificates-with-truffle).
* **Protocols**. Introduced elastic Ethereum mainnet node support.
* **Projects**. The project description field is now optional.
* **Updating and deleting resources**. You can now edit the project name and description, and the network and node names. Nodes can be deleted by the owner, networks are deleted automatically when the last node is deleted, and projects can be deleted when empty.
* **Navigation**. Updated menu with links to Documentation and [Support](https://support.chainstack.com/).
* **Support**. Added Zendesk widget.
**Protocols** * Added MultiChain 2.0 release support. * Added Quorum 2.2.3 support. **Registration and sign in**. Password recovery via email. * **Registration and sign in**. Signing up via email, password, and personal details. Signing up for a member invited to the consortium project. Email confirmation on successful registration. Email verification. * **Consortium project**. Wizards to create a new network and add a node to the existing network. Invitation of other organizations as members to the project via email. * **Public chain project**. Wizards to join a public network and add another node. * **Protocols** * MultiChain 2.0-beta-1 support with blockchain explorer. * Quorum 2.2.1 support with Raft and IBFT, and blk-explorer-free blockchain explorer. * Full Ethereum mainnet node deployment with Bolt rapid sync mechanism. * **Clouds**. Google Cloud Platform and Amazon Web Services in the Asia-Pacific region. * **Node details**. Default wallet private/public keys, and chain name for MultiChain. Constellation private/public keys, and network ID for Quorum. Sync mode and network ID for Ethereum. Client version for all protocols. * **Settings**. Editing personal and organization details. Changing password. * **Documentation**. Portal based on VuePress. # Chainstack updates: March 17, 2019 Source: https://docs.chainstack.com/changelog/chainstack-release-notes-mar-17-2019 * **Registration and sign in**. Signing up via email, password, and personal details. Signing up for a member invited to the consortium project. Email confirmation on successful registration. Email verification. * **Consortium project**. Wizards to create a new network and add a node to the existing network. Invitation of other organizations as members to the project via email. * **Public chain project**. Wizards to join a public network and add another node. * **Protocols** * MultiChain 2.0-beta-1 support with blockchain explorer. 
* Quorum 2.2.1 support with Raft and IBFT, and blk-explorer-free blockchain explorer. * Full Ethereum mainnet node deployment with Bolt rapid sync mechanism. * **Clouds**. Google Cloud Platform and Amazon Web Services in the Asia-Pacific region. * **Node details**. Default wallet private/public keys, and chain name for MultiChain. Constellation private/public keys, and network ID for Quorum. Sync mode and network ID for Ethereum. Client version for all protocols. * **Settings**. Editing personal and organization details. Changing password. * **Documentation**. Portal based on VuePress. # Chainstack updates: April 1, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-april-1-2025 **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) and [Dedicated Nodes](/docs/dedicated-node) in archive mode with Debug & Trace APIs for the Sonic Blaze testnet. # Chainstack updates: April 11, 2019 Source: https://docs.chainstack.com/changelog/chainstack-updates-april-11-2019 **Protocols** * Added MultiChain 2.0 release support. * Added Quorum 2.2.3 support. # Chainstack updates: April 12, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-april-12-2023 * **Services**. [Chainstack Subgraphs](/docs/subgraphs-introduction) has finished the closed beta and is now open to all our customers. * **Documentation**. We added a [series of developer tutorials](/docs/chainstack-subgraphs-tutorials) for Chainstack Subgraphs users: from a newbie to an expert. Try them out and let us know what you think. # Chainstack updates: April 12, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-april-12-2023-1 **Protocols**. Ethereum MEV API is now deprecated since it hasn't been functional since the Merge. # Chainstack updates: April 14, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-april-14-2022 * **Protocols**. Ethereum Goerli support. 
* **Role management** * You can now [assign roles](/docs/manage-your-organization#invite-a-user-to-the-organization) to your organization's users and [change roles](/docs/manage-your-organization#change-a-user-role-in-the-organization) of the existing users. * There are now three roles: [Admin, Editor, Viewer](/docs/manage-your-organization#users-and-their-roles). # Chainstack updates: April 17, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-april-17-2025 **Nodes** and **Add-ons**. You can now enable [Add-ons](/docs/add-ons): [Yellowstone gRPC Geyser plugin](/docs/yellowstone-grpc-geyser-plugin) on Solana and [Unlimited Node](/docs/unlimited-node) on any node. # Chainstack updates: April 18, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-april-18-2023 * **Services**. A revamped [Chainstack Marketplace](https://console.chainstack.com/marketplace) is now live with the Covalent and Valha applications already available for installation—boost up your DApps! * **Documentation**. The following article is added in Web3 \[De]Coded: * [Mastering JSON web tokens: How to implement secure user authentication](/docs/tutorial-mastering-jwt-how-to-implement-secure-user-authentication) # Chainstack updates: April 2, 2019 Source: https://docs.chainstack.com/changelog/chainstack-updates-april-2-2019 **Registration and sign in**. Password recovery via email. # Chainstack updates: April 22, 2020 Source: https://docs.chainstack.com/changelog/chainstack-updates-april-22-2020 * **Protocols** * Corda Network and Corda Pre-Production Network support. You can now join the Corda production and pre-production networks. See also Join a public network. * Hyperledger Fabric 2.1.0 support. * **Clouds**. You can now deploy your nodes and networks in the Amazon Web Services US West region. * **Vault**. You can now store your Corda identity key material in a secure vault. * **Identity management**. 
You can now securely manage your organization's identity for the Corda networks. # Chainstack updates: April 22, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-april-22-2024 **Global elastic nodes**. Now, you can deploy global elastic nodes in archive mode for Solana Mainnet. # Chainstack updates: April 27, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-april-27-2023 **Endpoints**. WSS endpoints are now available for nodes with Warp transactions enabled. # Chainstack updates: April 28, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-april-28-2023 **Debug and trace APIs** are now available for elastic Arbitrum archive nodes. # Chainstack updates: April 28, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-april-28-2025 * **Developer Portal** — Chainstack Developer Portal (*this very site*) migrated to [Mintlify](https://mintlify.com/) for AI-ready features, an open-source repository, and snappier performance overall. You can now consume our docs as a human and as an agent through the [open-source repository](https://github.com/chainstack/dev-portal), [llms-full.txt](https://docs.chainstack.com/llms-full.txt), or an MCP server. See the repository's README for all the details. # Chainstack updates: April 4, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-april-4-2024 **Global elastic nodes**. Now, you can deploy global elastic nodes in archive mode with debug & trace APIs for Arbitrum One Mainnet. # Chainstack updates: April 6, 2021 Source: https://docs.chainstack.com/changelog/chainstack-updates-april-6-2020 * **Pricing**. Flexible pay-per-request pricing for shared nodes on all [subscription plans](https://chainstack.com/pricing/). * **Billing**. Payments in cryptocurrency for customers on the [Enterprise plan](https://chainstack.com/pricing/). * **Public API**. You can now use the Chainstack API to manage your Corda applications. 
See also [Chainstack 2.3: Faster and better public blockchain APIs](https://chainstack.com/chainstack-2-3-faster-and-better-public-blockchain-apis/). # Chainstack updates: August 10, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-august-10-2023 * **Protocols**. Scroll Sepolia Testnet support. * **Global elastic node**. Now you can deploy global elastic nodes for Avalanche and Scroll Sepolia Testnet. # Chainstack updates: August 15, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-august-15-2024 **Global elastic nodes**. Debug & trace on Fantom and `blockSubscribe` on Solana. * Fantom — you can now deploy global elastic nodes with debug & trace APIs for Fantom Mainnet. See the [debug & trace API reference](/reference/debug_traceblockbyhash-fantom-chain). * Solana — you can now use `blockSubscribe` on the Solana Mainnet & Devnet. See the [blockSubscribe API reference](/reference/blocksubscribe-solana). # Chainstack updates: August 17, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-august-17-2022 * **Protocols**. [NEAR](/docs/protocols-networks) support. * **Documentation**. NEAR operations, [simple metamorphic contract tutorial](/docs/near-tutorial-creating-and-upgrading-a-simple-message-contract). # Chainstack updates: August 18, 2021 Source: https://docs.chainstack.com/changelog/chainstack-updates-august-18-2021 * **Protocols**. Tezos support. You can now deploy full and archive Tezos nodes on mainnet and Florence testnet. * **Node details**. Node access and credentials for public chain and Quorum nodes now include key-protected endpoints. * **Clouds**. You can now send requests for dedicated public chain node deployment on Amazon Web Services. * **Billing**. You can now upgrade to the Enterprise plan from your billing settings. * **Pricing**. Free usage period on the Developer plan reduced from 14 to 7 days. * **Documentation**. Tezos general description, operations, fund contract tutorial. 
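The `blockSubscribe` method from the August 15, 2024 entry above is a WebSocket subscription, so the client sends a JSON-RPC message over a WSS connection. A minimal sketch of assembling that message with only the standard library; the filter shapes follow the public Solana RPC spec, and any endpoint or account key you plug in is a placeholder, not a Chainstack-specific value:

```python
import json
from typing import Optional, Union


def build_block_subscribe(mentions: Optional[str] = None) -> dict:
    """Assemble a Solana blockSubscribe JSON-RPC message for a WSS connection.

    `mentions` is an optional account or program public key; when omitted,
    the subscription covers all blocks ("all" filter).
    """
    block_filter: Union[str, dict] = (
        {"mentionsAccountOrProgram": mentions} if mentions else "all"
    )
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "blockSubscribe",
        "params": [block_filter, {"commitment": "confirmed", "encoding": "json"}],
    }


# Serialize and send over your node's WSS endpoint, e.g. with the
# `websockets` package: await ws.send(message)
message = json.dumps(build_block_subscribe())
print(message)
```

Each notification then arrives as a `blockNotification` message on the same connection until you call `blockUnsubscribe`.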
# Chainstack updates: August 20, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-august-20-2024 Trader nodes on Solana and 150+ cryptocurrencies to top up your balance. * **Trader nodes**. You can now send high-speed transactions on Solana. See [Trader nodes](/docs/warp-transactions). * **Billing**. You can now top up your balance with 150+ cryptocurrencies using NOWPayments as the payment provider. See [Manage your billing](/docs/manage-your-billing). # Chainstack updates: August 23, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-august-23-2022 * **Protocols**. [Gnosis Chain](/docs/protocols-networks) support. * **Documentation**. Gnosis Chain operations, [simple soulbound token tutorial](/docs/gnosis-tutorial-simple-soulbound-token-with-remix-and-openzeppelin). # Chainstack updates: August 26, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-august-26-2024 **Protocols**. Now, Polygon zkEVM supports connecting over WebSocket. # Chainstack updates: August 29, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-august-29-2022 * **Protocols**. [Fuse](/docs/protocols-networks) support. * **Documentation**. Fuse operations, [simple multi-signature contract tutorial with Hardhat](/docs/fuse-tutorial-simple-multisig-contract-with-hardhat). # Chainstack updates: August 31, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-august-31-2022 * **Protocols**. [Arbitrum](/docs/protocols-networks) support. * **Documentation**. Arbitrum operations, [simple L1 to L2 messaging tutorial](/docs/arbitrum-tutorial-l1-to-l2-messaging-smart-contract). * **Crypto payments**. In addition to the previously available BTC, ETH, and USDC, you can now [top up your balance](/docs/manage-your-billing) with Dogecoin, Litecoin, Dai, Bitcoin Cash, ApeCoin, SHIBA INU, and USDT. 
# Chainstack updates: August 4, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-august-4-2023 **Protocols**. Base Mainnet support. # Chainstack updates: December 1, 2021 Source: https://docs.chainstack.com/changelog/chainstack-updates-december-1-2021 * **Protocols**. * Fantom support. You can now deploy full and archive Fantom nodes on mainnet and testnet. * Tezos dedicated node support. You can now deploy dedicated full and archive Tezos nodes. * **Documentation**. Fantom general description, operations, ERC-721 contract tutorial. * **Node metrics**. Added node requests metrics for public chain protocols. # Chainstack updates: December 15, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-december-15-2022 * **Protocols**. [Gnosis Chain Chiado](/docs/protocols-networks) testnet support for elastic and dedicated nodes. # Chainstack updates: December 15, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-december-15-2022-1 * **Crypto payments**. In addition to the previously available cryptocurrencies, you can now [top up your balance](/docs/manage-your-billing) with Polygon MATIC. # Chainstack updates: December 2, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-december-2-2022 * **Protocols**. [Gnosis Chain clients](/docs/protocols-clients) updated for The Merge. * **Documentation**. * Gnosis Chain API [reference](/reference/gnosis-getting-started) update. * Debug and trace methods added to the Ethereum API [reference](/reference/ethereum-debug-trace-rpc-methods). * Chainstack [Subgraphs](/docs/deploy-a-subgraph). # Chainstack updates: December 24, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-december-24-2024 **Nodes**. The Fantom nodes are now running on the Sonic client. # Chainstack updates: December 30, 2021 Source: https://docs.chainstack.com/changelog/chainstack-updates-december-30-2021 * **Protocols**. * Avalanche support. 
You can now deploy full and archive Avalanche nodes with X-Chain and C-Chain endpoints on mainnet and Fuji testnet. * Tezos Hangzhounet testnet support. The Florencenet testnet support is deprecated. * **Documentation**. Avalanche general description, operations, flash loan tutorial. * **Public API**. You can now use the Chainstack API to manage your Avalanche, Fantom, and Tezos nodes. * **Billing**. Removed free usage period on the Developer plan. The Developer plan keeps other usage features, including the free 3M requests. # Chainstack updates: December 31, 2019 Source: https://docs.chainstack.com/changelog/chainstack-updates-december-31-2019 * **Protocols** * Corda 4.3 support. You can also now install CorDapps containing workflows and contracts in a single JAR file. * Quorum 2.4.0 support. Learner node role support. [Tessera](https://docs.tessera.consensys.net/) 0.10.2 support. * MultiChain 2.0.3 support. * Bitcoin 0.19.0.1 support. Bitcoin nodes now run with the `txindex` flag allowing you to query any transaction on the blockchain. * **Documentation**. Context-sensitive documentation links in the platform UI. # Chainstack updates: December 6, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-december-6-2023 **Billing**. No credit card required on sign-up. Sign up and get the free [Developer plan](https://chainstack.com/pricing/) with 3 million requests monthly. # Chainstack updates: February 10, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-10-2022 * **Protocols**. Solana support. You can now use Solana nodes on mainnet and devnet. * **Documentation**. Solana general description, operations, token vesting tutorial. * **Public API**. You can now use the Chainstack API to manage your Solana nodes. * **Crypto payments**. You can now top up your balance with crypto and Chainstack will automatically deduct all costs from your balance. 
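The Solana nodes from the February 10, 2022 entry above speak standard JSON-RPC over HTTP. A minimal sketch of building a `getSlot` request with only the standard library; the endpoint URL is a placeholder, not a real Chainstack value:

```python
import json
from typing import Optional

# Placeholder endpoint - substitute your own node's HTTPS URL.
ENDPOINT = "https://solana-mainnet.example.com/YOUR_KEY"


def build_rpc(method: str, params: Optional[list] = None) -> dict:
    """Assemble a Solana JSON-RPC request body."""
    return {"jsonrpc": "2.0", "id": 1, "method": method, "params": params or []}


# getSlot returns the latest slot the node has processed.
payload = build_rpc("getSlot")
print(json.dumps(payload))
# POST to ENDPOINT with any HTTP client, e.g.:
#   requests.post(ENDPOINT, json=payload).json()["result"]
```

The same builder works for any other read method — swap in `getBalance`, `getBlockHeight`, and so on, with the appropriate `params` list.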
# Chainstack updates: February 12, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-12-2025 **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) in archive mode with debug & trace APIs for ZKsync Era mainnet. # Chainstack updates: February 13, 2020 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-13-2020 * **Protocols** * Ethereum archive node support. Deploy a dedicated Ethereum archive node. Connect to an elastic archive node for free on Business and Enterprise tiers. Geth 1.9.10 support. * Quorum transaction manager Tessera 0.10.3 support. * **UI**. In-line edit button for all resources to improve access to quick actions. * **Documentation**. Ethereum modes section to help in selecting whether to deploy a full or archive Ethereum node. # Chainstack updates: February 14, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-14-2025 **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) for Bitcoin mainnet. # Chainstack updates: February 15, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-15-2023 * **Tools**. You can now add the node endpoint with the **Add to MetaMask** button on the [node access page](/docs/manage-your-node#view-node-access-and-credentials). # Chainstack updates: February 18, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-18-2025 **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) for Bitcoin testnet. # Chainstack updates: February 2, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-2-2023 * **Protocols**. [Optimism](/docs/protocols-networks) Goerli testnet support for elastic and dedicated nodes. * **Documentation**. # Chainstack updates: February 20, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-20-2023 * **Clouds**. 
You can now deploy your elastic BNB Smart Chain full nodes in the Virtuozzo Amsterdam region. # Chainstack updates: February 21, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-21-2024 **Subgraphs**. Preconfigured and deployed subgraphs called Data APIs released. # Chainstack updates: February 21, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-21-2025 **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) in archive mode for Cronos mainnet. # Chainstack updates: February 22, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-22-2023 * **Protocols**. Harmony migrated from devnet to [testnet](/docs/protocols-networks) for a better experience. * **Billing**. You can now settle a failed payment manually, either by topping up your crypto balance or by paying directly with your credit card. # Chainstack updates: February 23, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-23-2024 **Global elastic nodes.** Now, you can deploy global elastic nodes in archive mode with debug & trace APIs for Ronin Mainnet. The node client is Geth-based. # Chainstack updates: February 26, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-26-2024 **Global elastic nodes.** Now, you can deploy global elastic nodes in archive mode with debug & trace APIs for Ethereum Mainnet. # Chainstack updates: February 27, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-27-2024 **Global elastic nodes**. Now, you can deploy global elastic nodes in archive mode with debug & trace APIs for Polygon Mainnet. # Chainstack updates: February 28, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-28-2024 **Global elastic nodes**. Now, you can deploy global elastic nodes in archive mode with debug & trace APIs for BNB Smart Chain Mainnet. 
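The debug & trace entries above all expose the Geth-style `debug` namespace. A minimal sketch of assembling a `debug_traceTransaction` call with the standard `callTracer`, using only Python's standard library; the endpoint URL and transaction hash are placeholders, not real values:

```python
import json

# Placeholder values - substitute your node endpoint and a real tx hash.
ENDPOINT = "https://bsc-mainnet.example.com/YOUR_KEY"
TX_HASH = "0x" + "ab" * 32


def build_trace_request(tx_hash: str, tracer: str = "callTracer") -> dict:
    """Assemble a Geth-style debug_traceTransaction JSON-RPC payload."""
    return {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "debug_traceTransaction",
        "params": [tx_hash, {"tracer": tracer}],
    }


payload = build_trace_request(TX_HASH)
print(json.dumps(payload))
# Send with any HTTP client, e.g.:
#   requests.post(ENDPOINT, json=payload).json()["result"]
```

The `callTracer` returns the transaction's internal call tree; other built-in Geth tracers such as `prestateTracer` can be passed the same way.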
# Chainstack updates: February 5, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-5-2025 **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) in archive mode with debug & trace APIs for Ronin Saigon testnet. # Chainstack updates: February 8, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-8-2023 * **Protocols**. Gnosis Chain Sokol testnet is now deprecated. For development purposes, use [Gnosis Chain Chiado](/docs/protocols-networks) testnet. # Chainstack updates: February 9, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-9-2023 * **Billing**. The refined billing page now shows: * Failed payment notification * Past due invoices awaiting payments # Chainstack updates: February 9, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-february-9-2024 **Global elastic nodes**. Now, you can deploy global elastic nodes for Optimism Mainnet. The node client is op-erigon. You can now enjoy the debug & trace APIs out of the box. For the method run-down, see [Optimism API reference](/reference/optimism-api-reference). # Chainstack updates: January 10, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-january-10-2023 * **Protocols**. [Aptos](/docs/protocols-networks) support. * **Documentation**. Aptos # Chainstack updates: January 17, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-january-17-2023 * **APIs**. 5 [Solana APIs](/reference/solana-getting-started) enabled on devnet. * **Documentation**. Solana API reference updates. # Chainstack updates: January 17, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-january-17-2025 **Platform**. Improved [node requests metrics](/docs/manage-your-node#view-node-requests-metrics) — the data is now granular to 1 minute; the timeframes are now 1 hour, 6 hours, 12 hours, 24 hours, 7 days. 
# Chainstack updates: January 18, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-january-18-2023 **Global elastic nodes**. Now you can deploy global elastic nodes for Base Sepolia Testnet, Optimism Sepolia Testnet, zkSync Era Sepolia Testnet. # Chainstack updates: January 19, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-january-19-2023 * **Protocols**. [Filecoin](/docs/protocols-networks) Hyperspace testnet support for elastic and dedicated nodes. * **Documentation**. # Chainstack updates: January 22, 2021 Source: https://docs.chainstack.com/changelog/chainstack-updates-january-22-2020 * **Private hosting**. You can now use your own Amazon Elastic Kubernetes Service (EKS) infrastructure to deploy blockchain nodes and networks on Chainstack. * **Clouds**. You can now deploy your nodes and networks in the [Amazon Web Services US East](https://support.chainstack.com/hc/en-us/articles/360024804711-Data-center-locations) region. * **Identity management**. You can now securely manage your organization's identity for Hyperledger Fabric networks. # Chainstack updates: January 22, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-january-22-2025 * **Pricing**. New Pro plan introduced. See [Pricing](https://chainstack.com/pricing). * **Protocols**. All Tezos networks are now deprecated. # Chainstack updates: January 25, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-january-25-2024 **Global elastic nodes**. Now, you can deploy global elastic nodes for Starknet Sepolia Testnet. # Chainstack updates: January 30, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-january-30-2023 * **Billing**. 
The refined billing page now shows: * An overview of your subscription plan and support level * A detailed **Usage** table * A list of crypto payments and downloadable invoices # Chainstack updates: January 30, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-january-30-2024 **Global elastic nodes**. Now, you can deploy global elastic nodes for Base Mainnet. # Chainstack updates: January 7, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-january-7-2025 **Nodes**. Now you can toggle MEV protection for your transactions on the Ethereum mainnet nodes. See [MEV protection](/docs/mev-protection). # Chainstack updates: January 9, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-january-9-2025 **Nodes**. Enjoy a new state-of-the-art node selection wizard and easily select the node type that best matches your needs: [Global Node](/docs/global-elastic-node), [Unlimited Node](/docs/unlimited-node), [Trader Node](/docs/trader-node), or a [Dedicated Node](/docs/dedicated-node). # Chainstack updates: July 1, 2019 Source: https://docs.chainstack.com/changelog/chainstack-updates-july-1-2019 **Documentation** * New guide: Deploying a hybrid MultiChain network. * New tutorial: [Loyalty program with Truffle](/docs/quorum-tutorial-loyalty-program-with-truffle). # Chainstack updates: July 10, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-july-10-2023 **Protocols**. zkSync Era Goerli Testnet support for elastic and dedicated nodes. # Chainstack updates: July 13, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-july-13-2023 **Chainstack Subgraphs**. Elastic indexers are now available for Avalanche, Fantom, and Gnosis Chain mainnets. # Chainstack updates: July 14, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-july-14-2023 **Global elastic node**. Now you can deploy global elastic nodes for Arbitrum and Fantom. 
# Chainstack updates: July 18, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-july-18-2023 * **Protocols**. zkSync Era Mainnet support. * **Billing**. When changing your subscription plan, you can now use promo codes for discounts on a plan's regular price. # Chainstack updates: July 20, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-july-20-2024 **Protocols**. Now, you can deploy global elastic nodes for opBNB Mainnet. # Chainstack updates: July 27, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-july-27-2023 **Protocols**. Base Goerli Testnet support for elastic nodes. # Chainstack updates: July 28, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-july-28-2023 **Global elastic node**. Now you can deploy global elastic nodes for Solana. # Chainstack updates: July 4, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-july-4-2023 **Global elastic node**. Now you can deploy global elastic nodes for Polygon and BNB Smart Chain. # Chainstack updates: July 7, 2020 Source: https://docs.chainstack.com/changelog/chainstack-updates-july-7-2020 * **Protocols**. Bitcoin testnet support. * **Pricing** * New Growth pricing plan. * Reduced fixed plan costs. * Reduced usage costs. * Uniform usage rates for all cloud providers. See the [new pricing plans](https://chainstack.com/pricing/) and a [blog post introducing the new pricing](https://chainstack.com/new-growth-tier-and-reduced-costs-for-all-plans/). # Chainstack updates: August 31, 2020 Source: https://docs.chainstack.com/changelog/chainstack-updates-july-7-2020-1 * **Public API**. You can now use the Chainstack API to manage your resources. * **Protocols** * Corda 4.5 support. * Hyperledger Fabric 2.2 support. * **Node management**. You can now stop and start your nodes to save your usage costs. * **Service nodes**. You can now access the service nodes deployed with your consortium networks in the UI. * **Clouds**. 
You can now deploy your nodes and networks in the [Microsoft Azure UK South](https://support.chainstack.com/hc/en-us/articles/360024804711-Data-center-locations) region. * **Pricing**. No more user limit on all [pricing plans](https://chainstack.com/pricing/). # Chainstack updates: July 7, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-july-7-2022 * **Protocols**. * Harmony migrated from testnet to [devnet](/docs/protocols-networks) for a better experience. * Tezos migrated from Hangzhounet to [Ithacanet](/docs/protocols-networks). * **Pricing**. An update to the pricing plans with new features and increased included requests. See [the blog post with an overview](https://chainstack.com/pricing-update-2022/). * **Node naming**. Shared nodes are now elastic nodes to reflect the underlying architecture and scalable infrastructure. * **APIs**. * Fast transaction propagation with the [Warp transactions](/docs/warp-transactions) feature on Ethereum, Polygon, and BNB Smart Chain. * [Miner Extractable Value (MEV) API](/changelog/chainstack-updates-april-12-2023-1) for elastic Ethereum nodes for mainnet and Goerli testnet. * [Debug and trace APIs](/docs/debug-and-trace-apis) for elastic Ethereum archive nodes. # Chainstack updates: June 1, 2021 Source: https://docs.chainstack.com/changelog/chainstack-updates-june-1-2021 * **Protocols**. Binance Smart Chain support. You can now deploy full Binance Smart Chain nodes on mainnet and testnet. * **2FA**. You can now add an extra layer of security to your Chainstack account with two-factor authentication (2FA). Once enabled, you will be prompted to enter a code generated by your mobile device each time you log in. * **Documentation**. Binance Smart Chain general description, operations, BEP-1155 contract tutorial. # Chainstack updates: June 1, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-june-1-2022 * **Protocols**. 
* Ethereum [Erigon support](/docs/protocols-clients) for dedicated archive nodes. * StarkNet [testnet](/docs/protocols-networks) support. # Chainstack updates: June 11, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-june-11-2024 **Protocols**. Now, you can deploy global elastic nodes for Blast Mainnet. # Chainstack updates: June 14, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-june-14-2023 **Networks**. [Optimism Mainnet](/docs/protocols-networks) support. # Chainstack updates: June 15, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-june-15-2023 * **Pricing** * Chainstack pricing was overhauled. Check out the changes on our [pricing page](https://chainstack.com/pricing/). And learn more about [how our pricing works](/docs/pricing-introduction). * With the pricing overhaul, we're introducing [request units](/docs/pricing-introduction#what-are-request-units). This allows us to offer a fairer and more flexible pricing structure. * **Platform**. [Global elastic nodes](/docs/global-elastic-node) are available for Ethereum Mainnet. Enjoy geo-balanced nodes with enhanced performance and reduced latency. # Chainstack updates: June 2, 2020 Source: https://docs.chainstack.com/changelog/chainstack-updates-june-2-2020 * **Protocols** * Corda 4.4 support. * Quorum 2.6.0 support. [Tessera](https://docs.tessera.consensys.net/) 0.10.5 support. * **Node logs**. You can now access the logs for your dedicated nodes in the new node log viewer, with the ability to browse by date and container. * **Node resources**. You can now view the resources that have been dynamically allocated to each of your dedicated nodes. * **CorDapp notifications**. You will now receive notifications based on the successful or failed result of CorDapp installations or removals initiated through Chainstack's CorDapp management interface. * **UI**. You can now view hosting information directly on the node list for each network. 
# Chainstack updates: June 21, 2019 Source: https://docs.chainstack.com/changelog/chainstack-updates-june-21-2019 * **Protocols** * Added Ethereum Ropsten testnet support with Bolt. * Updated Quorum explorer from [blk-explorer-free](https://github.com/blk-io/blk-explorer-free) to [epirus-free](https://github.com/blk-io/epirus-free). * **Node details**. Added complete details for Quorum, including default wallet private/public keys. Standardized fields for all protocols. * **Documentation**. New tutorial: [Academic certificates with Truffle](/docs/ethereum-tutorial-academic-certificates-with-truffle). # Chainstack updates: June 26, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-june-26-2024 **Global elastic nodes**. Now, you can deploy global elastic nodes for Oasis Sapphire Mainnet and Testnet. # Chainstack updates: June 27, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-june-27-2023 **Protocols**. Archive nodes are now available for Optimism Mainnet. # Chainstack updates: March 10, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-10-2022 * **Protocols**. Harmony support. You can now use shared and dedicated Harmony Shard 0 nodes on mainnet and devnet. * **Documentation**. Harmony general description, operations, simple metaverse tutorial. * **Subscription downgrading**. You now have the ability to downgrade your plan on the **Settings** > **Billing** page. # Chainstack updates: March 10, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-10-2024 **Protocols**. All NEAR nodes are now deprecated. Note that Aurora remains supported. # Chainstack updates: March 11, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-11-2025 **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) for Gnosis Chain Chiado testnet. 
# Chainstack updates: March 12, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-12-2024 **Global elastic nodes**. Now, you can deploy global elastic nodes in archive mode with debug & trace APIs for BNB Smart Chain Testnet. # Chainstack updates: March 12, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-12-2025 **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) in full mode for the [TRON](/docs/tron-tooling) mainnet. # Chainstack updates: March 13, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-13-2025 **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) for the Polygon zkEVM mainnet. # Chainstack updates: March 2, 2020 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-2-2020 * **Protocols**. Hyperledger Fabric support. You can deploy a multi-org Hyperledger Fabric v2.0.1 network, complete with Raft ordering service, new chaincode lifecycle capabilities, and a network explorer. * **UI**. Updated cloud provider selector to provide easier visibility of available hosting options. * **Documentation**. Hyperledger Fabric general description, operations, chaincode tutorial. # Chainstack updates: March 20, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-20-2025 **Protocols**. Now, you can deploy full [Global Nodes](/docs/global-elastic-node) and [Dedicated Nodes](/docs/dedicated-node) for the TRON Nile testnet. # Chainstack updates: March 21, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-21-2024 **Global elastic nodes** for Avalanche Fuji Testnet and Polygon Amoy Testnet. * Avalanche Fuji Testnet — you can now deploy global elastic nodes in archive mode with debug & trace APIs. * Polygon Amoy Testnet — you can now deploy global elastic nodes in archive mode with debug & trace APIs. 
# Chainstack updates: March 21, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-21-2025 **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) for the Aptos mainnet. # Chainstack updates: March 22, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-22-2023 * **IPFS Storage**. Chainstack launches the closed beta for the decentralized storage solution—IPFS Storage. [Reach out to us](/docs/ipfs-storage-introduction) to participate. * **Documentation**. [IPFS Storage](/docs/work-with-ipfs-storage) and [IPFS Storage API reference](/reference/chainstack-platform-api-get-pin-list). # Chainstack updates: March 24, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-24-2025 **Protocols**. Now, you can deploy full [Global Nodes](/docs/global-elastic-node) for the Aptos testnet. # Chainstack updates: March 27, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-27-2023 * **Protocols**. [Polygon zkEVM](/docs/protocols-networks) testnet support for elastic and dedicated nodes. * **Documentation** # Chainstack updates: March 29, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-29-2023 * **Protocols** * **Documentation** # Chainstack updates: March 31, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-31-2023 * **Clouds and regions**. You can now deploy your elastic Arbitrum One Mainnet nodes in the following USA regions: * Full nodes — Amazon Web Services Oregon * Archive nodes — Virtuozzo Dallas # Chainstack updates: March 4, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-4-2024 **Global elastic nodes** for Ethereum Holešky & Sepolia. * Ethereum Holešky Testnet — you can now deploy [global elastic nodes](/docs/global-elastic-node) in archive mode with debug & trace APIs. The node client is running Erigon. 
* Ethereum Sepolia Testnet — you can now deploy [global elastic nodes](/docs/global-elastic-node) in archive mode with debug & trace APIs. The node client is running Erigon. # Chainstack updates: March 4, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-4-2025 **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) in archive mode with debug & trace APIs for the Sonic mainnet. See also a fun tutorial [Sonic: Swap farming for points walkthrough in Python](/docs/sonic-swap-farming-for-points-walkthrough-in-python). # Chainstack updates: March 5, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-5-2024 **Global elastic nodes**. Now, you can deploy global elastic nodes in archive mode with debug & trace APIs for Arbitrum Sepolia Testnet. # Chainstack updates: March 5, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-5-2025 **Nodes**. Now you can toggle MEV protection for your transactions on the Binance Smart Chain mainnet nodes. See [MEV protection](/docs/mev-protection). # Chainstack updates: March 6, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-6-2023 **Protocols**. Ethereum Rinkeby and Ropsten testnets are now deprecated. For development purposes, use [Sepolia and Goerli testnets](/docs/protocols-networks). # Chainstack updates: March 7, 2025 Source: https://docs.chainstack.com/changelog/chainstack-updates-march-7-2025 **Protocols**. Now, you can deploy [Global Nodes](/docs/global-elastic-node) in archive mode with debug & trace APIs for Gnosis Chain mainnet. # Chainstack updates: May 1, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-may-1-2024 **Global elastic nodes**. Now, Scroll mainnet nodes are deployed with [debug and trace APIs](/docs/debug-and-trace-apis). # Chainstack updates: May 10, 2021 Source: https://docs.chainstack.com/changelog/chainstack-updates-may-10-2021 * **Protocols**. Polygon support.
You can now deploy full and archive Polygon nodes on mainnet and testnet. * **Documentation**. Polygon general description, operations, smart contract bridging tutorial. # Chainstack updates: May 11, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-may-11-2023 **IPFS Storage**. You can now create and manage IPFS [dedicated gateways](/docs/ipfs-storage-introduction#what-are-different-types-of-gateways) to have more control over your files. # Chainstack updates: May 16, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-may-16-2024 **Global elastic nodes**. Now, you can deploy global elastic nodes in archive mode with debug & trace APIs for Avalanche Mainnet. # Chainstack updates: May 23, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-may-23-2024 **Protocols**. Now, you can deploy global elastic nodes for Klaytn Mainnet. # Chainstack updates: May 27, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-may-27-2024 **Protocols**. Now, you can deploy global elastic nodes for Celo Mainnet and Moonbeam Mainnet. # Chainstack updates: May 29, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-may-29-2023 * **Accounts**. Email verification is now mandatory for newly created accounts. # Chainstack updates: May 30, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-may-30-2023 * **Networks**. Filecoin Hyperspace Testnet is deprecated. For development purposes, use Filecoin Calibration Testnet. # Chainstack updates: May 5, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-may-5-2022 * **Protocols**. StarkNet support. * **Documentation**. StarkNet general description, operations, [simple L1L2 messaging tutorial](/docs/starknet-tutorial-an-nft-contract-with-nile-and-l1-l2-reputation-messaging). * **Clouds**.
You can now deploy your nodes in the [Amazon Web Services Tokyo](https://support.chainstack.com/hc/en-us/articles/360024804711-Data-center-locations) region. # Chainstack updates: May 9, 2019 Source: https://docs.chainstack.com/changelog/chainstack-updates-may-9-2019 * **Protocols**. Introduced elastic Ethereum mainnet nodes support. * **Projects**. The project description field is now optional. # Chainstack updates: November 14, 2019 Source: https://docs.chainstack.com/changelog/chainstack-updates-november-14-2019 * **Protocols**. Bitcoin 0.18 support. You can now deploy dedicated nodes with Bolt synchronization and free access to elastic nodes on the mainnet. * **Network and node statuses**. You can now see network and node statuses to know when operations are occurring on any network resource. * **Node details**. Node access and credentials are now grouped into sections, so they're easier to read. * **Billing**. View individual metered usage cost line items for the current billing period, updated hourly. * **Documentation**. Bitcoin general description and Bitcoin operations. # Chainstack updates: November 16, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-november-16-2022 * **Protocols**. [StarkNet](/docs/protocols-networks) testnet2 (Goerli2) support for elastic and dedicated nodes. # Chainstack updates: November 16, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-november-16-2023 **Protocols**. You can now deploy global elastic nodes and dedicated nodes for Ronin Mainnet and Ronin Saigon Testnet. # Chainstack updates: November 2, 2020 Source: https://docs.chainstack.com/changelog/chainstack-updates-november-2-2020 * **Billing**. You can now reattempt past due payments through a billing card change or directly in the activity log. The activity log also provides more information on your billing events.
# Chainstack updates: November 20, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-november-20-2023 **Global elastic node**. Now you can deploy global elastic nodes for Starknet Mainnet. # Chainstack updates: November 23, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-november-23-2022 * **Documentation**. [Chainstack Marketplace listing guidelines](/docs/list-your-app-on-marketplace). # Chainstack updates: November 29, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-november-29-2022 * **Documentation**. Solana API [reference](/reference/solana-getting-started) update. # Chainstack updates: November 29, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-november-29-2023 **Starknet**. Starknet mainnet and testnet nodes now support WebSocket. # Chainstack updates: November 3, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-november-3-2022 * **Protocols**. [Solana](/docs/protocols-networks) elastic and dedicated full nodes now supported on both mainnet and devnet when using Chainstack Cloud in the Ashburn, USA region. # Chainstack updates: November 30, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-november-30-2023 **Billing**. Pay-as-you-go released. You can now enable it in your [Billing](https://console.chainstack.com/user/settings/billing) to keep the Chainstack services operational on reaching your plan's quota limit, or disable it to have a hard limit and stop the services on reaching the quota. # Chainstack updates: November 6, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-november-6-2024 **Nodes**. The Solana nodes on the Devnet are now running on the Agave client. See also [Solana Agave 2.0 upgrade reference](/docs/solana-agave-20-upgrade-reference). # Chainstack updates: November 7, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-november-7-2022 * **Protocols**.
[Aurora](/docs/protocols-networks) support for dedicated nodes. * **Documentation**. A simple on-chain governance tutorial for [Aurora](/docs/aurora-tutorial-simple-on-chain-governance-contract-with-remix-and-openzeppelin-wizard). # Chainstack updates: November 8, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-november-8-2022 * **Support levels management**. You can now change a support level for your organization on the **Settings** > **Billing** page. * **Documentation**. Learn how to [change your support level or subscription plan](/docs/manage-your-billing#manage-your-organization-subscription-plan-and-support-level). # Chainstack updates: October 1, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-october-1-2024 **Faucet**. The [Chainstack faucet](https://faucet.chainstack.com/) has a brand new and highly improved interface. # Chainstack updates: October 12, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-october-12-2022 * **Clouds**. You can now deploy your elastic Solana nodes using the new hosting option, Chainstack Cloud, in the Netherlands region. # Chainstack updates: October 14, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-october-14-2022 * **Protocols**. [Cronos](/docs/protocols-networks) support. * **Documentation**. Cronos. # Chainstack updates: October 17, 2019 Source: https://docs.chainstack.com/changelog/chainstack-updates-october-17-2019 * **Protocols** * Corda 4.1 support. You can now deploy a consortium network with network map service, doorman, and notary. Dynamically add and remove participant nodes. Install and remove CorDapps. * Quorum 2.3.0 support. Replaced Constellation transaction manager with Tessera. * **User management**. Invite users into the organization. * **Billing**. View metered usage cost for the current billing period, updated hourly. * **Documentation** * Corda general description and Corda operations.
* New tutorials: [No ticket scalping CorDapp](/docs/corda-tutorial-no-ticket-scalping-cordapp) and [Trust fund account with Remix](/docs/ethereum-tutorial-trust-fund-account-with-remix). # Chainstack updates: October 17, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-october-17-2023 * **Global elastic node**. Now you can deploy global elastic nodes for Scroll Mainnet and Aurora Testnet. * **Protocols**. Starknet Testnet2 is now deprecated. For development purposes, use Starknet Testnet. # Chainstack updates: October 19, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-october-19-2023 **Global elastic node**. Now you can deploy global elastic nodes for Ethereum Holešky Testnet. # Chainstack updates: October 2, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-october-2-2024 **Faucet**. You can now access the Chainstack faucet as a Telegram Mini App (TMA). Open [Chainstack Faucet TMA](https://t.me/ChainstackFaucetBot). # Chainstack updates: October 25, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-october-25-2023 **Console**. Now the Developer subscription plan has a rate limit of 30 requests per second. Note that this does not apply to other subscription plans. # Chainstack updates: October 3, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-october-3-2022 * **APIs**. * [Debug and trace APIs](/reference/polygon-debug-trace-rpc-methods) for elastic Polygon archive nodes. * [Debug and trace APIs](/reference/avalanche-debug-trace-rpc-methods) for elastic Avalanche archive nodes. # Chainstack updates: October 5, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-october-3-2022-1 * **Protocols**. Ethereum [Sepolia testnet support](/docs/protocols-networks) for full nodes. * **Documentation**. Solana API [reference](/reference/solana-getting-started).
# Chainstack updates: October 31, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-october-31-2023 **Protocols**. You can now deploy global elastic nodes and dedicated nodes for Arbitrum Sepolia Testnet. # Chainstack updates: October 5, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-october-5-2023 * **Global elastic node**. Now you can deploy global elastic nodes for Aurora Mainnet. * **Protocols**. All Fuse networks are now deprecated. # Chainstack updates: September 08, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-september-08-2023 **Protocols**. Now you can deploy elastic zkSync Era archive nodes with the debug and trace option enabled. # Chainstack updates: September 10, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-september-10-2024 **Protocols**. Now, you can deploy global elastic nodes for TON Mainnet & Testnet. See also [TON tooling](/docs/ton-tooling), [TON tutorials](/docs/protocols-tutorials), and the [TON API reference](/reference/getting-started-ton). # Chainstack updates: September 12, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-september-12-2022 * **Protocols**. [Aurora](/docs/protocols-networks) support. * **Documentation**. Aurora operations. # Chainstack updates: September 13, 2023 Source: https://docs.chainstack.com/changelog/chainstack-updates-september-13-2023 **Chainstack Subgraphs**. Elastic indexer is now available for Ethereum Sepolia Testnet and Base Mainnet and Testnet. # Chainstack updates: September 13, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-september-13-2024 **Platform**. You can now access your request stats for nodes and subgraphs. See [Statistics](/docs/see-statistics). # Chainstack updates: September 15, 2022 Source: https://docs.chainstack.com/changelog/chainstack-updates-september-15-2022 * **Protocols**.
[Ethereum](/docs/protocols-networks) consensus layer Beacon Chain support due to the Merge. * **APIs**. [Ethereum consensus layer Beacon Chain API](/reference/ethereum-getting-started). * **Documentation**. Beacon Chain API [reference](/reference/ethereum-getting-started) added to Ethereum API reference. # Chainstack updates: September 16, 2020 Source: https://docs.chainstack.com/changelog/chainstack-updates-september-16-2020 * **Protocols**. * Ethereum Rinkeby support. * Quorum 2.7.0 support. * **GraphQL**. You can now query dedicated Ethereum full nodes and archive nodes with GraphQL. # Chainstack updates: September 19, 2024 Source: https://docs.chainstack.com/changelog/chainstack-updates-september-19-2024 **Faucet**. The [Chainstack faucet](https://faucet.chainstack.com/) can now disperse testnet TON. # Chainstack updates: September 3, 2019 Source: https://docs.chainstack.com/changelog/chainstack-updates-september-3-2019 * **Security**. [Protected endpoints](https://chainstack.com/protected-endpoints-for-ethereum-and-quorum-nodes-on-chainstack/) added for Quorum and Ethereum nodes. * **Deployment**. Bolt snapshots are now updated hourly so that nodes are deployed and synchronized with fresher snapshots. * **Protocols.** Numerous stability improvements for Quorum networks and nodes. * **Activity and events**. In-platform notifications and activity log introduced to provide visibility into account activity. * **Documentation.** Complete content and structure overhaul for improved access, browsing, and discovery. # Welcome to Chainstack Developer Portal Source: https://docs.chainstack.com/changelog/welcome-to-chainstack-developer-portal Welcome to the developer hub and documentation for Chainstack Developer Portal! # About Marketplace Source: https://docs.chainstack.com/docs/about-marketplace The [Chainstack Marketplace](/marketplace/) is designed to meet the needs of developers and connect tools to the Web3 infrastructure to build high-quality apps and services. 
In addition to marketing and promotion, Chainstack Marketplace handles billing and integration with the infrastructure layer. This enables app developers to focus on bringing valuable enhancements and new features to Web3 services. Chainstack Marketplace includes the following types of products:

* Third-party apps to install on your nodes that add advanced capabilities to your networks.
* Complementary services that extend your nodes and networks.
* Tools designed to complement the blockchain development process.

Learn how to interact with [Chainstack Marketplace and its applications](/docs/chainstack-marketplace-tutorials). # Introduction Source: https://docs.chainstack.com/docs/advanced-apis-introduction An application programming interface (API) provides the necessary tools for modern software development and gives developers the capability to unlock higher levels of functionality within applications. Advanced APIs offer an enhanced user experience due to their extended capabilities compared with basic APIs. Advanced APIs may include features such as authentication and authorization, WebSockets, advanced data structures and analytics, and other developer tools.
# Aptos methods Source: https://docs.chainstack.com/docs/aptos-methods | Method | Availability | Comment | | -------------------------------------------------------------- | --------------------------------------------- | ------- | | /v1 | | | | /v1/accounts/\{address} | | | | /v1/accounts/\{address}/events/\{creation\_number} | | | | /v1/accounts/\{address}/events/\{event\_handle}/\{field\_name} | | | | /v1/accounts/\{address}/module/\{module\_name} | | | | /v1/accounts/\{address}/modules | | | | /v1/accounts/\{address}/resource/\{resource\_type} | | | | /v1/accounts/\{address}/resources | | | | /v1/accounts/\{address}/transactions | | | | /v1/blocks/by\_height/\{block\_height} | | | | /v1/blocks/by\_version/\{version} | | | | /v1/estimate\_gas\_price | | | | /v1/healthy | | | | /v1/spec | | | | /v1/submit\_transactions | | | | /v1/tables/\{table\_handle}/item | | | | /v1/tables/\{table\_handle}/raw\_item | | | | /v1/transactions | | | | /v1/transactions/batch | | | | /v1/transactions/by\_hash/\{txn\_hash} | | | | /v1/transactions/by\_version/\{txn\_version} | | | | /v1/transactions/encode\_submission | | | | /v1/transactions/simulate | | | | /v1/view | | | # Aptos tooling Source: https://docs.chainstack.com/docs/aptos-tooling ## Martian wallet You can set your [Martian wallet](https://martianwallet.xyz/) to interact through your Aptos nodes deployed with Chainstack. Open your Martian wallet and click the network selector. In the network selector, click **ADD CUSTOM NETWORK**. In the **Enter Node Url** field, enter the endpoint. See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). In the **Enter Faucet Url** field, enter the [Aptos testnet faucet](https://aptoslabs.com/testnet-faucet) URL. Click **ADD NETWORK**. ## REST API Interact with your Aptos node using the [Aptos Node API](https://fullnode.devnet.aptoslabs.com/v1/spec#/). Use your Chainstack Aptos REST endpoint. 
Example to get block information by block height:

```bash cURL
curl --request GET \
  --url YOUR_CHAINSTACK_ENDPOINT/v1/blocks/by_height/2047048 \
  --header 'Content-Type: application/json'
```

where YOUR\_CHAINSTACK\_ENDPOINT is your HTTPS endpoint protected either with the key or password. # Aptos: Publish a module to save & retrieve a message on-chain Source: https://docs.chainstack.com/docs/aptos-tutorial-publish-a-module-to-save-and-retrieve-a-message-on-aptos **TLDR** * Aptos uses the Move language and “published” modules in place of traditional smart contracts. * This tutorial shows you how to set up an Aptos node, initialize a Move project, and publish a simple module that stores a string on-chain. * You’ll run through the basics of module creation, compilation, testing, and then sending transactions to set and retrieve data. * Use the Aptos CLI for everything from local testing to publishing, and wrap up by querying the on-chain data with Aptos’ REST API. ## Main article Aptos uses its own terminology for widely-known Web3 entities. Smart contracts are called Modules and are written in the [Move language](https://move-language.github.io/move/). Modules are also not deployed but *published* on the Aptos chain. The objective of this tutorial is to familiarize you with the Aptos network, the Move language, and modules written in it. By the end of this tutorial, you will be able to publish, test, and interact with Move modules in Aptos. Specifically, in this tutorial, you will: * Initialize an Aptos project using the Aptos CLI. * Publish a module on the Aptos testnet. * Interact with the module to save a message. * Use the Aptos REST API to retrieve the message. ## Prerequisites * [Chainstack account](https://console.chainstack.com/) to deploy an Aptos node. * [Martian Aptos wallet](https://martianwallet.xyz/) to receive testnet Aptos token (APT). * [Aptos CLI](https://github.com/aptos-labs/aptos-core) to compile, publish, and interact with the Move module.
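One Aptos-specific detail worth knowing before you start: the REST API serializes 64-bit integers such as ledger versions, block heights, and event counters as JSON strings, not numbers. A minimal sketch of handling that in Python, using an illustrative (made-up) `/v1` ledger-info payload instead of a live call to `YOUR_CHAINSTACK_ENDPOINT/v1`:

```python
import json

# Illustrative /v1 ledger-info body (values are made up); a real call is:
#   curl YOUR_CHAINSTACK_ENDPOINT/v1
ledger_info = '{"chain_id": 2, "ledger_version": "902360122", "block_height": "2047048"}'

info = json.loads(ledger_info)
# Aptos returns u64 values as JSON strings, so convert before doing arithmetic
block_height = int(info["block_height"])
print(block_height)  # 2047048
```

The same conversion applies to fields such as `counter` and `creation_num` that you will see in resource responses later in this tutorial.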
## Overview To get from zero to publishing your string via the module to Aptos testnet, do the following: With Chainstack, create a public chain project. With Chainstack, join Aptos testnet. With Chainstack, access your Aptos node credentials. Set up your Martian wallet to work through the Chainstack Aptos node. Fund your account through the [Aptos testnet faucet](https://aptoslabs.com/testnet-faucet). Install the [Aptos CLI](https://aptos.dev/tools/aptos-cli/install-cli/). Create a Move project. Create and configure your Aptos project. Create a module in the Move language. Compile and test the Move module. Publish the Move module. Save and retrieve a message on the Aptos chain. ## Step-by-step ### Create a public chain project See [Create a project](/docs/manage-your-project#create-a-project). ### Join the Aptos testnet See [Join a public network](/docs/manage-your-networks#join-a-public-network). ### Get node access and credentials See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials). ### Set up Martian wallet See [Aptos tooling: Martian wallet](/docs/aptos-tooling#martian-wallet). ### Fund your account Your account needs to pay fees in testnet APT to publish the module and interact with it. Fund your account with the [Aptos testnet faucet](https://aptoslabs.com/testnet-faucet). ### Install the Aptos CLI You need the Aptos CLI to interact with your Move module. Set up the [Aptos CLI](https://aptos.dev/tools/aptos-cli/install-cli/). ### Create a Move project 1. In your project directory, create a Move project:

```bash Shell
aptos move init --name save-message
```

where `save-message` — name of the package. This creates a `sources` directory and a `Move.toml` file. 2. Open your `Move.toml` file and edit it to add `[addresses]` and `[dev-addresses]`, where: * `dev = "_"` — your default Aptos account. * `dev = "0xC0FFEE"` — an alternative Aptos account for tests.
Example:

```toml TOML
[package]
name = 'save-message'
version = '1.0.0'

[addresses]
dev = "_"

[dev-addresses]
dev = "0xC0FFEE"

[dependencies.AptosFramework]
git = 'https://github.com/aptos-labs/aptos-core.git'
rev = 'main'
subdir = 'aptos-move/framework/aptos-framework'
```

Note that packages have one-time names. If you want to re-publish the package, you must change its name. ### Create and configure an Aptos project 1. In your project directory, run `aptos init` and select the `custom` network. This will start a configuration process, during which you need to set up your Chainstack endpoint and Martian wallet private key. Adding the private key will retrieve your Aptos public address automatically. 2. Add your [Aptos node endpoint](/docs/manage-your-node#view-node-access-and-credentials) deployed with Chainstack. 3. At the faucet URL request, type `skip` since you have already funded your account in the previous step. 4. Paste your Martian wallet private key to finish configuring your project. The key is used to send transactions and retrieve your public address. Example of a successful result:

```bash Shell
Aptos CLI is now set up for account ...4474 as profile default! Run `aptos --help` for more information about commands
{
  "Result": "Success"
}
```

As a result, you get a `.aptos` directory with a `config.yaml` file inside it. In `config.yaml`, you will find your project setup. ### Create a Move module In your Move project directory, navigate to the `sources` directory.
Create your Move module file `message.move`, which allows you to call the `set_message` function and save a string on-chain:

```rust Move
module dev::message {
    use std::error;
    use std::signer;
    use std::string;
    use aptos_framework::account;
    use aptos_framework::event;

    //:!:>resource
    struct MessageHolder has key {
        message: string::String,
        message_change_events: event::EventHandle<MessageChangeEvent>,
    }
    //<:!:resource

    struct MessageChangeEvent has drop, store {
        from_message: string::String,
        to_message: string::String,
    }

    /// There is no message present
    const ENO_MESSAGE: u64 = 0;

    public fun get_message(addr: address): string::String acquires MessageHolder {
        assert!(exists<MessageHolder>(addr), error::not_found(ENO_MESSAGE));
        *&borrow_global<MessageHolder>(addr).message
    }

    public entry fun set_message(account: signer, message: string::String)
    acquires MessageHolder {
        let account_addr = signer::address_of(&account);
        if (!exists<MessageHolder>(account_addr)) {
            move_to(&account, MessageHolder {
                message,
                message_change_events: account::new_event_handle<MessageChangeEvent>(&account),
            })
        } else {
            let old_message_holder = borrow_global_mut<MessageHolder>(account_addr);
            let from_message = *&old_message_holder.message;
            event::emit_event(&mut old_message_holder.message_change_events, MessageChangeEvent {
                from_message,
                to_message: copy message,
            });
            old_message_holder.message = message;
        }
    }

    #[test(account = @0x1)]
    public entry fun sender_can_set_message(account: signer) acquires MessageHolder {
        let addr = signer::address_of(&account);
        aptos_framework::account::create_account_for_test(addr);
        set_message(account, string::utf8(b"Hello Chainstack dev"));
        assert!(
            get_message(addr) == string::utf8(b"Hello Chainstack dev"),
            ENO_MESSAGE
        );
    }
}
```

### Compile and test the Move module 1. To compile your Move module, run:

```bash Shell
aptos move compile --named-addresses dev=default
```

2. After the module compiles, run a built-in test that checks whether the `set_message` and `get_message` functions work:

```bash Shell
aptos move test
```

### Publish the Move module 1.
Publish your compiled and tested Move module by running:

```bash Shell
aptos move publish --named-addresses dev=default
```

2. Type `yes` to confirm publishing the transaction on the Aptos chain. The module will publish and the terminal will return the module information. You can use the transaction hash to retrieve transaction details. To do so, run:

```bash cURL
curl --location --request GET 'YOUR_CHAINSTACK_ENDPOINT/v1/transactions/by_hash/0x815cecb01a962331ca34904653a26715e6cd8e631d2d1b7da17b593adda1ea65' \
  --header 'Content-Type: application/json'
```

where YOUR\_CHAINSTACK\_ENDPOINT is the Aptos node endpoint you used earlier. ### Save and retrieve a message on the Aptos chain 1. To save a message on the Aptos chain, run:

```bash Shell
aptos move run --function-id 'default::message::set_message' --args 'string:Hello Chainstack dev'
```

where: * `run` — a Move command to call functions * `function-id` — a function to call * `args` — arguments of the function 2. Type `yes` to confirm publishing the transaction on the Aptos chain. 3. Retrieve the published message via the REST API by running:

```bash cURL
curl --location --request GET 'YOUR_CHAINSTACK_ENDPOINT/v1/accounts/c0e0ce57eaf9680ae67252fb3126f25aa86bb098b05f7b72cf0cf0de57c72a7f/resource/0xc0e0ce57eaf9680ae67252fb3126f25aa86bb098b05f7b72cf0cf0de57c72a7f::message::MessageHolder' \
  --header 'Content-Type: application/json'
```

where YOUR\_CHAINSTACK\_ENDPOINT is the Aptos node endpoint you used earlier.
Successful response example:

```json JSON
{
  "type": "0xc0e0ce57eaf9680ae67252fb3126f25aa86bb098b05f7b72cf0cf0de57c72a7f::message::MessageHolder",
  "data": {
    "message": "Hello Chainstack dev",
    "message_change_events": {
      "counter": "0",
      "guid": {
        "id": {
          "addr": "0xc0e0ce57eaf9680ae67252fb3126f25aa86bb098b05f7b72cf0cf0de57c72a7f",
          "creation_num": "4"
        }
      }
    }
  }
}
```

## Conclusion This tutorial guided you through the basics of creating, publishing, and testing a simple module that saves a string on the Aptos chain. ### About the author Developer Advocate @ Chainstack BUIDLs on EVM, The Graph protocol, and Starknet Helping people understand Web3 and blockchain development [](https://github.com/soos3d) [](https://twitter.com/web3Dav3) [](https://www.linkedin.com/in/davide-zambiasi/) # Arbitrum methods Source: https://docs.chainstack.com/docs/arbitrum-methods See also [interactive Arbitrum API call examples](/reference/arbitrum-getting-started). | Method | Availability | Comment | | ---------------------------------------- | --------------------------------------------- | ------- | | eth\_accounts | | | | eth\_blockNumber | | | | eth\_call | | | | eth\_chainId | | | | eth\_estimateGas | | | | eth\_feeHistory | | | | eth\_gasPrice | | | | eth\_getAccount | | | | eth\_getBalance | | | | eth\_getBlockByHash | | | | eth\_getBlockByNumber | | | | eth\_getBlockReceipts | | | | eth\_getBlockTransactionCountByHash | | | | eth\_getBlockTransactionCountByNumber | | | | eth\_getCode | | | | eth\_getFilterChanges | | | | eth\_getFilterLogs | | | | eth\_getLogs | | | | eth\_getProof | | | | eth\_getStorageAt | | | | eth\_getTransactionByBlockHashAndIndex | | | | eth\_getTransactionByBlockNumberAndIndex | | | | eth\_getTransactionByHash | | | | eth\_getTransactionCount | | | | eth\_getTransactionReceipt | | | | eth\_getUncleCountByBlockHash | | | | eth\_getUncleCountByBlockNumber | | | | eth\_maxPriorityFeePerGas | | | | eth\_newBlockFilter | | | | eth\_newFilter | | | |
eth\_newPendingTransactionFilter | | | | eth\_signTransaction | | | | eth\_subscribe | | | | eth\_syncing | | | | eth\_uninstallFilter | | | | eth\_unsubscribe | | | | eth\_sendRawTransaction | | | | net\_listening | | | | net\_peerCount | | | | net\_version | | | | web3\_clientVersion | | | | web3\_sha3 | | | | debug\_getBadBlocks | | | | debug\_storageRangeAt | | | | debug\_getTrieFlushInterval | | | | debug\_traceBlock | | | | debug\_traceBlockByHash | | | | debug\_traceBlockByNumber | | | | debug\_traceCall | | | | debug\_traceTransaction | | | | arbtrace\_block | | | | arbtrace\_call | | | | arbtrace\_callMany | | | | arbtrace\_filter | | | | arbtrace\_replayBlockTransactions | | | | arbtrace\_replayTransaction | | | | arbtrace\_transaction | | | | admin\_addPeer | | | | admin\_addTrustedPeer | | | | admin\_datadir | | | | admin\_exportChain | | | | admin\_importChain | | | | admin\_nodeInfo | | | | admin\_peerEvents | | | | admin\_peers | | | | admin\_removePeer | | | | admin\_removeTrustedPeer | | | | admin\_startHTTP | | | | admin\_startWS | | | | admin\_stopHTTP | | | | admin\_stopWS | | | # Arbitrum tooling Source: https://docs.chainstack.com/docs/arbitrum-tooling ## MetaMask On [node access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**. ## Truffle Configure [Truffle Suite](https://truffleframework.com) to deploy contracts to your Arbitrum nodes. Install [Truffle Suite](https://truffleframework.com), [HD Wallet-enabled Web3 provider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider), and create a project. Create a new environment in `truffle-config.js`, add your mnemonic phrase generated by [a wallet](https://ethereum.org/en/wallets/find-wallet/) and the Arbitrum endpoint instead of YOUR\_CHAINSTACK\_ENDPOINT: ```javascript Javascript const HDWalletProvider = require("@truffle/hdwallet-provider"); const mnemonic = 'pattern enroll upgrade ...'; ... 
module.exports = { networks: { chainstack: { provider: () => new HDWalletProvider(mnemonic, "YOUR_CHAINSTACK_ENDPOINT"), network_id: "*" }, } }; ``` ## Hardhat Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your Arbitrum nodes. Install [Hardhat](https://hardhat.org/) and create a project. Create a new environment in `hardhat.config.js`: ```javascript Javascript require("@nomiclabs/hardhat-waffle"); ... module.exports = { solidity: "0.7.3", networks: { chainstack: { url: "YOUR_CHAINSTACK_ENDPOINT", accounts: ["YOUR_PRIVATE_KEY"] }, } }; ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). * YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract Run `npx hardhat run scripts/deploy.js --network chainstack` and Hardhat will deploy using Chainstack. See also [Forking EVM-compatible mainnet with Hardhat](https://support.chainstack.com/hc/en-us/articles/900004242406). ## Remix IDE To make Remix IDE interact with the network through a Chainstack node: Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Interacting through MetaMask](/docs/arbitrum-tooling#metamask). In Remix IDE, navigate to the **Deploy** tab. Select **Injected Provider - MetaMask** in **Environment**. This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node. ## web3.js Build DApps using [web3.js](https://github.com/ethereum/web3.js/) and Arbitrum nodes deployed with Chainstack. Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP or WebSocket.
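Endpoints in this guide are protected either with a key embedded in the URL or with a username and password. For password-protected access, the credentials are encoded into the URL itself. A small sketch with a hypothetical helper name (not part of web3.js) shows how such a URL is formed; the resulting string can be passed to any of the providers in this section:

```javascript
// Build a basic-auth URL for a password-protected endpoint.
// Hypothetical helper for illustration; not part of web3.js.
function buildAuthEndpoint(scheme, username, password, hostname) {
  // encodeURIComponent guards against special characters in credentials
  return `${scheme}://${encodeURIComponent(username)}:${encodeURIComponent(password)}@${hostname}`;
}

console.log(buildAuthEndpoint('https', 'USERNAME', 'PASSWORD', 'HOSTNAME'));
// → https://USERNAME:PASSWORD@HOSTNAME
```

Key-protected endpoints need no such assembly; use the full URL from the node access details as-is.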
### HTTP Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. ### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node WSS endpoint protected either with the key or password. ## web3.py Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Arbitrum nodes deployed with Chainstack. Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `HTTPProvider` to connect to your node endpoint and get the latest block number. ```python Key Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.blockNumber) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.blockNumber) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * HOSTNAME — your node HTTPS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). 
### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number. ```python Key Protected from web3 import Web3 web3 = Web3(Web3.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.blockNumber) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.WebsocketProvider('wss://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.blockNumber) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password * HOSTNAME — your node WSS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [WebSocket connection to an EVM node](https://support.chainstack.com/hc/en-us/articles/900001918763-WebSocket-connection-to-an-Ethereum-node). ## web3.php Build DApps using [web3.php](https://github.com/web3p/web3.php) and Arbitrum nodes deployed with Chainstack. Install [web3.php](https://github.com/web3p/web3.php). Connect over HTTP: ```php Php <?php require_once "vendor/autoload.php"; use Web3\Web3; $web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT'); ?> ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. Use [JSON-RPC methods](https://eth.wiki/json-rpc/API) to interact with the node. Example to get the latest block number: ```php Php <?php require_once "vendor/autoload.php"; use Web3\Web3; $web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT'); $eth = $web3->eth; $eth->blockNumber(function ($err, $data) { print "$data \n"; }); ?> ``` ## web3j Build DApps using [web3j](https://github.com/web3j/web3j) and Arbitrum nodes deployed with Chainstack. Use the `HttpService` object to connect to your node endpoint.
Example to get the latest block number: ```java Java package getLatestBlock; import java.io.IOException; import java.util.logging.Level; import java.util.logging.Logger; import org.web3j.protocol.Web3j; import org.web3j.protocol.core.DefaultBlockParameterName; import org.web3j.protocol.core.methods.response.EthBlock; import org.web3j.protocol.exceptions.ClientConnectionException; import org.web3j.protocol.http.HttpService; import okhttp3.Authenticator; import okhttp3.Credentials; import okhttp3.OkHttpClient; import okhttp3.Request; import okhttp3.Response; import okhttp3.Route; public final class App { private static final String USERNAME = "USERNAME"; private static final String PASSWORD = "PASSWORD"; private static final String ENDPOINT = "ENDPOINT"; public static void main(String[] args) { try { OkHttpClient.Builder clientBuilder = new OkHttpClient.Builder(); clientBuilder.authenticator(new Authenticator() { @Override public Request authenticate(Route route, Response response) throws IOException { String credential = Credentials.basic(USERNAME, PASSWORD); return response.request().newBuilder().header("Authorization", credential).build(); } }); HttpService service = new HttpService(ENDPOINT, clientBuilder.build(), false); Web3j web3 = Web3j.build(service); EthBlock.Block latestBlock = web3.ethGetBlockByNumber(DefaultBlockParameterName.LATEST, false).send().getBlock(); System.out.println("Latest Block: #" + latestBlock.getNumber()); } catch (IOException | ClientConnectionException ex) { Logger.getLogger(App.class.getName()).log(Level.SEVERE, null, ex); } } } ``` where * ENDPOINT — your node HTTPS endpoint * USERNAME — your node access username * PASSWORD — your node access password See also [the full code on GitHub](https://github.com/chainstack/web3j-getLatestBlock). ## ethers.js Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Arbitrum nodes deployed with Chainstack. Install [ethers.js](https://www.npmjs.com/package/ethers).
Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number: ```javascript Key Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` ```javascript Password Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT', user: 'USERNAME', password: 'PASSWORD' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) * NETWORK\_ID — Arbitrum network ID: * Mainnet: `42161` * Testnet: `421613` See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). ### WebSocket Use the `WebSocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const { ethers } = require("ethers"); const provider = new ethers.providers.WebSocketProvider('YOUR_CHAINSTACK_ENDPOINT', NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint * NETWORK\_ID — Arbitrum network ID: * Mainnet: `42161` * Testnet: `421613` See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). ## Brownie Install [Brownie](https://eth-brownie.readthedocs.io/en/stable/install.html). 
Use the `brownie networks add` command with the node endpoint: ```shell Shell brownie networks add Arbitrum ID name="NETWORK_NAME" host=YOUR_CHAINSTACK_ENDPOINT chainid=NETWORK_ID ``` where * ID — any name that you will use as the network tag to run a deployment. For example, `chainstack-mainnet`. * NETWORK\_NAME — any name that you want to identify the network by in the list of networks. For example, **Mainnet (Chainstack)**. * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password * NETWORK\_ID — Arbitrum network ID: * Mainnet: `42161` * Testnet: `421613` Example to run the deployment script: ```shell Shell brownie run deploy.py --network chainstack-mainnet ``` ## Foundry Install [Foundry](https://github.com/gakonst/foundry/). Use `--rpc-url` to run the operation through your Chainstack node. ### Forge Use [forge](https://github.com/gakonst/foundry/tree/master/forge) to develop, test, and deploy your smart contracts. To deploy a contract: ```shell Shell forge create CONTRACT_NAME --contracts CONTRACT_PATH --private-key YOUR_PRIVATE_KEY --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where * CONTRACT\_NAME — name of the contract in the Solidity source code * CONTRACT\_PATH — path to your smart contract * YOUR\_PRIVATE\_KEY — the private key to your funded account that you will use to deploy the contract * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password ### Cast Use [cast](https://github.com/gakonst/foundry/tree/master/cast) to interact with the network and the deployed contracts.
To get the latest block number: ```shell Shell cast block-number --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password # Arbitrum: L1 to L2 messaging smart contract Source: https://docs.chainstack.com/docs/arbitrum-tutorial-l1-to-l2-messaging-smart-contract ### Deprecation notice As the Goerli testnet has been deprecated, this guide is for historical reference. Sending a message from the Ethereum chain (L1) to the Arbitrum chain (L2) does not involve the state challenge period and is as fast as the block confirmation time on L1 and L2 combined. In this tutorial, you will: * Deploy greeter contracts on Ethereum and on Arbitrum. * Send a message from the greeter contract deployed on Ethereum (L1) to the greeter contract deployed on Arbitrum (L2). ## Prerequisites * [Chainstack account](https://console.chainstack.com/) to deploy an Ethereum node and an Arbitrum node. * [MetaMask](https://metamask.io/) to fund your account on L2 with GoerliETH. ## Overview To get from zero to your first L1 to L2 message, do the following: With Chainstack, create a public chain project. With Chainstack, join the Ethereum Goerli testnet. With Chainstack, join the Arbitrum Goerli testnet. Set up your MetaMask to work through the Chainstack Ethereum and Arbitrum nodes. Fund your account through a faucet on the Ethereum Goerli testnet and on the Arbitrum Goerli testnet. Run the tutorial script to deploy the contracts on L1 and L2 and send the message from L1 to L2. ## Step-by-step ### Create a public chain project See [Create a project](/docs/manage-your-project#create-a-project). ### Join the Ethereum and Arbitrum Goerli testnets Deploy a node on the Ethereum Goerli testnet and a node on the Arbitrum Goerli testnet. See [Join a public network](/docs/manage-your-networks#join-a-public-network). 
### Get the access and credentials to your deployed nodes See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials). ### Set up MetaMask See [Arbitrum tooling: MetaMask](/docs/arbitrum-tooling#metamask). ### Fund your account Your account will need Goerli ether on both the Ethereum Goerli testnet and the Arbitrum Goerli testnet as you will deploy a contract on each of the chains. * Ethereum Goerli faucet * Arbitrum Goerli faucet: see [Nitro Goerli Rollup](https://developer.offchainlabs.com/public-chains) in the Arbitrum documentation The default Arbitrum Goerli faucet may fund your account with 0.001 GoerliETH, which is not enough to deploy the greeter contract on L2. If you do not have enough GoerliETH on L2, you may bridge some more from the Ethereum Goerli testnet using the [Arbitrum bridge](https://bridge.arbitrum.io/). ### Clone and prepare the tutorials repository You will use the [Offchain Labs tutorials repository](https://github.com/OffchainLabs/arbitrum-tutorials) to deploy the contracts and send the message. Clone the repository: ```bash Shell git clone https://github.com/OffchainLabs/arbitrum-tutorials.git ``` Change to `arbitrum-tutorials/packages/greeter`. Install dependencies by running `yarn`. Set up the `.env` file by renaming the sample one in `arbitrum-tutorials/packages/greeter`: ```bash Shell cp .env-sample .env ``` In the `.env` file, add your account key and the endpoints: * DEVNET\_PRIVKEY — the private key of your account that has GoerliETH both on the Ethereum Goerli testnet and the Arbitrum Goerli testnet. * L2RPC — the Chainstack HTTPS endpoint of your Arbitrum node deployed on the Arbitrum Goerli testnet. * L1RPC — the Chainstack HTTPS endpoint of your Ethereum node deployed on the Ethereum Goerli testnet. 
Example: ```env env DEVNET_PRIVKEY=YOUR_DEVNET_PRIVATE_KEY L2RPC=YOUR_CHAINSTACK_ENDPOINT L1RPC=YOUR_CHAINSTACK_ENDPOINT ``` ### Deploy the contract and send the message from L1 to L2 You are now all set to run the tutorial script that will deploy the greeter contracts and send a message from L1 to L2. In `arbitrum-tutorials/packages/greeter`, run: ```bash Shell yarn run greeter ``` The script will: * Deploy the L1 greeter contract on the Ethereum Goerli testnet. Example: [0x9B4F541D6A82Beb594Ee2A1EfF14d88f2898176c](https://goerli.etherscan.io/address/0x9B4F541D6A82Beb594Ee2A1EfF14d88f2898176c). * Deploy the L2 greeter contract on the Arbitrum Goerli testnet. Example: [0x890443aB733bd527F0036aEd3E249358a30Ff3ce](https://goerli-rollup-explorer.arbitrum.io/address/0x890443aB733bd527F0036aEd3E249358a30Ff3ce). * On the L1 contract, [set the L2 contract address](https://goerli.etherscan.io/tx/0xbd20609976a96ce791eae71dae0e87a254f542eab1ab400ce8b4681cc4f6b5aa). * On the L2 contract, [set the L1 contract address](https://goerli-rollup-explorer.arbitrum.io/tx/0x98dcfec500561985cdaf0f3933f1b361b3106edc055e0a2644c0f67396596d42/internal-transactions). * Retrieve the current gas costs for the transaction off the [ArbRetryableTx contract on L2](https://goerli-rollup-explorer.arbitrum.io/address/0x000000000000000000000000000000000000006E/read-contract#address-tabs). See also Arbitrum documentation: [Messaging Between Layers](https://developer.offchainlabs.com/arbos/l1-to-l2-messaging). * Using the retrieved gas cost values, [submit the message transaction on L1](https://goerli.etherscan.io/tx/0xa39ecbb53844d009dc121825c26b0608def2c4117d81a6ebeb6000fcf304ac9e). The transaction will send the message to the [inbox contract on L1](https://goerli.etherscan.io/address/0x6BEbC4925716945D46F0Ec336D5C2564F419682C#readProxyContract). See also Arbitrum documentation: [Contract addresses](https://developer.offchainlabs.com/useful-addresses). 
* The transaction will then be submitted as a retryable ticket by the [ArbRetryableTx contract on L2](https://goerli-rollup-explorer.arbitrum.io/address/0x000000000000000000000000000000000000006E/). [Example](https://goerli-rollup-explorer.arbitrum.io/tx/0xac1f89c9d449145aaa6a715bfb7a678009654191b379c03d20bd0a27b8f6968f). * Then the retryable ticket will be redeemed and change the state in the greeter contract on L2 with the message from the greeter contract on L1. [Example](https://goerli-rollup-explorer.arbitrum.io/tx/0x6c8dd56c1ef93064b7b219154327361c051588dfadf716cc23e9d5e3ed610814). ## Conclusion This tutorial guided you through the basics of creating and deploying a simple greeter contract that sends a message from the Ethereum chain to the Arbitrum chain. The tutorial also provided examples and an explanation of the step-by-step state changes and the contracts involved in the L1 to L2 messaging. ### About the author Director of Developer Experience @ Chainstack Talk to me about all things Web3 20 years in technology | 8+ years in Web3 full time Trusted advisor helping developers navigate the complexities of blockchain infrastructure [](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake) # Aurora methods Source: https://docs.chainstack.com/docs/aurora-methods | Method | Availability | Comment | | ---------------------------------------- | --------------------------------------------- | ------- | | eth\_accounts | | | | eth\_blockNumber | | | | eth\_call | | | | eth\_chainId | | | | eth\_estimateGas | | | | eth\_feeHistory | | | | eth\_gasPrice | | | | eth\_getAccount | | | | eth\_getBalance | | | | eth\_getBlockByHash | | | | eth\_getBlockByNumber | | | | eth\_getBlockReceipts | | | | eth\_getBlockTransactionCountByHash | | | | eth\_getBlockTransactionCountByNumber | | | | eth\_getCode | | | | eth\_getFilterChanges | | | | eth\_getFilterLogs | | | | eth\_getLogs | |
| | eth\_getProof | | | | eth\_getStorageAt | | | | eth\_getTransactionByBlockHashAndIndex | | | | eth\_getTransactionByBlockNumberAndIndex | | | | eth\_getTransactionByHash | | | | eth\_getTransactionCount | | | | eth\_getTransactionReceipt | | | | eth\_getUncleCountByBlockHash | | | | eth\_getUncleCountByBlockNumber | | | | eth\_maxPriorityFeePerGas | | | | eth\_newBlockFilter | | | | eth\_newFilter | | | | eth\_newPendingTransactionFilter | | | | eth\_signTransaction | | | | eth\_subscribe | | | | eth\_syncing | | | | eth\_uninstallFilter | | | | eth\_unsubscribe | | | | eth\_sendRawTransaction | | | | net\_listening | | | | net\_peerCount | | | | net\_version | | | | txpool\_content | | | | txpool\_inspect | | | | txpool\_contentFrom | | | | txpool\_status | | | | web3\_clientVersion | | | | web3\_sha3 | | | | parity\_pendingTransactions | | | # Aurora tooling Source: https://docs.chainstack.com/docs/aurora-tooling ## MetaMask On [node access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**. ## Truffle Configure [Truffle Suite](https://truffleframework.com) to deploy contracts to your Aurora nodes. Install [Truffle Suite](https://truffleframework.com), [HD Wallet-enabled Web3 provider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider), and create a project. Create a new environment in `truffle-config.js`, add your mnemonic phrase generated by [a wallet](https://ethereum.org/en/wallets/find-wallet/) and the Aurora endpoint instead of YOUR\_CHAINSTACK\_ENDPOINT: ```javascript Javascript const HDWalletProvider = require("@truffle/hdwallet-provider"); const mnemonic = 'pattern enroll upgrade ...'; ... module.exports = { networks: { chainstack: { provider: () => new HDWalletProvider(mnemonic, "YOUR_CHAINSTACK_ENDPOINT"), network_id: "*" }, } }; ``` ## Hardhat Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your Aurora nodes.
Install [Hardhat](https://hardhat.org/) and create a project. Create a new environment in `hardhat.config.js`: ```javascript Javascript require("@nomiclabs/hardhat-waffle"); ... module.exports = { solidity: "0.7.3", networks: { chainstack: { url: "YOUR_CHAINSTACK_ENDPOINT", accounts: ["YOUR_PRIVATE_KEY"] }, } }; ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). * YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract Run `npx hardhat run scripts/deploy.js --network chainstack` and Hardhat will deploy using Chainstack. See also [Forking EVM-compatible mainnet with Hardhat](https://support.chainstack.com/hc/en-us/articles/900004242406). ## Remix IDE To make Remix IDE interact with the network through a Chainstack node: Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Interacting through MetaMask](/docs/aurora-tooling#metamask). In Remix IDE, navigate to the **Deploy** tab. Select **Injected Provider - MetaMask** in **Environment**. This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node. ## web3.js Build DApps using [web3.js](https://github.com/ethereum/web3.js/) and Aurora nodes deployed with Chainstack. Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP or WebSocket. ### HTTP Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. 
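Chainstack endpoints for different networks look identical at a glance, so a quick sanity check after connecting is to compare the node's chain ID against the Aurora IDs listed further in this guide (mainnet `1313161554`, testnet `1313161555`). The raw `eth_chainId` JSON-RPC method returns a hex quantity; the sketch below, with a hypothetical helper name, converts it for comparison:

```javascript
// eth_chainId returns a hex quantity such as "0x4e454152";
// parse it to decimal before comparing with the network IDs.
// Hypothetical helper for illustration.
function parseChainId(hexChainId) {
  return parseInt(hexChainId, 16);
}

// Aurora mainnet: 0x4e454152 spells "NEAR" in ASCII
console.log(parseChainId('0x4e454152')); // → 1313161554
console.log(parseChainId('0x4e454153')); // → 1313161555 (testnet)
```

Note that `web3.eth.getChainId()` in web3.js already resolves to a number, so this conversion is only needed when you call the JSON-RPC method directly.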
### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node WSS endpoint protected either with the key or password. ## web3.py Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Aurora nodes deployed with Chainstack. Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `HTTPProvider` to connect to your node endpoint and get the latest block number. ```python Key Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.blockNumber) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.blockNumber) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * HOSTNAME — your node HTTPS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). ### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number. 
```python Key Protected from web3 import Web3 web3 = Web3(Web3.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.blockNumber) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.WebsocketProvider('wss://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.blockNumber) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password * HOSTNAME — your node WSS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [WebSocket connection to an EVM node](https://support.chainstack.com/hc/en-us/articles/900001918763-WebSocket-connection-to-an-Ethereum-node). ## web3.php Build DApps using [web3.php](https://github.com/web3p/web3.php) and Aurora nodes deployed with Chainstack. Install [web3.php](https://github.com/web3p/web3.php). Connect over HTTP: ```php Php <?php require_once "vendor/autoload.php"; use Web3\Web3; $web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT'); ?> ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. Use [JSON-RPC methods](https://eth.wiki/json-rpc/API) to interact with the node. Example to get the latest block number: ```php Php <?php require_once "vendor/autoload.php"; use Web3\Web3; $web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT'); $eth = $web3->eth; $eth->blockNumber(function ($err, $data) { print "$data \n"; }); ?> ``` ## web3j Build DApps using [web3j](https://github.com/web3j/web3j) and Aurora nodes deployed with Chainstack. Use the `HttpService` object to connect to your node endpoint.
Example to get the latest block number: ```java Java package getLatestBlock; import java.io.IOException; import java.util.logging.Level; import java.util.logging.Logger; import org.web3j.protocol.Web3j; import org.web3j.protocol.core.DefaultBlockParameterName; import org.web3j.protocol.core.methods.response.EthBlock; import org.web3j.protocol.exceptions.ClientConnectionException; import org.web3j.protocol.http.HttpService; import okhttp3.Authenticator; import okhttp3.Credentials; import okhttp3.OkHttpClient; import okhttp3.Request; import okhttp3.Response; import okhttp3.Route; public final class App { private static final String USERNAME = "USERNAME"; private static final String PASSWORD = "PASSWORD"; private static final String ENDPOINT = "ENDPOINT"; public static void main(String[] args) { try { OkHttpClient.Builder clientBuilder = new OkHttpClient.Builder(); clientBuilder.authenticator(new Authenticator() { @Override public Request authenticate(Route route, Response response) throws IOException { String credential = Credentials.basic(USERNAME, PASSWORD); return response.request().newBuilder().header("Authorization", credential).build(); } }); HttpService service = new HttpService(ENDPOINT, clientBuilder.build(), false); Web3j web3 = Web3j.build(service); EthBlock.Block latestBlock = web3.ethGetBlockByNumber(DefaultBlockParameterName.LATEST, false).send().getBlock(); System.out.println("Latest Block: #" + latestBlock.getNumber()); } catch (IOException | ClientConnectionException ex) { Logger.getLogger(App.class.getName()).log(Level.SEVERE, null, ex); } } } ``` where * ENDPOINT — your node HTTPS endpoint * USERNAME — your node access username * PASSWORD — your node access password See also [the full code on GitHub](https://github.com/chainstack/web3j-getLatestBlock). ## ethers.js Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Aurora nodes deployed with Chainstack. Install [ethers.js](https://www.npmjs.com/package/ethers).
Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number: ```javascript Key Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` ```javascript Password Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT', user: 'USERNAME', password: 'PASSWORD' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint * USERNAME — your node access username * PASSWORD — your node access password * NETWORK\_ID — Aurora network ID: * Mainnet: `1313161554` * Testnet: `1313161555` See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). ### WebSocket Use the `WebSocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const { ethers } = require("ethers"); const provider = new ethers.providers.WebSocketProvider('YOUR_CHAINSTACK_ENDPOINT', NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint * NETWORK\_ID — Aurora network ID: * Mainnet: `1313161554` * Testnet: `1313161555` See [node access details](/docs/manage-your-node#view-node-access-and-credentials). ## Brownie Install [Brownie](https://eth-brownie.readthedocs.io/en/stable/install.html). 
Use the `brownie networks add` command with the node endpoint: ```shell Shell brownie networks add Aurora ID name="NETWORK_NAME" host=YOUR_CHAINSTACK_ENDPOINT chainid=NETWORK_ID ``` where * ID — any name that you will use as the network tag to run a deployment. For example, `chainstack-mainnet`. * NETWORK\_NAME — any name that you want to identify the network by in the list of networks. For example, **Mainnet (Chainstack)**. * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password * NETWORK\_ID — Aurora network ID: * Mainnet: `1313161554` * Testnet: `1313161555` Example to run the deployment script: ```shell Shell brownie run deploy.py --network chainstack-mainnet ``` ## Foundry Install [Foundry](https://github.com/gakonst/foundry/). Use `--rpc-url` to run the operation through your Chainstack node. ### Forge Use [forge](https://github.com/gakonst/foundry/tree/master/forge) to develop, test, and deploy your smart contracts. To deploy a contract: ```shell Shell forge create CONTRACT_NAME --contracts CONTRACT_PATH --private-key YOUR_PRIVATE_KEY --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where * CONTRACT\_NAME — name of the contract in the Solidity source code * CONTRACT\_PATH — path to your smart contract * YOUR\_PRIVATE\_KEY — the private key to your funded account that you will use to deploy the contract * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password ### Cast Use [cast](https://github.com/gakonst/foundry/tree/master/cast) to interact with the network and the deployed contracts.
To get the latest block number: ```shell Shell cast block-number --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password # Aurora: Simple on-chain governance contract with Remix and OpenZeppelin wizard Source: https://docs.chainstack.com/docs/aurora-tutorial-simple-on-chain-governance-contract-with-remix-and-openzeppelin-wizard **TLDR** * Demonstrates how to create a governance token, a mintable token, and a DAO governor to manage proposals on Aurora. * Uses OpenZeppelin Wizard to generate ERC-20 contracts (one with Votes, one mintable) and a Governor contract, deployed via Remix. * Walks through the governance cycle: delegating votes, creating a proposal, voting, and executing successful proposals to mint new tokens. * Showcases seamless interaction with Aurora’s testnet using MetaMask, Explorer, and your own node endpoint. ## Main article The battle-tested set of [OpenZeppelin contracts](https://www.openzeppelin.com/contracts) and the user-friendly [contract wizard](https://wizard.openzeppelin.com/) make it very easy to run through a simple on-chain governance example. In this tutorial, you will: * Create an ERC-20 governance token contract that you will use to cast votes on the governance proposals. * Create a standard governor contract for your on-chain governance. * Create a mintable ERC-20 contract that will mint new tokens on your successfully passed on-chain governance proposals. * Run a full on-chain governance cycle. ## Prerequisites * Aurora Testnet node. * [Remix IDE](https://remix.ethereum.org/) to compile the contracts and deploy through MetaMask. * [MetaMask](https://metamask.io/) to deploy the contracts through your Chainstack node and interact with the contracts. ## Overview To get from zero to the deployed contracts on the Aurora testnet, do the following: Get an Aurora testnet node. Set up your MetaMask to work through a Chainstack node. 
Fund your account with Aurora ETH on the Aurora testnet. With OpenZeppelin Wizard, create your contracts. With Remix IDE, deploy the contracts on the Aurora testnet. With MetaMask, run your on-chain governance process. ## Step-by-step ### Set up MetaMask See [Aurora tooling: MetaMask](/docs/aurora-tooling#metamask). ### Fund your account You can use the [Aurora faucet](https://aurora.dev/faucet) to fund your account, although the amount of Aurora ether dispensed will not be enough to cover the costs of this tutorial. If you have some Goerli ether, you may bridge it from the Ethereum Goerli testnet to the Aurora testnet using the [Rainbow bridge](https://testnet.rainbowbridge.app/transfer). The transfer may take up to 20 mins. ### Create and deploy your ERC-20 governance token This will be your governance token—the token that you will use to vote on proposals in the governance contract. #### OpenZeppelin Wizard Open [OpenZeppelin Wizard](https://wizard.openzeppelin.com/). Select **ERC20**. Provide a name and a symbol for the token. For example: *GovernanceToken*, *GTK*. In **Premint**, provide the number of tokens to issue to your account. For example: 100. In **Feature**, select **Votes**. Click **Open in Remix**. #### Remix In Remix, click **Compile contract**. Click the deployment tab on your left. Select **Environment** > **Injected Provider - MetaMask**. In **Contract**, select your contract. For example, *GovernanceToken*. Click **Deploy**. This will engage your MetaMask to deploy the contract to the Aurora testnet through your currently selected MetaMask account. Click **Confirm** in the MetaMask modal. ### Verify the contract To use the [Aurora explorer](https://explorer.testnet.aurora.dev/) as a web app to interact with your contracts, verify them in the explorer. In Remix, click **Plugins** > **Flattener** > **Activate**. Select the activated flattener plugin in your left pane and click **Flatten contract**. 
In the [Aurora explorer](https://explorer.testnet.aurora.dev/), navigate to the contract that you deployed. Click **Verify and Publish**. In **Compiler Type**, select **Solidity (Single file)**. In **Compiler version**, select the compiler that you used in Remix. In **Open Source License Type**, select **MIT License (MIT)**. Click **Continue**. In **Optimization**, select **No**. In **Solidity Contract Code**, provide the flattened code of your contract from Remix. Click **Verify and Publish**. You now have the ERC-20 contract verified and your governance tokens in your account—the account that you used to deploy the contract. ### Create and deploy your ERC-20 mintable token contract This will be the contract that will be used by the governance contract to mint tokens on passing the on-chain proposal. #### OpenZeppelin Wizard Open [OpenZeppelin Wizard](https://wizard.openzeppelin.com/). Select **ERC20**. Provide a name and a symbol for the token. For example: *MintableToken*, *MTK*. In **Premint**, keep `0`. In **Feature**, select **Mintable**. Note that this will automatically make the contract *Ownable*. Click **Open in Remix**. #### Remix In Remix, click **Compile contract**. Click the deployment tab on your left. Select **Environment** > **Injected Provider - MetaMask**. In **Contract**, select your contract. For example, *MintableToken*. Click **Deploy**. This will engage your MetaMask to deploy the contract to the Aurora testnet through your currently selected MetaMask account. Click **Confirm** in the MetaMask modal. Verify the contract similarly to the previous one. ### Create and deploy your governance contract This will be your on-chain governance contract. #### OpenZeppelin Wizard Open [OpenZeppelin Wizard](https://wizard.openzeppelin.com/). Select **Governor**. Provide a name for the contract. For example: *MyGovernor*. In **Voting Delay**, keep `1 block`. In **Voting Period**, put `600 blocks`. In **1 block=**, put `1 second`.
This will be about 10 minutes for a voting period. In **Quorum**, switch to **#** and put `1` to make the quorum of 1 participating account pass the proposals. Remove the **Updatable Settings** check mark. In **Votes**, select **ERC20Votes**. Remove the **Timelock** check mark. Click **Open in Remix**. #### Remix In Remix, click **Compile contract**. Click the deployment tab on your left. Select **Environment** > **Injected Provider - MetaMask**. In **Contract**, select your contract. For example, *MyGovernor*. In **Deploy**, provide the address of the ERC-20 governance token that you deployed as the first contract of this tutorial. This will make the deployed contract recognize the ERC-20 token as the contract's voting token. Click **Deploy**. This will engage your MetaMask to deploy the contract to the Aurora testnet through your currently selected MetaMask account. Click **Confirm** in the MetaMask modal. Verify the contract similarly to the previous one. ### Delegate the votes You have preminted the ERC-20 governance token to the account you deployed the contract with. As the governance token owner, you now need to delegate the voting power to an account. For simplicity, you can delegate it to your account: In the [Aurora explorer](https://explorer.testnet.aurora.dev/), open your verified ERC-20 governance contract. Click **Contract** > **Write Contract**. Click **Connect to Web3** > **MetaMask**. Make sure you connect with the same account that you used to deploy the contract as this is the account that holds the preminted tokens. In `delegatee`, provide your account address and click **Write**. This will enable the provided account to cast votes in the governance contract. ### Transfer the ERC-20 mintable contract ownership to the governance contract The ERC-20 mintable contract is ownable. To be able to mint the tokens through the governance contract, you need to transfer the ownership of the ERC-20 contract to the governance contract.
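Under the hood, the transfer is a single call to the OpenZeppelin `Ownable` function `transferOwnership(address)`. As a minimal Python sketch of the calldata that the explorer's write form encodes for you — assuming the standard `Ownable` signature, with its precomputed 4-byte selector `0xf2fde38b` hardcoded and a purely illustrative placeholder address:

```python
# Sketch of the ABI encoding behind transferOwnership(address).
# 0xf2fde38b is the precomputed 4-byte Keccak-256 selector of
# "transferOwnership(address)" from the OpenZeppelin Ownable contract.
SELECTOR = "f2fde38b"

def transfer_ownership_calldata(new_owner: str) -> str:
    # Strip the 0x prefix and left-pad the 20-byte address to a 32-byte word,
    # which is how the ABI encodes a single address argument.
    addr = new_owner[2:] if new_owner.startswith("0x") else new_owner
    return "0x" + SELECTOR + addr.lower().rjust(64, "0")

# Placeholder governor contract address for illustration only.
print(transfer_ownership_calldata("0x" + "ab" * 20))
```

The resulting hex string is exactly what lands in the transaction's data field; the same selector-plus-padded-arguments rule applies to any contract call you encode by hand.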
In the [Aurora explorer](https://explorer.testnet.aurora.dev/), open your verified ERC-20 mintable contract. Click **Contract** > **Write Contract**. Click **Connect to Web3** > **MetaMask**. Make sure you connect with the same account that you used to deploy the contract. In `transferOwnership` > `newOwner`, provide the address of your governance contract. Click **Write**. This will transfer the contract ownership. Your governance contract can now mint the ERC-20 tokens. ### Create a proposal At this point, you have three contracts deployed: * An ERC-20 governance token contract with preminted tokens assigned to your account * An ERC-20 mintable token contract owned by the governance contract * The governance contract to run proposals It is now time to create your proposal. First, get the call data that the governance contract takes as a proposal to execute. For this tutorial, the call data is minting the ERC-20 tokens to an address and an amount to mint: `mint(address to, uint256 amount)`. To get the call data: Go to the [online ABI encoding service](https://abi.hashex.org/). In **Enter your parameters manually**, select: * **Function** > **your function** > `mint`. * **Argument** > **Address** > the address to mint the tokens to. * **Argument** > **Uint256** > the amount of tokens to mint in 18 decimal unit. See [Ethereum unit converter](https://eth-converter.com/). Copy the resulting data and precede it with `0x`. This is your call data to mint the tokens. Now create the actual proposal. In the [Aurora explorer](https://explorer.testnet.aurora.dev/), open your verified governance contract. Click **Contract** > **Write Contract**. Click **Connect to Web3** > **MetaMask**. In `propose`, provide your proposal details: * In `targets`, provide the address of your ERC-20 mintable token. * In `values`, provide the amount of Aurora ether that should be deposited with the proposal. Typically, it is `0`. 
* In `calldatas`, provide the call data that you acquired from the ABI encoder. * In `description`, provide any description. Once the proposal transaction is confirmed, the contract will emit an event that contains the proposal ID. You will need the proposal ID to vote for the proposal and to execute the proposal. To get the proposal ID: In the [Aurora explorer](https://explorer.testnet.aurora.dev/), open your transaction that created the proposal. Switch to **Logs**. In *Data*, copy the `proposalId` value. ### Cast your vote At this point, you have: * A running proposal * The proposal ID * The voting power delegated to your account To cast your vote: In the [Aurora explorer](https://explorer.testnet.aurora.dev/), open your verified governance contract. Click **Contract** > **Write Contract**. Click **Connect to Web3** > **MetaMask**. In `castVote`, provide your vote: * In `proposalId`, paste the proposal ID. * In `support`, provide your voting decision: `0` is against, `1` is for, `2` is abstain. For the proposal to pass, vote `1`. Click **Write**. This will create a transaction that casts your vote. ### Execute the passed proposal Once the proposal is voted on and the voting period reaches the deadline, the proposal is passed. You can now execute the passed proposal. To execute: In the [Aurora explorer](https://explorer.testnet.aurora.dev/), open your verified governance contract. Click **Contract** > **Write Contract**. Click **Connect to Web3** > **MetaMask**. In `execute`, provide details: * In the payable amount, provide `0` as you are minting ERC-20 tokens and not depositing Aurora ether. * In `targets`, provide the address of your ERC-20 mintable token. * In `values`, provide `0` for the amount of Aurora ether. * In `calldatas`, provide the same call data that you generated earlier for the function `mint(address to, uint256 amount)`. * In `descriptionHash`, provide the hash of the description that you used to generate the proposal.
To do this the easy way, go to an [online Keccak-256 generator](https://emn178.github.io/online-tools/keccak_256.html) and paste your description text, copy the result and precede it with `0x`. Click **Write**. This will execute the passed proposal and the governance contract will mint the ERC-20 mintable tokens to the account that you provided in the call data when creating the proposal. ## Conclusion This tutorial guided you through the basics of creating and deploying a set of DAO contracts to mint ERC-20 tokens through on-chain governance on the Aurora testnet using your node. You have also interacted with the contracts to run the full-cycle governance process using the Aurora explorer as a web app and MetaMask as your interaction tool that works through your Aurora node. ### About the author Director of Developer Experience @ Chainstack Talk to me all things Web3. 20 years in technology | 8+ years in Web3 full time. Trusted advisor helping developers navigate the complexities of blockchain infrastructure [](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake) # Authentication methods available on Chainstack Source: https://docs.chainstack.com/docs/authentication-methods-for-different-scenarios **TLDR** * The Chainstack platform API uses a bearer token (API key) passed in the header for requests. * Blockchain node requests on Chainstack often use either the auth token appended to the URL or basic authentication (username and password). * Always keep API keys, usernames, and passwords private to avoid unauthorized node access. * Choose an authentication method based on your security needs and infrastructure constraints – e.g., basic auth for simplicity or key-based for more robust usage. ## Main article This guide provides a comprehensive overview of the different authentication methods that Chainstack offers.
Our goal is to guide you in understanding these options and to assist in selecting the most appropriate method tailored to your specific use cases. Let's explore the authentication methods available and how they apply to various scenarios. ## Authentication methods API authentication is a crucial factor in application programming interface (API) development, used to verify the identities of applications or users utilizing the API. Different kinds of authentication methods are available in the API world; let's briefly explore the five primary methods used for API authentication. 1. **API keys**. These are the simplest form of API authentication. The client includes an API key, a unique identifier, in the header or as a parameter in the URL. The server matches the key to a corresponding key in its database and, if it matches, grants access. Although easy to implement, the API key can be misused if intercepted. 2. **Basic authentication**. This method involves sending a user ID and password with each API request. The credentials are Base-64 encoded but not encrypted, making them easily decipherable by anyone who intercepts the transmission. Basic authentication should always be used over HTTPS to add an additional layer of security. 3. **Digest authentication**. This is a step up from basic authentication, where the client sends a hashed (or digested) version of the password. It's more secure than basic authentication because even if an attacker intercepts the hashed password, they cannot use it to make API requests. 4. **OAuth (open authorization)**. OAuth is more complex but secure. It enables third-party applications to make requests on behalf of a user without needing their password. OAuth2, the latest version, uses short-lived access tokens rather than user credentials for authentication. 5. **JWT (JSON web tokens)**. JWT is a token-based method that allows for stateless authentication.
An encoded string of characters represents a payload of data, which often includes issued at time (iat), expiration time (exp), and not before (nbf) claims. Here's a quick comparison table: | Method | Security | Complexity | | --------------------- | --------------------------- | ---------- | | API Key | Low | Low | | Basic authentication | Medium (if used over HTTPS) | Low | | Digest authentication | Medium | Medium | | OAuth | High | High | | JWT | High | Medium | Remember that choosing the right authentication method for your API depends on your specific use case, including your security needs and the resources available for implementation. The Chainstack platform API only accepts an API key as a bearer token for authentication, so we’ll focus on this in the next section. ### Header authentication (bearer token) Header authentication with a bearer token is a common method employed in API requests. This approach involves attaching an authorization header with a bearer token in each HTTP request to the server. This token is a cryptic string, ensuring that data access is only granted to the token's bearer, thus giving this authentication method its name. In the context of the Chainstack platform, it's crucial to note that header authentication using a bearer token is fully supported for the [platform API requests](/reference/platform-api-getting-started). This means users can authenticate their API calls on the platform by simply including the bearer token in their request headers. However, bearer token authentication is currently unavailable when it comes to blockchain APIs. Blockchain nodes typically don't provide traditional user-based authentication. Chainstack uses API keys or similar mechanisms to authenticate requests to the hosted nodes. However, these are not traditional bearer tokens. The following is an example of how to send a header-authenticated request to the Chainstack platform API using a bearer token.
Check out the Docs to learn [how to generate your Chainstack API key](/reference/platform-api-getting-started#create-api-key). ```bash cURL curl --request GET \ --url https://api.chainstack.com/v1/organization/ \ --header 'accept: application/json' \ --header 'authorization: Bearer YOUR_CHAINSTACK_API_KEY' ``` This example calls the `Get Organization name and ID` API, which returns the organization name and ID belonging to the API key. Replace `YOUR_CHAINSTACK_API_KEY` with the API key you get from the Chainstack console. Example response: ```json JSON { "id": "RG-123-456", "name": "Cool Organization" } ``` ## Selecting the appropriate authentication method When choosing an authentication method for your scenario, consider the following points: * **Use case and purpose**. Identify the specific use case and the purpose of the API requests. Understanding the requirements will guide you in selecting the most suitable authentication method. * **Security and complexity**. Evaluate the level of security and complexity required for your API requests. Basic authentication provides a straightforward approach, whereas API keys offer a secure way to manage different URLs and chains. * **Compatibility and flexibility**. Determine the compatibility of the authentication method with your existing systems and the flexibility it provides for future expansion. ## Authenticating blockchain requests to a node Chainstack offers two sets of credentials to access a node. One is via endpoints incorporating the API key directly in the URL, and the other is through endpoints requiring a username and password for access.
### Access via auth token You can use the endpoint with an auth token, which you can find in your Chainstack console: * HTTPS endpoint example: `https://ethereum-mainnet.core.chainstack.com/YOUR_AUTH_TOKEN` * WebSocket endpoint example: `wss://ethereum-mainnet.core.chainstack.com/ws/YOUR_AUTH_TOKEN` You can use a POST request like the following to access the HTTPS endpoint via a curl command. This example shows how to [retrieve the client version](/reference/ethereum_node_api/client_info/web3_clientVersion), one of the standard Ethereum JSON-RPC methods. ```bash cURL curl -X POST --location 'https://ethereum-mainnet.core.chainstack.com/YOUR_AUTH_TOKEN' \ --header 'Content-Type: application/json' \ --data '{"jsonrpc":"2.0","method":"web3_clientVersion","params":[],"id":1}' ``` Make sure to replace the placeholder endpoint with your endpoint. ### Password-protected access For the blockchain API requests, you can also use basic authentication: * HTTPS endpoint: `https://ethereum-mainnet.core.chainstack.com` * WSS endpoint: `wss://ethereum-mainnet.core.chainstack.com/ws` * Username: `YOUR_USER_NAME` * Password: `YOUR_PASSWORD` You can find your [username and password credentials](/docs/manage-your-node#view-node-access-and-credentials) in the Chainstack console. For password-protected access, you include the username and password in your `curl` command like so: ```bash cURL curl -X POST \ -u YOUR_USER_NAME:YOUR_PASSWORD \ -H "Content-Type: application/json" \ --data '{"jsonrpc":"2.0","method":"web3_clientVersion","params":[],"id":1}' \ https://ethereum-mainnet.core.chainstack.com ``` In this command, `-u YOUR_USER_NAME:YOUR_PASSWORD` includes your username and password for basic authentication. Replace `YOUR_USER_NAME` and `YOUR_PASSWORD` with your actual credentials. Keeping your API key and username/password secure is critical to prevent unauthorized access to your blockchain node.
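To see why basic authentication must travel over HTTPS, here is a minimal sketch in Python (standard library only, placeholder credentials) of the `Authorization` header that `curl -u` builds, and how trivially it decodes:

```python
import base64

# Placeholder credentials -- use the ones from your Chainstack console.
username, password = "YOUR_USER_NAME", "YOUR_PASSWORD"

# curl -u sends the credentials Base-64 encoded in the Authorization header.
token = base64.b64encode(f"{username}:{password}".encode()).decode()
header = f"Basic {token}"
print(f"Authorization: {header}")

# Base-64 is an encoding, not encryption: anyone who intercepts the header
# can recover the credentials instantly, hence the HTTPS requirement.
print(base64.b64decode(token).decode())
```

The second print recovers the exact `username:password` pair, which is the whole point: only the TLS layer keeps these credentials confidential in transit.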
## Conclusion The realm of API authentication is vast, encompassing a variety of methods, each with its distinct security levels and complexities. From the simplest API Keys to the secure OAuth and JWT methods, the choice depends largely on your specific use case and the resources you have available for implementation. In the context of Chainstack, platform API requests are authenticated through a bearer token. However, when it comes to blockchain APIs, bearer token authentication is currently not available, and API keys or basic authentication are the preferred methods. Regardless of your chosen method, it is crucial to remember that the security of your API access points is paramount. Always safeguard your API keys, usernames, and passwords to prevent unauthorized access. Also, consider your use case's specific needs and requirements, including the level of security and complexity required for your API requests and the compatibility and flexibility of the chosen authentication method with your existing systems. Remember, the ultimate goal of API authentication is to facilitate secure access to services, ensure user data privacy, and prevent unauthorized access. By understanding and effectively implementing the right authentication methods, you are well on your way to achieving these goals in your software development endeavors. ### About the author Developer Advocate @ Chainstack BUIDLs on EVM, The Graph protocol, and Starknet Helping people understand Web3 and blockchain development [](https://github.com/soos3d) [](https://twitter.com/web3Dav3) [](https://www.linkedin.com/in/davide-zambiasi/) # Available node methods Source: https://docs.chainstack.com/docs/available-node-methods API methods available for the Chainstack supported protocols Check the table subpages for each of the protocols to see the method availability. If there's anything that you are missing, always feel free to [reach out to us](https://support.chainstack.com/). 
We also provide additional node customizability on request. See also the interactive [API call examples](/reference/blockchain-apis). # Avalanche methods Source: https://docs.chainstack.com/docs/avalanche-methods See also [interactive Avalanche API call examples](/reference/avalanche-getting-started). ## Contract chain (C-Chain) | Method | Availability | Comment | | ---------------------------------------- | --------------------------------------------- | ------- | | eth\_accounts | | | | eth\_blockNumber | | | | eth\_call | | | | eth\_chainId | | | | eth\_estimateGas | | | | eth\_feeHistory | | | | eth\_gasPrice | | | | eth\_getAccount | | | | eth\_getBalance | | | | eth\_getBlockByHash | | | | eth\_getBlockByNumber | | | | eth\_getBlockReceipts | | | | eth\_getBlockTransactionCountByHash | | | | eth\_getBlockTransactionCountByNumber | | | | eth\_getCode | | | | eth\_getFilterChanges | | | | eth\_getFilterLogs | | | | eth\_getLogs | | | | eth\_getProof | | | | eth\_getStorageAt | | | | eth\_getTransactionByBlockHashAndIndex | | | | eth\_getTransactionByBlockNumberAndIndex | | | | eth\_getTransactionByHash | | | | eth\_getTransactionCount | | | | eth\_getTransactionReceipt | | | | eth\_getUncleCountByBlockHash | | | | eth\_getUncleCountByBlockNumber | | | | eth\_get\_asset\_balance | | | | eth\_maxPriorityFeePerGas | | | | eth\_newBlockFilter | | | | eth\_newFilter | | | | eth\_newPendingTransactionFilter | | | | eth\_signTransaction | | | | eth\_subscribe | | | | eth\_syncing | | | | eth\_uninstallFilter | | | | eth\_unsubscribe | | | | eth\_sendRawTransaction | | | | net\_listening | | | | net\_peerCount | | | | net\_version | | | | txpool\_content | | | | txpool\_inspect | | | | txpool\_contentFrom | | | | txpool\_status | | | | web3\_clientVersion | | | | web3\_sha3 | | | | debug\_getBadBlocks | | | | debug\_storageRangeAt | | | | debug\_getTrieFlushInterval | | | | debug\_traceBlock | | | | debug\_traceBlockByHash | | | | debug\_traceBlockByNumber | | | | 
debug\_traceCall | | | | debug\_traceTransaction | | | | avax.getAtomicTx | | | | avax.getAtomicTxStatus | | | | avax.getUTXOs | | | | avax.issueTx | | | | admin\_addPeer | | | | admin\_addTrustedPeer | | | | admin\_datadir | | | | admin\_exportChain | | | | admin\_importChain | | | | admin\_nodeInfo | | | | admin\_peerEvents | | | | admin\_peers | | | | admin\_removePeer | | | | admin\_removeTrustedPeer | | | | admin\_startHTTP | | | | admin\_startWS | | | | admin\_stopHTTP | | | | admin\_stopWS | | | ## Exchange chain (X-Chain) | Method | Availability | Comment | | ----------------------- | --------------------------------------------- | ------- | | avm.buildGenesis | | | | avm.getAddressTxs | | | | avm.getAllBalances | | | | avm.getAssetDescription | | | | avm.getBalance | | | | avm.getBlockByHeight | | | | avm.getHeight | | | | avm.getTx | | | | avm.getTxStatus | | | | avm.getUtxOs | | | | avm.issueTx | | | ## Platform chain (P-Chain) | Method | Availability | Comment | | ----------------------------- | --------------------------------------------- | ------- | | platform.getBalance | | | | platform.getBlockchainStatus | | | | platform.getBlockchains | | | | platform.getCurrentSupply | | | | platform.getCurrentValidators | | | | platform.getFeeConfig | | | | platform.getFeeState | | | | platform.getHeight | | | | platform.getMinStake | | | | platform.getRewardUtxOs | | | | platform.getStake | | | | platform.getStakingAssetId | | | | platform.getSubnets | | | | platform.getTimestamp | | | | platform.getTotalStake | | | | platform.getTx | | | | platform.getTxStatus | | | | platform.getUtxOs | | | | platform.getValidatorsAt | | | | platform.issueTx | | | | platform.sampleValidators | | | | platform.validatedBy | | | | platform.validates | | | # Avalanche tooling Source: https://docs.chainstack.com/docs/avalanche-tooling ## C-Chain ### MetaMask On [node access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**. 
### Truffle Configure [Truffle Suite](https://truffleframework.com) to deploy contracts to your Avalanche nodes. Install [Truffle Suite](https://truffleframework.com), [HD Wallet-enabled Web3 provider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider), and create a project. Create a new environment in `truffle-config.js`, add your mnemonic phrase generated by [a wallet](https://ethereum.org/en/wallets/find-wallet/) and the Avalanche endpoint instead of YOUR\_CHAINSTACK\_ENDPOINT: ```javascript Javascript const HDWalletProvider = require("@truffle/hdwallet-provider"); const mnemonic = 'pattern enroll upgrade ...'; ... module.exports = { networks: { chainstack: { provider: () => new HDWalletProvider(mnemonic, "YOUR_CHAINSTACK_ENDPOINT"), network_id: "*" }, }, }; ``` ### Hardhat Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your Avalanche nodes. Install [Hardhat](https://hardhat.org/) and create a project. Create a new environment in `hardhat.config.js`: ```javascript Javascript require("@nomiclabs/hardhat-waffle"); ... module.exports = { solidity: "0.7.3", networks: { chainstack: { url: "YOUR_CHAINSTACK_ENDPOINT", accounts: ["YOUR_PRIVATE_KEY"] }, } }; ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). * YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract Run `npx hardhat run scripts/deploy.js --network chainstack` and Hardhat will deploy using Chainstack. See also [Forking EVM-compatible mainnet with Hardhat](https://support.chainstack.com/hc/en-us/articles/900004242406). ### Remix IDE To make Remix IDE interact with the network through a Chainstack node: Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Interacting through MetaMask](/docs/avalanche-tooling#metamask).
In Remix IDE, navigate to the **Deploy** tab. Select **Injected Provider - MetaMask** in **Environment**. This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node. ### web3.js Build DApps using [web3.js](https://github.com/ethereum/web3.js/) and Avalanche nodes deployed with Chainstack. Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP or WebSocket. #### HTTP Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. #### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node WSS endpoint protected either with the key or password. ### web3.py Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Avalanche nodes deployed with Chainstack. Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). #### HTTP Use the `HTTPProvider` to connect to your node endpoint and get the latest block number. 
```python Key Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.blockNumber) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.blockNumber) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * HOSTNAME — your node HTTPS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). #### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number. ```python Key Protected from web3 import Web3 web3 = Web3(Web3.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.blockNumber) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.WebsocketProvider('wss://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.blockNumber) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password * HOSTNAME — your node WSS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [WebSocket connection to an EVM node](https://support.chainstack.com/hc/en-us/articles/900001918763-WebSocket-connection-to-an-Ethereum-node). ### web3.php Build DApps using [web3.php](https://github.com/web3p/web3.php) and Avalanche nodes deployed with Chainstack. Install [web3.php](https://github.com/web3p/web3.php). Connect over HTTP: ```php Php <?php require_once "vendor/autoload.php"; use Web3\Web3; $web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT'); ?> ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password Use [JSON-RPC methods](https://eth.wiki/json-rpc/API) to interact with the node.
Example to get the latest block number: ```php Key Protected <?php require_once "vendor/autoload.php"; use Web3\Web3; $web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT'); $eth = $web3->eth; $eth->blockNumber(function ($err, $data) { print "$data \n"; }); ?> ``` ### web3j Build DApps using [web3j](https://github.com/web3j/web3j) and Avalanche nodes deployed with Chainstack. Use the `HttpService` object to connect to your node endpoint. Example to get the latest block number: ```java Java package getLatestBlock; import java.io.IOException; import java.util.logging.Level; import java.util.logging.Logger; import org.web3j.protocol.Web3j; import org.web3j.protocol.core.DefaultBlockParameterName; import org.web3j.protocol.core.methods.response.EthBlock; import org.web3j.protocol.exceptions.ClientConnectionException; import org.web3j.protocol.http.HttpService; import okhttp3.Authenticator; import okhttp3.Credentials; import okhttp3.OkHttpClient; import okhttp3.Request; import okhttp3.Response; import okhttp3.Route; public final class App { private static final String USERNAME = "USERNAME"; private static final String PASSWORD = "PASSWORD"; private static final String ENDPOINT = "ENDPOINT"; public static void main(String[] args) { try { OkHttpClient.Builder clientBuilder = new OkHttpClient.Builder(); clientBuilder.authenticator(new Authenticator() { @Override public Request authenticate(Route route, Response response) throws IOException { String credential = Credentials.basic(USERNAME, PASSWORD); return response.request().newBuilder().header("Authorization", credential).build(); } }); HttpService service = new HttpService(ENDPOINT, clientBuilder.build(), false); Web3j web3 = Web3j.build(service); EthBlock.Block latestBlock = web3.ethGetBlockByNumber(DefaultBlockParameterName.LATEST, false).send().getBlock(); System.out.println("Latest Block: #" + latestBlock.getNumber()); } catch (IOException | ClientConnectionException ex) { Logger.getLogger(App.class.getName()).log(Level.SEVERE, null, ex); } } } ``` where * ENDPOINT — your node HTTPS endpoint * USERNAME — your node access username *
PASSWORD — your node access password See also [the full code on GitHub](https://github.com/chainstack/web3j-getLatestBlock). ### ethers.js Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Avalanche nodes deployed with Chainstack. Install [ethers.js](https://www.npmjs.com/package/ethers). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number: ```javascript Key Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` ```javascript Password Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT', user: 'USERNAME', password: 'PASSWORD' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) * NETWORK\_ID — Avalanche C-Chain network ID: * Mainnet: `43114` * Testnet: `43113` See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). 
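As a small convenience when switching environments, the network IDs listed above can be kept in one map. This is an illustrative helper, not part of ethers.js; the function name is hypothetical.

```javascript
// Avalanche C-Chain network IDs from the list above (helper name is hypothetical)
const AVALANCHE_NETWORK_IDS = { mainnet: 43114, testnet: 43113 };

function networkIdFor(name) {
  const id = AVALANCHE_NETWORK_IDS[name];
  if (id === undefined) {
    throw new Error(`Unknown Avalanche C-Chain network: ${name}`);
  }
  return id;
}
```

The result can then be passed as the second argument to the provider, for example `new ethers.providers.JsonRpcProvider(urlInfo, networkIdFor("mainnet"))`.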
#### WebSocket

Use the `WebSocketProvider` object to connect to your node WSS endpoint and get the latest block number:

```javascript Javascript
const { ethers } = require("ethers");

const provider = new ethers.providers.WebSocketProvider('YOUR_CHAINSTACK_ENDPOINT', NETWORK_ID);
provider.getBlockNumber().then(console.log);
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password
* NETWORK\_ID — Avalanche C-Chain network ID:
  * Mainnet: `43114`
  * Testnet: `43113`

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).

### Brownie

Install [Brownie](https://eth-brownie.readthedocs.io/en/stable/install.html).

Use the `brownie networks add` command with the node endpoint:

```bash Shell
brownie networks add Avalanche ID name="NETWORK_NAME" host=YOUR_CHAINSTACK_ENDPOINT chainid=NETWORK_ID
```

where

* ID — any name that you will use as the network tag to run a deployment. For example, `chainstack-mainnet`.
* NETWORK\_NAME — any name that you want to identify the network by in the list of networks. For example, **Mainnet (Chainstack)**.
* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password
* NETWORK\_ID — Avalanche C-Chain network ID:
  * Mainnet: `43114`
  * Testnet: `43113`

Example to run the deployment script:

```bash Shell
brownie run deploy.py --network chainstack-mainnet
```

### Foundry

Install [Foundry](https://github.com/gakonst/foundry/). Use `--rpc-url` to run the operation through your Chainstack node.

#### Forge

Use [forge](https://github.com/gakonst/foundry/tree/master/forge) to develop, test, and deploy your smart contracts.
To deploy a contract: ```bash Shell forge create CONTRACT_NAME --contracts CONTRACT_PATH --private-key YOUR_PRIVATE_KEY --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where * CONTRACT\_NAME — name of the contract in the Solidity source code * CONTRACT\_PATH — path to your smart contract * YOUR\_PRIVATE\_KEY — the private key to your funded account that you will use to deploy the contract * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password #### Cast Use [cast](https://github.com/gakonst/foundry/tree/master/cast) to interact with the network and the deployed contracts. To get the latest block number: ```bash Shell cast block-number --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password ## X-Chain ### JSON-RPC API Interact with the X-Chain through your Avalanche nodes using JSON-RPC API. Use [curl](https://curl.haxx.se) or [Postman](https://www.getpostman.com) to invoke Avalanche X-Chain API methods. Example below demonstrates how to get AVAX balance of an address through your Avalanche node HTTPS endpoint on the X-Chain mainnet: ```bash Curl curl -X POST --data '{ "jsonrpc":"2.0", "id" : 1, "method" :"avm.getBalance", "params" :{ "address":"X-avax1slt2dhfu6a6qezcn5sgtagumq8ag8we75f84sw", "assetID": "FvwEAhmxKfeiG8SnEvq42hc6whRyY3EFYAvebMqDNDGCgxN5Z" } }' -H 'content-type:application/json;' YOUR_CHAINSTACK_ENDPOINT ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password ### AvalancheJS Install [AvalancheJS](https://github.com/ava-labs/avalanchejs). 
Use [AvalancheJS examples](https://github.com/ava-labs/avalanchejs/tree/master/examples) to interact with the X-Chain through your Avalanche node with the following settings:

```javascript Javascript
const ip: string = "BASE_ENDPOINT"
// const port: number = 9650
const protocol: string = "https"
const networkID: number = CHAIN_ID
const avalanche: Avalanche = new Avalanche(ip, null, protocol, networkID)
```

where

* BASE\_ENDPOINT — your node key-protected endpoint without the `https` prefix and the `ext` postfix. For example, `nd-123-456-789.p2pify.com/3c6e0b8a9c15224a8228b9a98ca1531d`.
* CHAIN\_ID — the chain ID of the network you are connecting to:
  * Mainnet: `1`
  * Fuji testnet: `5`

Make sure you remove `const port` and change `port` to `null` in the default example.

Example to get the AVAX balance of an address through your Avalanche node HTTPS endpoint on the X-Chain mainnet:

```javascript Javascript
import { Avalanche } from "../../dist"
import { AVMAPI } from "../../dist/apis/avm"

const ip: string = "nd-123-456-789.p2pify.com/3c6e0b8a9c15224a8228b9a98ca1531d"
// const port: number = 9650
const protocol: string = "https"
const networkID: number = 1
const avalanche: Avalanche = new Avalanche(ip, null, protocol, networkID)
const xchain: AVMAPI = avalanche.XChain()

const main = async (): Promise<any> => {
  const address: string = "X-avax1k30tskunzxr2tmapy8p4y0ujn2802yr3743679"
  const balance: object = await xchain.getBalance(address, "AVAX")
  console.log(balance)
}

main()
```

# Avalanche: Aave V3 flash loan with Hardhat

Source: https://docs.chainstack.com/docs/avalanche-tutorial-aavev3-flash-loans-with-hardhat

**TLDR**

* You’ll configure a Hardhat project to interact with Aave V3 flash loans on Avalanche’s Fuji testnet.
* You’ll use Chainstack for your Avalanche node endpoint and deploy a custom FlashLoan contract.
* You’ll borrow USDC, then repay it plus fees in a single transaction, demonstrating a flash loan’s instant, collateral-free mechanics.
* By the end, you’ll have a working Aave flash loan flow on a testnet environment ready for deeper custom logic. ## Flash loans A flash loan is a type of loan that can be obtained instantly and without any collateral, unlike traditional loans that require time-consuming application processes and collateral such as property or assets. This type of loan is available through Aave, a decentralized lending platform, where borrowers can borrow any amount they need and repay it within a single transaction. The loan is secured by the borrower's smart contract and is only valid for the duration of that transaction. If the borrower cannot repay the loan and the associated fees within the same transaction, the loan is automatically canceled, and the transaction is reverted. Flash loans are often used in the context of cryptocurrency trading and arbitrage, as they enable traders to obtain funds quickly and cheaply to take advantage of market opportunities. For detailed documentation, see Aave Developers: [Flash Loans](https://docs.aave.com/developers/guides/flash-loans). The objective of this tutorial is to make you familiar with the Avalanche C-Chain, the [Hardhat framework](/docs/avalanche-tooling#hardhat), and the [Aave flash loans](https://docs.aave.com/faq/flash-loans). Specifically, in this tutorial, you will: * Deploy an Avalanche node on the Fuji testnet. * Create a flash loan project using Hardhat. * Run a flash loan on the Fuji testnet through an Avalanche node deployed with Chainstack. ## Prerequisites * [Chainstack account](https://console.chainstack.com/) to deploy an Avalanche node. * [Node.js](https://nodejs.org/en/) as the JavaScript framework. * [Hardhat](https://hardhat.org/hardhat-runner/docs/getting-started#overview) to create, deploy, and interact with contracts. ## Dependencies * Hardhat: ^2.12.7 * @aave/core-v3: ^1.17.2 * dotenv: ^16.0.3 ## Overview This tutorial shows you how to request a flash loan on the Fuji testnet to borrow USDC. 
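The repay-or-revert mechanics described above can be sketched in a few lines. This is a plain JavaScript simulation, not Aave code, and the 0.05% premium used here is an assumption; check the Aave docs for the current flash loan fee.

```javascript
// Toy simulation of the flash loan invariant: the borrower's logic runs,
// and unless principal + premium comes back, the whole transaction reverts.
// The 0.05% premium below is an assumed value, not Aave's authoritative fee.
function simulateFlashLoan(amount, strategy) {
  const premium = amount * 0.0005;      // assumed 0.05% fee
  const proceeds = strategy(amount);    // custom logic runs with the borrowed funds
  if (proceeds < amount + premium) {
    // In a real transaction this revert undoes everything, so the pool never loses funds
    throw new Error("Cannot repay principal + premium: transaction reverts");
  }
  return proceeds - (amount + premium); // profit left to the borrower
}
```

For example, `simulateFlashLoan(1000, (x) => x * 1.01)` repays `1000.5` and returns the remaining profit, while a strategy that merely breaks even throws, mirroring the on-chain revert.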
To get from zero to an executed Aave V3 flash loan on the Avalanche Fuji C-Chain testnet, do the following:

1. With Chainstack, create a public chain project.
2. With Chainstack, join the Avalanche Fuji testnet.
3. With Chainstack, access your Avalanche node endpoint.
4. With Hardhat, create and set up an Aave flash loan project.
5. With Hardhat, execute the flash loan through your Avalanche node.

## Step-by-step

### Create a public chain project

See [Create a project](/docs/manage-your-project#create-a-project).

### Join the Avalanche Fuji testnet

See [Join a public network](/docs/manage-your-networks#join-a-public-network).

### Get your Avalanche node endpoint

See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials).

### Fund your wallet

Before diving into the flash loan project, make sure to top up your wallet with testnet AVAX and USDC tokens. Use the following faucets:

* [Aave faucet](https://app.aave.com/faucet/) for USDC. Make sure you are on the Avalanche Market.
* [Fuji faucet](https://faucet.avax.network/) for AVAX.

### Install Hardhat

See [Installing Hardhat](https://hardhat.org/hardhat-runner/docs/getting-started#installation).

### Create a Hardhat project

Create a new directory for your project, then run the following from a terminal:

```shell Shell
npx hardhat
```

This will launch the Hardhat CLI, which will prompt you to choose a starter project. For this project, answer **yes** to the following:

* **Create a JavaScript project**
* **Do you want to install this sample project's dependencies with npm (hardhat @nomicfoundation/hardhat-toolbox)?**

### Install the required dependencies

This project uses the [aave/core-v3](https://github.com/aave/aave-v3-core) package for the smart contracts and the [dotenv](https://github.com/motdotla/dotenv) package to safely use environment variables.
Run the following command in your root directory to install:

```shell Shell
npm i @aave/core-v3 dotenv
```

### Create a .env file

In your project's root directory, create a new file and name it `.env`. Here is where you will set up the environment variables for your Chainstack Avalanche Fuji endpoint and your wallet's private key.

```Text .env
PRIVATE_KEY="YOUR_WALLET_PRIVATE_KEY"
FUJI_CHAINSTACK="YOUR_CHAINSTACK_ENDPOINT"
```

Save the file after you add your information.

### Edit the Hardhat configuration file

You will find a file named `hardhat.config.js` in the root directory. This file is used to configure various settings for your Hardhat projects, such as the network you want to deploy your contracts on, the compilers you want to use, and the plugins you want to enable.

Delete the default code in the file and replace it with the following:

```javascript hardhat.config.js
require("@nomicfoundation/hardhat-toolbox");
require("dotenv").config();

/** @type import('hardhat/config').HardhatUserConfig */
module.exports = {
  solidity: "0.8.10",
  networks: {
    fuji: {
      url: process.env.FUJI_CHAINSTACK,
      accounts: [process.env.PRIVATE_KEY],
    },
  },
};
```

Let's break down what each part of the file does:

* `require("@nomicfoundation/hardhat-toolbox");` imports the Hardhat Toolbox plugin, which provides several useful tools and utilities for Hardhat projects.
* `require("dotenv").config();` loads environment variables from a `.env` file using the `dotenv` package.
* `module.exports = { ... }` exports a JavaScript object containing the configuration for the Hardhat project.
* `solidity: "0.8.10",` sets the Solidity compiler version to 0.8.10.
* `networks: { ... }` defines the network configurations for the Hardhat project. In this case, it defines a network called `fuji` that connects to the Avalanche Fuji blockchain network.
* `fuji: { ... }` defines the configuration for the `fuji` network.
* `url: process.env.FUJI_CHAINSTACK,` sets the URL for the Fuji network using the `FUJI_CHAINSTACK` environment variable. * `accounts: [process.env.PRIVATE_KEY],` sets the accounts for the `fuji` network using the `PRIVATE_KEY` environment variable. This will allow the Hardhat project to deploy contracts and interact with the Fuji network using the specified private key. ### Create the flash loan smart contract In the root directory, you will find a directory named `contracts` with a sample contract in it. Rename this contract to `FlashLoan.sol` and replace its code with the following: ```sol FlashLoan.sol // SPDX-License-Identifier: MIT pragma solidity ^0.8.10; import {FlashLoanSimpleReceiverBase} from "@aave/core-v3/contracts/flashloan/base/FlashLoanSimpleReceiverBase.sol"; import {IPoolAddressesProvider} from "@aave/core-v3/contracts/interfaces/IPoolAddressesProvider.sol"; import {IERC20} from "@aave/core-v3/contracts/dependencies/openzeppelin/contracts/IERC20.sol"; /** * @title FlashLoan * @dev A contract that demonstrates how to use Aave's flash loans. This contract can borrow any token from the Aave lending pool, perform custom logic with the borrowed funds, and repay the loan plus interest in a single transaction. */ contract FlashLoan is FlashLoanSimpleReceiverBase { address payable public owner; // The owner of this contract, who can withdraw funds. /** * @dev Constructor function that sets the address provider for the Aave lending pool and the contract owner. * @param _addressProvider The address provider for the Aave lending pool. */ constructor(address _addressProvider) FlashLoanSimpleReceiverBase(IPoolAddressesProvider(_addressProvider)) { owner = payable(msg.sender); // Set the contract owner to the creator of this contract. } /** * @dev This function is called after this contract receives a flash loan. It executes custom logic with the borrowed funds and repays the loan plus interest to the Aave lending pool. * @param asset The token being borrowed. 
* @param amount The amount of the token being borrowed. * @param premium The fee paid to the Aave lending pool. * @param initiator The address that initiated the flash loan. * @param params Additional parameters for the flash loan. * @return true to indicate that the flash loan has been repaid. */ function executeOperation( address asset, uint256 amount, uint256 premium, address initiator, bytes calldata params ) external override returns (bool) { // This function is called by the Aave lending pool contract after this contract receives the flash loan. // The asset parameter represents the token being borrowed, amount is the amount borrowed, and premium is the fee paid to the pool. // 👇 Your custom logic for the flash loan should be implemented here 👇 /** YOUR CUSTOM LOGIC HERE */ // 👆 Your custom logic for the flash loan should be implemented above here 👆 // Approve the lending pool contract to pull funds from this contract to pay back the flash loan. uint256 amountOwed = amount + premium; IERC20(asset).approve(address(POOL), amountOwed); return true; // Return true to indicate that the flash loan has been repaid. } /** * @dev Function to request a flash loan for a specified token and amount. * receiverAddress The address of this contract, which will receive the flash loan. * @param _token The token to be borrowed. * @param _amount The amount of the token to be borrowed. * params No additional parameters are needed. * referralCode No referral code is used. */ function requestFlashLoan(address _token, uint256 _amount) public onlyOwner { address receiverAddress = address(this); address asset = _token; uint256 amount = _amount; bytes memory params = ""; uint16 referralCode = 0; // Call the Aave lending pool contract to initiate the flash loan. POOL.flashLoanSimple( receiverAddress, asset, amount, params, referralCode ); } /** * @dev Get the balance of a specific token in this contract. * @param _tokenAddress The address of the token to check the balance of. 
* @return The balance of the specified token in this contract. */ function getBalance(address _tokenAddress) external view returns (uint256) { return IERC20(_tokenAddress).balanceOf(address(this)); } /** * @dev Withdraw a specific token from this contract to the contract owner's address. * @param _tokenAddress The address of the token to withdraw. */ function withdraw(address _tokenAddress) external onlyOwner { IERC20 token = IERC20(_tokenAddress); // Create an instance of the token contract. token.transfer(msg.sender, token.balanceOf(address(this))); // Transfer the token balance to the contract owner. } /** * @dev Modifier to ensure that only the contract owner can call a specific function. */ modifier onlyOwner() { require( msg.sender == owner, "You are not the owner!" ); _; } /** * @dev Fallback function to receive ETH payments. */ receive() external payable {} } ``` ### Default flash loan logic This smart contract receives the flash loan but performs no further actions on it. You will need to add your own logic. This smart contract is heavily commented on to explain its inner workings, but you can find more details on the [Aave docs](https://docs.aave.com/developers/guides/flash-loans). ### Create the deploying and interacting script In the `scripts` directory inside the root of your project, you will find a file named `deploy.js`. 
Replace its content with the following:

```javascript deploy.js
const { ethers } = require("hardhat");

// Contract addresses and other values
const AVA_FUJI_POOL_PROVIDER = "0x220c6A7D868FC38ECB47d5E69b99e9906300286A";
const USDC_ADDRESS = "0x6a17716Ce178e84835cfA73AbdB71cb455032456";
const USDC_DECIMALS = 6;
const FLASHLOAN_AMOUNT = ethers.utils.parseUnits("1000", USDC_DECIMALS);

// USDC transfer function ABI
const USDC_ABI = ["function transfer(address to, uint256 value) external returns (bool)"];

async function main() {
  try {
    console.log("Deploying FlashLoan contract...");
    const FlashLoan = await ethers.getContractFactory("FlashLoan");
    const flashLoan = await FlashLoan.deploy(AVA_FUJI_POOL_PROVIDER);
    await flashLoan.deployed();
    console.log(`FlashLoan contract deployed at: ${flashLoan.address}`);
    console.log(`View contract at: https://testnet.snowtrace.io/address/${flashLoan.address}`);
    console.log("---------------------------------------------------------------\n");

    // Transfer USDC to the FlashLoan contract
    const erc20 = new ethers.Contract(USDC_ADDRESS, USDC_ABI, ethers.provider.getSigner());
    const amount = ethers.utils.parseUnits("5", USDC_DECIMALS);
    console.log(`Transferring ${amount / 1e6} USDC to the FlashLoan contract...`);
    const transferErc20 = await erc20.transfer(flashLoan.address, amount);
    console.log(`Transferred ${amount / 1e6} USDC tokens to the FlashLoan contract`);
    console.log("Waiting for 1 block to verify the transfer...");
    await transferErc20.wait(1); // Wait 1 block for the transaction to be verified to update the balance
    console.log(`---------------------------------------------------------------\n`);

    // Check USDC balance of the FlashLoan contract
    const usdcBalance = await flashLoan.getBalance(USDC_ADDRESS);
    console.log(`USDC balance of the FlashLoan contract is: ${usdcBalance / 1e6} USDC`);
    console.log("---------------------------------------------------------------\n");

    // Call flash loan
    console.log(`Requesting a flash loan of ${FLASHLOAN_AMOUNT / 1e6} USDC...`);
    const flashloanTx = await flashLoan.requestFlashLoan(USDC_ADDRESS, FLASHLOAN_AMOUNT);
    console.log("Flash loan executed!");
    console.log(`View transaction at: https://testnet.snowtrace.io/tx/${flashloanTx.hash}`);
    await flashloanTx.wait(1); // Wait 1 block for the transaction to be verified to update the balance
    console.log("---------------------------------------------------------------\n");

    // Withdraw remaining USDC
    const remainingUSDC = await flashLoan.getBalance(USDC_ADDRESS);
    console.log(`Withdrawing ${remainingUSDC / 1e6} USDC from the FlashLoan contract...`);
    const withdrawFunds = await flashLoan.withdraw(USDC_ADDRESS);
    await withdrawFunds.wait(1); // Wait 1 block for the transaction to be verified
    console.log(`Funds sent!`);
    console.log(`View transaction at: https://testnet.snowtrace.io/tx/${withdrawFunds.hash}`);
  } catch (error) {
    console.error(error);
    process.exitCode = 1;
  }
}

main();
```

This script deploys a `FlashLoan` smart contract and uses it to request a flash loan of 1,000 USDC tokens.

The script first sets some constants, including the addresses of the Aave pool provider and the USDC token on the Fuji testnet. Verify that the addresses are up to date on the [Aave docs](https://docs.aave.com/developers/deployed-contracts/v3-testnet-addresses#contract-name-changes-from-v2-greater-than-v3) by finding the addresses for `PoolAddressesProvider-Avalanche` and `USDC-TestnetMintableERC20-Avalanche`. The amount of USDC to be borrowed is also declared here.

It then deploys the `FlashLoan` contract and transfers 5 USDC tokens to the contract from the deployer's account. To request flash loans, the smart contract must hold some of the tokens that you are planning to borrow; these tokens are used to repay the fee. In V3, the fee is a fixed percentage, and you can find the updated fee value on the [Aave docs](https://docs.aave.com/developers/guides/flash-loans#flash-loan-fee).
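To make the fee concrete: assuming the commonly cited V3 premium of 0.05% (verify the current value against the Aave docs), borrowing 1,000 USDC costs a 0.5 USDC premium, which is why the 5 USDC sent to the contract comfortably covers it. A quick sketch of the arithmetic in USDC base units:

```javascript
// Flash loan premium arithmetic (0.05% is an assumed fee value; check the Aave docs)
const PREMIUM_BPS = 5;                       // 0.05% expressed in basis points
function flashLoanPremium(amountUnits) {
  return (amountUnits * PREMIUM_BPS) / 10000;
}

const borrowed = 1000_000000;                // 1,000 USDC in 6-decimal base units
const premium = flashLoanPremium(borrowed);  // 500000 base units = 0.5 USDC
const amountOwed = borrowed + premium;       // what executeOperation approves back to the pool
```

Under that assumed fee, 0.5 of the 5 pre-funded USDC pays the premium and about 4.5 USDC remains in the contract to withdraw.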
Next, the script checks the USDC balance of the `FlashLoan` contract. This is only for displaying it to the user, but you can easily implement some logic to stop the process if the funds to repay the borrowing fee are too low.

It then requests a flash loan of 1,000 USDC tokens. Once the loan is executed, the remaining USDC tokens in the contract are withdrawn. The [Aave documentation](https://docs.aave.com/developers/guides/flash-loans#2.-calling-flashloan-or-flashloansimple) recommends not leaving any funds in the smart contract to avoid possible misuse by an attacker.

The script uses the Hardhat development framework and the `ethers.js` library to interact with the blockchain network and the `FlashLoan` contract. It also prints out messages to the console at various points in the script's execution to provide information about the progress of the flash loan operation.

### Run the flash loan

To run the flash loan on the Fuji network, execute the following command in the console from your root directory:

```shell Shell
npx hardhat run --network fuji scripts/deploy.js
```

This command will compile the smart contracts, deploy the `FlashLoan` contract, and execute the operation. The result in the console will look like the following:

```shell Shell
Deploying FlashLoan contract...
FlashLoan contract deployed at: 0x77609a96E67455EcbBb3d8AD38567511dc587C54
View contract at: https://testnet.snowtrace.io/address/0x77609a96E67455EcbBb3d8AD38567511dc587C54
---------------------------------------------------------------

Transferring 5 USDC to the FlashLoan contract...
Transferred 5 USDC tokens to the FlashLoan contract
Waiting for 1 block to verify the transfer...
---------------------------------------------------------------

USDC balance of the FlashLoan contract is: 5 USDC
---------------------------------------------------------------

Requesting a flash loan of 1000 USDC...
Flash loan executed!
View transaction at: https://testnet.snowtrace.io/tx/0xdcc41e8eec65f798aae643b99036f595137090fa777a729ce15c1a0397b247fa
---------------------------------------------------------------

Withdrawing 4.5 USDC from the FlashLoan contract...
Funds sent!
View transaction at: https://testnet.snowtrace.io/tx/0x759fbd6c513e59f48001566b86d1358c329519f6e6096ffb1748ab3ff0b6f97c
```

### Possible compiler warnings

Note that you might receive two `Unused function parameter` warnings from the Solidity compiler. You can ignore them as they do not stop the compiler or the execution of the flash loan. This happens because the function is overridden and the parameters are needed to keep the same function signature.

You can see what a completed deployment and flash loan look like on the Fuji explorer by checking the following transactions:

## Conclusion

This tutorial guided you through setting up Hardhat to work with Chainstack nodes and creating a project to run your own flash loan transaction on the Avalanche network. This tutorial uses a testnet; however, the exact same instructions and sequence will work on the mainnet as well.

### About the author

Developer Advocate @ Chainstack

Talk to me all things Web3

20 years in technology | 8+ years in Web3 full time

Trusted advisor helping developers navigate the complexities of blockchain infrastructure

[](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake)

# Base methods

Source: https://docs.chainstack.com/docs/base-methods

See also [interactive Base API call examples](/reference/base-api-reference).
| Method | Availability | Comment |
| ---------------------------------------- | --- | --------------------- |
| eth\_accounts | | Deprecated |
| eth\_blockNumber | | |
| eth\_call | | |
| eth\_callMany | | |
| eth\_chainId | | |
| eth\_estimateGas | | |
| eth\_feeHistory | | |
| eth\_gasPrice | | |
| eth\_simulateV1 | | |
| eth\_getAccount | | |
| eth\_getBalance | | |
| eth\_getBlockByHash | | |
| eth\_getBlockByNumber | | |
| eth\_getBlockReceipts | | |
| eth\_getBlockTransactionCountByHash | | |
| eth\_getBlockTransactionCountByNumber | | |
| eth\_getCode | | |
| eth\_getFilterChanges | | |
| eth\_getFilterLogs | | |
| eth\_getLogs | | |
| eth\_getProof | | |
| eth\_getStorageAt | | |
| eth\_getTransactionByBlockHashAndIndex | | |
| eth\_getTransactionByBlockNumberAndIndex | | |
| eth\_getTransactionByHash | | |
| eth\_getTransactionCount | | |
| eth\_getTransactionReceipt | | |
| eth\_getUncleCountByBlockHash | | |
| eth\_getUncleCountByBlockNumber | | |
| eth\_maxPriorityFeePerGas | | |
| eth\_newBlockFilter | | |
| eth\_newFilter | | |
| eth\_newPendingTransactionFilter | | |
| eth\_signTransaction | | |
| eth\_subscribe | | |
| eth\_syncing | | |
| eth\_uninstallFilter | | |
| eth\_unsubscribe | | |
| eth\_sendRawTransaction | | |
| net\_listening | | |
| net\_peerCount | | |
| net\_version | | |
| web3\_clientVersion | | |
| web3\_sha3 | | |
| erigon\_blockNumber | | |
| erigon\_forks | | |
| erigon\_getBlockByTimestamp | | |
| erigon\_getBlockReceiptsByBlockHash | | |
| erigon\_getHeaderByHash | | |
| erigon\_getHeaderByNumber | | |
| erigon\_getLatestLogs | | |
| erigon\_getLogsByHash | | |
| debug\_getBadBlocks | | |
| debug\_storageRangeAt | | Not available on Reth |
| debug\_getTrieFlushInterval | | |
| debug\_traceBlock | | |
| debug\_traceBlockByHash | | |
| debug\_traceBlockByNumber | | |
| debug\_traceCall | | |
| debug\_traceTransaction | | |
| trace\_block | | |
| trace\_call | | |
| trace\_callMany | | |
| trace\_filter | | |
| trace\_rawTransaction | | |
| trace\_replayBlockTransactions | | |
| trace\_replayTransaction | | |
| trace\_transaction | | |
| optimism\_outputAtBlock | | |
| optimism\_syncStatus | | |
| optimism\_rollupConfig | | |
| optimism\_version | | |
| admin\_addPeer | | |
| admin\_addTrustedPeer | | |
| admin\_datadir | | |
| admin\_exportChain | | |
| admin\_importChain | | |
| admin\_nodeInfo | | |
| admin\_peerEvents | | |
| admin\_peers | | |
| admin\_removePeer | | |
| admin\_removeTrustedPeer | | |
| admin\_startHTTP | | |
| admin\_startWS | | |
| admin\_stopHTTP | | |
| admin\_stopWS | | |

# Base tooling

Source: https://docs.chainstack.com/docs/base-tooling

## Geth

Interact with your Base node using [Geth](https://geth.ethereum.org/docs/getting-started).

Install [Geth](https://github.com/ethereum/go-ethereum). Use the `geth attach` command with the node endpoint.

```shell Shell
geth attach YOUR_CHAINSTACK_ENDPOINT
```

where YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials).

Invoke any methods from [Web3 JavaScript API](https://web3js.readthedocs.io/). Example below demonstrates how to get the balance of an address in wei value and convert it to ether value:

```js JavaScript
> web3.fromWei(web3.eth.getBalance("0xde0b295669a9fd93d5f28d9ec85e40f4cb697bae"))
642538.078574759898951277
```

## GraphQL

You can use GraphQL on a dedicated node on the [paid plans](https://chainstack.com/pricing/).

### UI

You can query data using the graphical interface:

1. On Chainstack, navigate to your dedicated Base node. See [node access details](/docs/manage-your-node#view-node-access-and-credentials).
2. Hover over **GraphQL IDE URL** and click **Open**.
3. In the graphical interface that opens, run a GraphQL query.
Example to get the latest block number:

```graphql GraphQL
{
  block {
    number
  }
}
```

### node.js

You can build a web app to query data using node.js and [axios](https://www.npmjs.com/package/axios):

```javascript Javascript
const axios = require("axios");

const main = async () => {
  try {
    const result = await axios.post("YOUR_CHAINSTACK_ENDPOINT", {
      query: `
        {
          block {
            number
          }
        }
      `,
    });
    console.log(result.data);
  } catch (error) {
    console.error(error);
  }
};

main();
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node GraphQL endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials).
* `query` — your GraphQL query. In this case, to get the latest block number.

See also [Using GraphQL with EVM-compatible nodes](https://support.chainstack.com/hc/en-us/articles/4409604331161-Using-GraphQL-with-EVM-compatible-nodes).

## MetaMask

On [node access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**.

## Truffle

Configure [Truffle Suite](https://truffleframework.com) to deploy contracts to your Base nodes.

Install [Truffle Suite](https://truffleframework.com), [HD Wallet-enabled Web3 provider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider), and create a project.

Create a new environment in `truffle-config.js`, add your mnemonic phrase generated by [a wallet](https://ethereum.org/en/wallets/find-wallet/) and the Base endpoint instead of YOUR\_CHAINSTACK\_ENDPOINT:

```javascript Javascript
const HDWalletProvider = require("@truffle/hdwallet-provider");
const mnemonic = 'pattern enroll upgrade ...';
...
module.exports = {
  networks: {
    chainstack: {
      provider: () => new HDWalletProvider(mnemonic, "YOUR_CHAINSTACK_ENDPOINT"),
      network_id: "*"
    },
  },
};
```

## Hardhat

Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your Base nodes.

Install [Hardhat](https://hardhat.org/) and create a project.
Create a new environment in `hardhat.config.js`: ```javascript Javascript require("@nomiclabs/hardhat-waffle"); ... module.exports = { solidity: "0.7.3", networks: { chainstack: { url: "YOUR_CHAINSTACK_ENDPOINT", accounts: ["YOUR_PRIVATE_KEY"] }, } }; ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). * YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract Run `npx hardhat run scripts/deploy.js --network chainstack` and Hardhat will deploy using Chainstack. See also [Forking EVM-compatible mainnet with Hardhat](https://support.chainstack.com/hc/en-us/articles/900004242406). ## Remix IDE To make Remix IDE interact with the network through a Chainstack node: Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Interacting through MetaMask](#metamask). In Remix IDE, navigate to the **Deploy** tab. Select **Injected Provider - MetaMask** in **Environment**. This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node. For a detailed tutorial with Remix IDE, see [Trust fund account with Remix](/docs/ethereum-tutorial-trust-fund-account-with-remix). ## web3.js Build DApps using [web3.js](https://github.com/ethereum/web3.js/) and Base nodes deployed with Chainstack. Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP or WebSocket. 
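Which web3.js provider class the two sections below use follows directly from the endpoint scheme. A small illustrative helper (not part of web3.js; the function name is hypothetical):

```javascript
// Map an endpoint URL scheme to the web3.js v1 provider class name (illustrative only)
function providerTypeFor(endpoint) {
  if (endpoint.startsWith("wss://")) return "WebsocketProvider";
  if (endpoint.startsWith("https://") || endpoint.startsWith("http://")) return "HttpProvider";
  throw new Error(`Unsupported endpoint scheme: ${endpoint}`);
}
```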
### HTTP

Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number:

```javascript Javascript
const Web3 = require("web3");

const web3 = new Web3(
  new Web3.providers.HttpProvider("YOUR_CHAINSTACK_ENDPOINT")
);

web3.eth.getBlockNumber().then(console.log);
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password.

### WebSocket

Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number:

```javascript Javascript
const Web3 = require("web3");

const web3 = new Web3(
  new Web3.providers.WebsocketProvider("YOUR_CHAINSTACK_ENDPOINT")
);

web3.eth.getBlockNumber().then(console.log);
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node WSS endpoint protected either with the key or password.

## web3.py

Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Base nodes deployed with Chainstack.

Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket).

### HTTP

Use the `HTTPProvider` to connect to your node endpoint and get the latest block number:

```python Key Protected
from web3 import Web3

web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT'))
print(web3.eth.blockNumber)
```

```python Password Protected
from web3 import Web3

web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s' % ("USERNAME", "PASSWORD", "HOSTNAME")))
print(web3.eth.blockNumber)
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password
* HOSTNAME — your node HTTPS endpoint hostname
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).
### WebSocket

Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number:

```python Key Protected
from web3 import Web3

web3 = Web3(Web3.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT'))
print(web3.eth.blockNumber)
```

```python Password Protected
from web3 import Web3

web3 = Web3(Web3.WebsocketProvider('wss://%s:%s@%s' % ("USERNAME", "PASSWORD", "HOSTNAME")))
print(web3.eth.blockNumber)
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password
* HOSTNAME — your node WSS endpoint hostname
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).

See also [WebSocket connection to an EVM node](https://support.chainstack.com/hc/en-us/articles/900001918763-WebSocket-connection-to-an-Ethereum-node).

## web3.php

Build DApps using [web3.php](https://github.com/web3p/web3.php) and Base nodes deployed with Chainstack.

Install [web3.php](https://github.com/web3p/web3.php). Connect over HTTP:

```php Php
<?php
require_once "vendor/autoload.php";

use Web3\Web3;

$web3 = new Web3("YOUR_CHAINSTACK_ENDPOINT");
?>
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password.

Use [JSON-RPC methods](https://eth.wiki/json-rpc/API) to interact with the node.

Example to get the latest block number:

```php Key Protected
<?php
require_once "vendor/autoload.php";

use Web3\Web3;

$web3 = new Web3("YOUR_CHAINSTACK_ENDPOINT");
$eth = $web3->eth;

$eth->blockNumber(function ($err, $data) {
    print "$data \n";
});
?>
```

## web3j

Build DApps using [web3j](https://github.com/web3j/web3j) and Base nodes deployed with Chainstack.

Use the `HttpService` object to connect to your node endpoint.
Example to get the latest block number:

```java Java
package getLatestBlock;

import java.io.IOException;
import java.util.logging.Level;
import java.util.logging.Logger;

import org.web3j.protocol.Web3j;
import org.web3j.protocol.core.DefaultBlockParameterName;
import org.web3j.protocol.core.methods.response.EthBlock;
import org.web3j.protocol.exceptions.ClientConnectionException;
import org.web3j.protocol.http.HttpService;

import okhttp3.Authenticator;
import okhttp3.Credentials;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;
import okhttp3.Route;

public final class App {

  private static final String USERNAME = "USERNAME";
  private static final String PASSWORD = "PASSWORD";
  private static final String ENDPOINT = "ENDPOINT";

  public static void main(String[] args) {
    try {
      OkHttpClient.Builder clientBuilder = new OkHttpClient.Builder();
      clientBuilder.authenticator(new Authenticator() {
        @Override
        public Request authenticate(Route route, Response response) throws IOException {
          String credential = Credentials.basic(USERNAME, PASSWORD);
          return response.request().newBuilder().header("Authorization", credential).build();
        }
      });
      HttpService service = new HttpService(ENDPOINT, clientBuilder.build(), false);
      Web3j web3 = Web3j.build(service);
      EthBlock.Block latestBlock = web3.ethGetBlockByNumber(DefaultBlockParameterName.LATEST, false).send().getBlock();
      System.out.println("Latest Block: #" + latestBlock.getNumber());
    } catch (IOException | ClientConnectionException ex) {
      Logger.getLogger(App.class.getName()).log(Level.SEVERE, null, ex);
    }
  }
}
```

where

* ENDPOINT — your node HTTPS endpoint protected either with the key or password
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)

See also [the full code on GitHub](https://github.com/chainstack/web3j-getLatestBlock).
## ethers.js

Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Base nodes deployed with Chainstack.

Install [ethers.js](https://www.npmjs.com/package/ethers). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket).

### HTTP

Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number:

```javascript Key Protected
const { ethers } = require("ethers");

var urlInfo = {
  url: "YOUR_CHAINSTACK_ENDPOINT",
};
var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID);

provider.getBlockNumber().then(console.log);
```

```javascript Password Protected
const { ethers } = require("ethers");

var urlInfo = {
  url: "YOUR_CHAINSTACK_ENDPOINT",
  user: "USERNAME",
  password: "PASSWORD",
};
var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID);

provider.getBlockNumber().then(console.log);
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)
* NETWORK\_ID — Base network ID:
  * Mainnet: `8453`
  * Goerli Testnet: `84531`

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).
### WebSocket

Use the `WebSocketProvider` object to connect to your node WSS endpoint and get the latest block number:

```javascript Javascript
const { ethers } = require("ethers");

const provider = new ethers.providers.WebSocketProvider(
  "YOUR_CHAINSTACK_ENDPOINT",
  NETWORK_ID
);

provider.getBlockNumber().then(console.log);
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password
* NETWORK\_ID — Base network ID:
  * Mainnet: `8453`
  * Goerli Testnet: `84531`

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).

## Brownie

Install [Brownie](https://eth-brownie.readthedocs.io/en/stable/install.html).

Use the `brownie networks add` command with the node endpoint. For example, Base mainnet:

```shell Shell
brownie networks add Base ID name="NETWORK_NAME" host=YOUR_CHAINSTACK_ENDPOINT chainid=NETWORK_ID
```

where

* ID — any name that you will use as the network tag to run a deployment. For example, `chainstack-mainnet`.
* NETWORK\_NAME — any name that you want to identify the network by in the list of networks. For example, **Mainnet (Chainstack)**.
* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password
* NETWORK\_ID — Base network ID:
  * Mainnet: `8453`
  * Goerli Testnet: `84531`

Example to run the deployment script:

```shell Shell
brownie run deploy.py --network chainstack-mainnet
```

## Foundry

Install [Foundry](https://github.com/gakonst/foundry/).

Use `--rpc-url` to run the operation through your Chainstack node.

### Forge

Use [forge](https://github.com/gakonst/foundry/tree/master/forge) to develop, test, and deploy your smart contracts.
To deploy a contract:

```bash Shell
forge create CONTRACT_NAME --contracts CONTRACT_PATH --private-key YOUR_PRIVATE_KEY --rpc-url YOUR_CHAINSTACK_ENDPOINT
```

where

* CONTRACT\_NAME — name of the contract in the Solidity source code
* CONTRACT\_PATH — path to your smart contract
* YOUR\_PRIVATE\_KEY — the private key to your funded account that you will use to deploy the contract
* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password

### Cast

Use [cast](https://github.com/gakonst/foundry/tree/master/cast) to interact with the network and the deployed contracts.

To get the latest block number:

```bash Shell
cast block-number --rpc-url YOUR_CHAINSTACK_ENDPOINT
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password.

# Base: Deploy an ERC-721 contract with Hardhat

Source: https://docs.chainstack.com/docs/base-tutorial-deploy-an-erc-721-contract-with-hardhat

* You’ll bridge Sepolia ETH to the Base Sepolia Testnet, taking advantage of Base’s L2 benefits.
* You’ll configure a Hardhat project and deploy an ERC-721 NFT contract to Base Testnet.
* You’ll set environment variables for your Chainstack endpoint and private key.
* You’ll verify and explore your contract on the Base Sepolia block explorer.

## Main article

Base is an Ethereum layer 2 (L2) solution built atop the Ethereum blockchain and incubated within Coinbase. As an L2 chain, Base provides enhanced security, stability, and scalability, instrumental for the efficient operation of decentralized applications. It supports the deployment of any Ethereum Virtual Machine (EVM) codebase, allowing for an efficient transition of users and assets from Ethereum layer 1 (L1), Coinbase, and other interoperable blockchains.

Using Base's L2 solution substantially reduces the cost associated with operating within the EVM environment.
Base provides early access to advanced Ethereum features such as account abstraction (ERC-4337), developer APIs for gasless transactions, and smart contract wallets, broadening the scope of possibilities for developers.

Base's infrastructure is built on the open-source, MIT-licensed [OP Stack](https://stack.optimism.io/) in a collaborative effort with Optimism. This collaboration has positioned Base as the second Core Dev team working on the OP Stack, reinforcing its commitment to maintaining the OP Stack as a freely accessible public resource.

As a product scaled by Coinbase, Base facilitates the integration of decentralized applications with Coinbase's extensive suite of products and distribution channels. This integration includes seamless interfacing with Coinbase, efficient fiat onramps, and access to a vast user base within the Coinbase ecosystem.

## Prerequisites

* [Chainstack account](https://console.chainstack.com/) to deploy a Base Testnet node
* [node.js](https://nodejs.org/en) as the JavaScript runtime

## Dependencies

* Hardhat: ^2.17.0
* @nomicfoundation/hardhat-toolbox: ^3.0.0
* dotenv: ^16.0.3

## Overview

In this tutorial, we will go over how to bridge funds between the Ethereum Sepolia and Base Sepolia testnets and deploy a smart contract to the Base Sepolia Testnet using Hardhat. Here is a brief overview of the tutorial:

1. With Chainstack, create a public chain project.
2. With Chainstack, join the Base Testnet.
3. With Chainstack, access your node's credentials.
4. Bridge funds between the Sepolia Testnet and the Base Testnet.
5. Create a Hardhat project using node.js.
6. Install the required dependencies.
7. Create a `.env` file to store the secrets.
8. Edit the Hardhat config file.
9. Write the smart contract.
10. Write and run the deployment script.
11. Deploy smart contracts to the Base Testnet.

## Step-by-step

### Create a public chain project

See [Create a project](/docs/manage-your-project#create-a-project).
### Join the Ethereum Sepolia Testnet and the Base Testnet

See [Join a public network](/docs/manage-your-networks#join-a-public-network).

### Get your Base Testnet node endpoint

See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials).

### Fund your wallet

Before diving into the project, make sure to top up your wallet with Sepolia ether. You can use the [Chainstack Sepolia faucet](https://faucet.chainstack.com).

### Bridging Sepolia ETH to Base Testnet

Using the [Base bridge](https://bridge.base.org/), we can easily move assets between Ethereum (L1) and Base (L2). To bridge assets between L1 and L2, the user locks up any amount of those assets in the original network using the Base bridge. An equivalent amount of wrapped tokens is then minted on the other chain.

### Create a Hardhat project

Create a new directory for your project, then run the following from a terminal:

```shell Shell
npm init --y
```

This will create a new Node project. Check out [Web3 node.js: From zero to a full-fledged project](/docs/web3-nodejs-from-zero-to-a-full-fledged-project) to learn more about creating a Node project.

Then run the following to install Hardhat:

```shell Shell
npm install --save-dev hardhat
```

When Hardhat is installed, initialize a Hardhat project:

```shell Shell
npx hardhat
```

This will launch the Hardhat CLI, which will prompt you to configure a starter project. For this tutorial, create a JavaScript project and click **Yes** on all the prompts Hardhat offers.

### Set up environment variables

This project uses the [dotenv](https://github.com/motdotla/dotenv) package to use environment variables safely. Run the following command in your root directory to install the dotenv package:

```shell Shell
npm install dotenv
```

In your project's root directory, create a new file and name it `.env`.
Here, you will set up the environment variables for your Chainstack endpoint and your wallet's private key.

```sh .env
CHAINSTACK_ENDPOINT="YOUR_CHAINSTACK_ENDPOINT"
PRIVATE_KEY="YOUR_WALLET_PRIVATE_KEY"
```

Save the file after you add your information.

### Edit the Hardhat configuration file

You will find a file named `hardhat.config.js` in the root directory. This file is used to configure various settings for your Hardhat projects, such as the network you want to deploy your contracts on, the compilers you want to use, and the plugins you want to enable.

Delete the default code in the file and replace it with the following:

```js hardhat.config.js
require("@nomicfoundation/hardhat-toolbox");
require("dotenv").config();

module.exports = {
  solidity: "0.8.18",
  defaultNetwork: "base_testnet",
  networks: {
    base_testnet: {
      url: `${process.env.CHAINSTACK_ENDPOINT}`,
      accounts: [process.env.PRIVATE_KEY]
    },
  },
};
```

Let's break down what each part of the file does:

* `require("@nomicfoundation/hardhat-toolbox");` imports the Hardhat Toolbox plugin, which provides several useful tools and utilities for Hardhat projects.
* `require("dotenv").config();` loads environment variables from a `.env` file using the `dotenv` package.
* `module.exports = { ... }` exports a JavaScript object containing the configuration for the Hardhat project.
* `solidity: "0.8.18",` sets the Solidity compiler version to 0.8.18.
* `defaultNetwork: "base_testnet",` sets the default network that Hardhat will use.
* `networks: { ... }` defines the network configurations for the Hardhat project.
* `base_testnet: { ... }` defines the configuration for the `base_testnet` network.
* `url: ${process.env.CHAINSTACK_ENDPOINT},` sets the RPC URL for the Base network.
* `accounts: [process.env.PRIVATE_KEY],` sets the accounts for the `base_testnet` network using the `PRIVATE_KEY` environment variable.
This will allow the Hardhat project to deploy contracts and interact with the Base Testnet using the specified private key.

### Develop an ERC-721 smart contract with OpenZeppelin

Now it's time to make our NFT smart contract. You can use the example from here or the [OpenZeppelin Wizard](https://wizard.openzeppelin.com/#erc721) to create a new one.

In the root directory, you will find a directory named `contracts`. Create a new file named `CBS.sol`, and paste the following code inside it:

```sol CBS.sol
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.18;

import "@openzeppelin/contracts/token/ERC721/ERC721.sol";
import "@openzeppelin/contracts/token/ERC721/extensions/ERC721Burnable.sol";
import "@openzeppelin/contracts/access/Ownable.sol";
import "@openzeppelin/contracts/utils/Counters.sol";

contract ChainBase is ERC721, ERC721Burnable, Ownable {
    using Counters for Counters.Counter;

    Counters.Counter private _tokenIdCounter;

    constructor() ERC721("ChainBase", "CBS") {}

    function safeMint(address to) public onlyOwner {
        uint256 tokenId = _tokenIdCounter.current();
        _tokenIdCounter.increment();
        _safeMint(to, tokenId);
    }
}
```

This smart contract named `ChainBase` is an implementation of an ERC-721 token. ERC-721 is a standard for non-fungible tokens on the Ethereum blockchain, meaning each token has a unique value and is not interchangeable with any other token. ERC-721 tokens are often used for digital collectibles or assets.

Then install the OpenZeppelin package in your project:

```shell Shell
npm install @openzeppelin/contracts
```

### Create and run the deployment script

In the `scripts` directory inside the root of your project, you will find a file named `deploy.js`.
Replace its content with the following:

```js deploy.js
const { ethers } = require("hardhat");

async function main() {
  console.log("Deploying your contract, please Wait...");
  const NFT = await ethers.deployContract("ChainBase");
  await NFT.waitForDeployment();
  console.log("NFT Contract deployed to:", NFT.target);
}

main()
  .then(() => process.exit(0))
  .catch((error) => {
    console.error(error);
    process.exit(1);
  });
```

This is a simple deploy script that deploys the `ChainBase` smart contract to the Base Sepolia Testnet and returns the address of the newly deployed contract in the terminal. You can search for your contract on the [Base Sepolia explorer](https://base-sepolia.blockscout.com/).

To run this script, execute the following command in the terminal:

```shell Shell
npx hardhat run --network base_testnet scripts/deploy.js
```

You will get a similar response in the console:

```shell Shell
$ npx hardhat run --network base_testnet scripts/deploy.js
Deploying your contract, please Wait...
NFT Contract deployed to: 0x2CfAf4441995344451F10054eE25d4df286768F7
```

## Conclusion

This tutorial guided you through bridging funds between the Ethereum Sepolia Testnet and the Base Sepolia Testnet. We also deployed a smart contract to the Base Sepolia Testnet using Hardhat.

### About the author

Developer Advocate @ Chainstack

BUIDLs on EVM, The Graph protocol, and Starknet

Helping people understand Web3 and blockchain development

[](https://github.com/soos3d) [](https://twitter.com/web3Dav3) [](https://www.linkedin.com/in/davide-zambiasi/)

# Best practices for error handling in API requests

Source: https://docs.chainstack.com/docs/best-practices-for-error-handling-in-api-requests

**TLDR**

* Explains how Chainstack’s global node feature can boost your DApp’s reliability by balancing traffic automatically based on user location.
* Demonstrates a JavaScript load balancer script using multiple Chainstack endpoints, distributing requests across different regions to avoid single-point failures.
* Shows examples with both web3.js and ethers.js, detailing how to fail over to the next endpoint if one fails.
* Concludes that both global nodes and custom load-balancing approaches help ensure your blockchain app can handle high traffic and unexpected downtimes.

## Main article

In the world of API requests, error handling is not just a best practice—it's a necessity. Effectively handling HTTP status codes is crucial for ensuring smooth and reliable communication between clients and servers. Whether you're a seasoned developer or just starting out, understanding how to automate the retrieval of response codes from any request can help you build more robust applications, implement effective retry logic, and create comprehensive error backlogs.

This guide will walk you through the best practices for error handling in API requests, with a focus on handling HTTP status codes and implementing retry logic.

### Importance of handling HTTP status codes

HTTP status codes are the server's way of telling the client about the status of the operation it requested. They play a vital role in API requests as they can indicate success, failure, or need for further action.

By properly handling these status codes, you can ensure your application responds appropriately to each possible outcome of an API request. This can significantly enhance the user experience and the overall performance of your application.

### Overview of HTTP status codes

HTTP status codes are grouped into five major categories, each representing a specific class of responses. These include:

* **1xx** (informational) — the request has been received and understood, and the client should continue the process.
* **2xx** (success) — the action was successfully received, understood, and accepted.
* **3xx** (redirection) — the client must take additional action to complete the request.
* **4xx** (client errors) — the request contains bad syntax or cannot be fulfilled.
* **5xx** (server errors) — the server failed to fulfill an apparently valid request.

Understanding these status codes and how to handle them is the first step toward effective error handling in API requests. In the following sections, we'll dive deeper into how to retrieve and handle these status codes in your Python code and how to implement a retry logic for temporary failures.

## Practical example

Before we can handle HTTP status codes, we first need to know how to retrieve them. In Python, this can be done using the `status_code` attribute of the response object. This attribute holds the status code that the server returned for the HTTP request.

Let's consider a scenario where we're interested in getting the logs of the latest block. We can do this using the following Python code:

```python Python
import json
import requests

node_url = 'YOUR_CHAINSTACK_ENDPOINT'

headers = {
    "Content-Type": "application/json"
}

payload = {
    "jsonrpc": "2.0",
    "method": "eth_getLogs",
    "params": [
        {
            "fromBlock": "latest",
            "toBlock": "latest",
        }
    ],
    "id": 1
}

response = requests.post(node_url, headers=headers, json=payload)
print(response.text)
```

If the above code runs successfully, it will output the logs for the latest block. This means that the response code received by the client (you, who made the request) was 200.

To retrieve the response code of the request presented above, we can simply use the following:

```python Python
# Considering this request:
response = requests.post(node_url, headers=headers, json=payload)

# Here's how we can get the response code for such a request:
response_code = response.status_code
print('status code:', response_code)
```

This will store the HTTP status code of the response in the `response_code` variable.
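Since every status code's class is encoded in its first digit, the five categories listed earlier can be told apart with simple integer division. A minimal sketch — the `status_class` helper is illustrative, not part of any library:

```python
def status_class(code: int) -> str:
    """Bucket an HTTP status code into its class by its leading digit."""
    classes = {
        1: "informational",
        2: "success",
        3: "redirection",
        4: "client error",
        5: "server error",
    }
    return classes.get(code // 100, "unknown")

print(status_class(200))  # success
print(status_class(404))  # client error
print(status_class(503))  # server error
```

A check like `status_class(response_code) == "server error"` is one way to decide later whether a failed request is worth retrying.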
Now that we know how to retrieve the status code of a response, we can move on to handling these codes and analyzing error responses.

### Analyzing error responses

In addition to dealing with response codes, it's also important to analyze other information in the response to understand and deal with errors. This can be particularly useful when the server returns a 4xx or 5xx status code, indicating a client or server error.

For instance, let's consider a possible response for an `eth_getLogs` request that contains an error in the output:

```json
{"jsonrpc":"2.0","id":1,"error":{"code":-32000,"message":"failed to get logs for block #192001 (0xa388fd..65beb8)"}}
```

In this case, the server returned a JSON object with an `error` field, which contains further information about the error that occurred. We can extract this information in our Python code like this:

```python Python
response = requests.post(node_url, headers=headers, json=payload)

response_code = response.status_code
print('status code:', response_code)

if response_code == 200:
    response_data = json.loads(response.text)
    if 'error' in response_data:
        error_content = response_data['error']
        print('Error:', error_content)
```

In this code, we first check if the response's status code is 200, indicating a successful request. If it is, we parse the JSON content of the response and check if it contains an `error` field. If it does, we store the content of this field in the `error_content` variable. This information can be used to implement a retry logic and keep a record of whenever those errors happen in time.

### Importance of implementing retry logic

Incorporating retry logic into your code can significantly enhance the reliability of your application. By leveraging the tools and techniques we have discussed, you can implement a retry mechanism that automatically handles temporary failures and retries the request when necessary.
This can reduce the impact of temporary failures on you, increase system availability, and ensure data integrity. In the worst-case scenario, this enables you to keep track of the errors you face with precise timestamps for such incidents.

Implementing retry logic is particularly important when dealing with 5xx server errors. These errors indicate a problem with the server and are often temporary. By implementing a retry logic, your application can automatically retry the request after a short delay, giving the server a chance to recover. This can significantly improve the user experience by reducing the number of failed requests the user has to deal with.

### Implementing retry logic in code

Now that we understand the importance of implementing retry logic, let's dive into how to implement it in our Python code. Our retry logic aims to automatically retry the request when a temporary failure occurs. This can be a 5xx server error, a connection error, or any other type of error that we deem temporary.

Here's an example of how to implement retry logic in Python using both the response code and error messages to determine when to retry a request:

```python Python
import json
import time

import requests

node_url = 'YOUR_CHAINSTACK_ENDPOINT'

headers = {
    "Content-Type": "application/json"
}

payload = {
    "jsonrpc": "2.0",
    "method": "eth_getLogs",
    "params": [
        {"fromBlock": "latest", "toBlock": "latest"}
    ],
    "id": 1
}

# Max retries
retries = 5
delay = 1  # Seconds

def get_logs():
    for i in range(retries):
        response = requests.post(node_url, headers=headers, json=payload)
        if response.status_code != 200:
            print(f"Request failed with status code {response.status_code}. Retrying attempt {i+1}...")
            time.sleep(delay)
            continue

        response_data = json.loads(response.text)
        if 'error' in response_data:
            print(f"There was an error in attempt {i+1}: {response_data['error']}")
            time.sleep(delay)
            continue

        logs = response_data.get("result", [])
        if len(logs) > 0:
            block_number = int(logs[0]['blockNumber'], 16)
            print(f"Block number from eth_getLogs call: {block_number} in attempt {i+1}")
            print('Processing the event logs...')
            return

        print(f"Result is empty for this block in attempt {i+1}")
        time.sleep(delay)

get_logs()
```

The retry logic is governed by a for loop that runs up to a predefined maximum number of attempts (the `retries` variable). For each iteration of the loop, which represents an attempt to fetch the logs, the code performs the following steps:

1. A `POST` request is sent to the Ethereum node with the defined headers and payload.
2. If the HTTP status code of the response is not 200 (indicating a successful request), the code prints a message indicating that the request failed and the current attempt number. Then, it waits for the specified delay period (the `delay` variable) before proceeding to the next iteration of the loop. This delay provides a pause before retrying, which can be helpful in cases where the server might be temporarily overloaded or experiencing other transient issues.
3. If the status code is 200 (indicating a successful request), the response is parsed into JSON format and checked for an `error` key. If `error` is present, the code prints a message with the error details and the current attempt number, waits for the specified delay period, and proceeds to the next iteration of the loop. This handles cases where the request was technically successful, but the response indicates an error condition that might be resolved with a retry.
4.
If there's no `error` key in the response but the `result` is empty, the code prints a message indicating this fact and the current attempt number, waits for the specified delay period, and proceeds to the next iteration of the loop. This handles situations where the request was successful and didn't result in an error but didn't provide any logs to process.

If the function hasn't returned by the end of the loop (meaning it hasn't successfully processed a set of logs), it will have retried the request the maximum number of times. At this point, the function will exit, and the code will continue, effectively giving up on fetching logs after exhausting all the allowed attempts.

### Using response code and error messages in retry logic

As you can see in the above example, we use both the response code and error messages in our retry logic. The response code allows us to determine whether the request was successful, while the error messages provide more detailed information about what went wrong.

By using both of these pieces of information, we can make our retry logic more intelligent and effective. For example, we can decide to retry the request immediately if the error message indicates a temporary problem with the server or wait for a longer delay if the error message indicates a more serious problem.

In addition, by logging the error messages, we can keep a record of the errors that occurred, which can be useful for debugging and improving our application.

## Common problems and gotchas

While handling HTTP status codes and implementing retry logic can significantly improve the reliability of your application, there are a few common problems and gotchas that you should be aware of.

### Importance of effective retry logic and robust error backlogs

Another common problem is the lack of effective retry logic and robust error backlogs.
Without these, your application may not be able to recover from temporary failures, resulting in poor user experience and potential data loss.

An effective retry logic should take into account the nature of the error and adjust its behavior accordingly. For example, if the error is temporary (such as a 5xx server error), the retry logic should wait for a short delay before retrying the request. If the error is permanent (such as a 4xx client error), the retry logic should not retry the request and should instead log the error and notify the user.

A robust error backlog, on the other hand, can help you keep track of the errors that occur in your application, allowing you to debug and fix issues more effectively. It can also provide valuable insights into the performance and reliability of your application, helping you identify areas for improvement.

## Conclusion

Handling HTTP status codes and implementing retry logic are crucial aspects of working with API requests. They ensure smooth and reliable communication between clients and servers and enhance your applications' overall performance and resilience. As the volume and complexity of data continue to increase, the importance of these practices cannot be overstated.

Remember, the key to effective error handling is understanding the different types of HTTP status codes and how to handle them. This includes knowing how to retrieve these codes, analyze error responses, and implement robust retry logic. By doing so, you can build applications that are capable of handling temporary failures and maintaining data integrity, even in the face of increasing data volume and complexity.

However, it's also important to be aware of the common challenges and gotchas associated with these practices. This includes dealing with the constantly growing data in Web3, implementing effective retry logic, and maintaining robust error backlogs.
By being aware of these challenges and knowing how to handle them, you can ensure that your applications remain reliable and resilient, no matter what comes their way. In conclusion, while error handling in API requests can be complex, it's an essential skill for any developer working with APIs. By following the best practices outlined in this guide, you can ensure that your applications are well-equipped to handle any errors that may occur, resulting in a better user experience and a more reliable application. # Bitcoin methods Source: https://docs.chainstack.com/docs/bitcoin-methods See also [interactive Bitcoin API call examples](/reference/bitcoin-api-reference). | Method | Availability | Comment | | ---------------------------- | --------------------------------------------- | ------- | | getbestblockhash | | | | getblock | | | | getblockchaininfo | | | | getblockfilter | | | | getblockhash | | | | getblockheader | | | | getblockstats | | | | getchaintips | | | | getchaintxstats | | | | getdifficulty | | | | getmempoolancestors | | | | getmempooldescendants | | | | getmempoolentry | | | | getmempoolinfo | | | | getrawmempool | | | | gettxoutsetinfo | | | | gettxout | | | | verifychain | | | | gettxoutproof | | | | preciousblock | | | | pruneblockchain | | | | savemempool | | | | scantxoutset | | | | verifytxoutproof | | | | uptime | | | | getmemoryinfo | | | | getrpcinfo | | | | help | | | | logging | | | | stop | | | | generateblock | | | | generatetoaddress | | | | generatetodescriptor | | | | getblocktemplate | | | | getmininginfo | | | | getnetworkhashps | | | | prioritisetransaction | | | | submitblock | | | | submitheader | | | | getpeerinfo | | | | getnetworkinfo | | | | getconnectioncount | | | | getnettotals | | | | listbanned | | | | ping | | | | addnode | | | | disconnectnode | | | | getnodeaddresses | | | | setnetworkactive | | | | setban | | | | analyzepsbt | | | | createpsbt | | | | combinepsbt | | | | createrawtransaction | | | | 
combinerawtransaction | | | | decodepsbt | | | | decoderawtransaction | | | | decodescript | | | | finalizepsbt | | | | fundrawtransaction | | | | getrawtransaction | | | | sendrawtransaction | | | | signrawtransactionwithkey | | | | testmempoolaccept | | | | createmultisig | | | | deriveaddresses | | | | estimatesmartfee | | | | getindexinfo | | | | signmessagewithprivkey | | | | validateaddress | | | | verifymessage | | | | createwallet | | | | listwalletdir | | | | loadwallet | | | | listtransactions | | | | abandontransaction | | | | abortrescan | | | | bumpfee | | | | dumpprivkey | | | | getaddressesbylabel | | | | getbalance | | | | getbalances | | | | getnewaddress | | | | getrawchangeaddress | | | | getreceivedbyaddress | | | | getreceivedbylabel | | | | gettransaction | | | | importaddress | | | | importdescriptors | | | | importmulti | | | | importprivkey | | | | importprunedfunds | | | | importpubkey | | | | importwallet | | | | keypoolrefill | | | | listaddressgroupings | | | | listlabels | | | | listlockunspent | | | | listreceivedbyaddress | | | | listsinceblock | | | | listunspent | | | | lockunspent | | | | psbtbumpfee | | | | removeprunedfunds | | | | rescanblockchain | | | | send | | | | sendmany | | | | sendtoaddress | | | | sethdseed | | | | setlabel | | | | settxfee | | | | setwalletflag | | | | signmessage | | | | signrawtransactionwithwallet | | | | unloadwallet | | | | upgradewallet | | | | walletcreatefundedpsbt | | | | walletlock | | | | walletpassphrase | | | | walletpassphrasechange | | | | walletprocesspsbt | | | | getzmqnotifications | | | | enumeratesigners | | | # Bitcoin tooling Source: https://docs.chainstack.com/docs/bitcoin-tooling ## JSON-RPC API Interact with your Bitcoin nodes using [JSON-RPC API](https://en.bitcoin.it/wiki/API_reference_\(JSON-RPC\)#JSON-RPC). Use [curl](https://curl.haxx.se) or [Postman](https://www.getpostman.com) to invoke [Bitcoin API methods](https://bitcoin.org/en/developer-reference#bitcoin-core-apis). 
Example below demonstrates how to get basic network information from your Bitcoin node HTTPS endpoint: ```bash cURL curl YOUR_CHAINSTACK_ENDPOINT -d '{"method":"getblockchaininfo","params":[],"id":1}' ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password ## Python Work with Bitcoin from your Python application. Install [bitcoincli](https://github.com/chainstack/bitcoincli). Configure the client to use `host` and `port` of the node HTTPS endpoint, the corresponding node `username`, and `password`: ```python Python from bitcoincli import Bitcoin host = "nd-123-456-789.p2pify.com" port = "443" username = "user-name" password = "pass-word-pass-word-pass-word" bitcoin = Bitcoin(username, password, host, port) ``` Invoke any methods from the [Bitcoin API specification](https://bitcoin.org/en/developer-reference#bitcoin-core-apis): ```python Python info = bitcoin.getblockchaininfo() print(info) ``` The example code above should output basic network information: ```python Python {u'pruned': False, u'blocks': 603580, u'chainwork': u'00000000000000000000000000000000000000000a0baf330c67a89653c67005', u'chain': u'main', u'difficulty': 12720005267390.51, u'bip9_softforks': {u'csv': {u'status': u'active', u'since': 419328, u'timeout': 1493596800, u'startTime': 1462060800}, u'segwit': {u'status': u'active', u'since': 481824, u'timeout': 1510704000, u'startTime': 1479168000}}, u'warnings': u'', u'softforks': [{u'version': 2, u'id': u'bip34', u'reject': {u'status': True}}, {u'version': 3, u'id': u'bip66', u'reject': {u'status': True}}, {u'version': 4, u'id': u'bip65', u'reject': {u'status': True}}], u'initialblockdownload': False, u'headers': 603580, u'mediantime': 1573647950, u'verificationprogress': 0.9999919488385801, u'bestblockhash': u'000000000000000000068b536474a44f9e0a0a0ab6be75d9afdaddc8c513bcea', u'size_on_disk': 282597514457} ``` # Blast methods Source: https://docs.chainstack.com/docs/blast-methods | Method | 
Availability | Comment | | ---------------------------------------- | --------------------------------------------- | ------- | | eth\_accounts | | | | eth\_blockNumber | | | | eth\_call | | | | eth\_chainId | | | | eth\_estimateGas | | | | eth\_feeHistory | | | | eth\_gasPrice | | | | eth\_getAccount | | | | eth\_getBalance | | | | eth\_getBlockByHash | | | | eth\_getBlockByNumber | | | | eth\_getBlockReceipts | | | | eth\_getBlockTransactionCountByHash | | | | eth\_getBlockTransactionCountByNumber | | | | eth\_getCode | | | | eth\_getFilterChanges | | | | eth\_getFilterLogs | | | | eth\_getLogs | | | | eth\_getProof | | | | eth\_getStorageAt | | | | eth\_getTransactionByBlockHashAndIndex | | | | eth\_getTransactionByBlockNumberAndIndex | | | | eth\_getTransactionByHash | | | | eth\_getTransactionCount | | | | eth\_getTransactionReceipt | | | | eth\_getUncleCountByBlockHash | | | | eth\_getUncleCountByBlockNumber | | | | eth\_maxPriorityFeePerGas | | | | eth\_newBlockFilter | | | | eth\_newFilter | | | | eth\_newPendingTransactionFilter | | | | eth\_signTransaction | | | | eth\_subscribe | | | | eth\_syncing | | | | eth\_uninstallFilter | | | | eth\_unsubscribe | | | | eth\_sendRawTransaction | | | | net\_listening | | | | net\_peerCount | | | | net\_version | | | | web3\_clientVersion | | | | web3\_sha3 | | | | debug\_getBadBlocks | | | | debug\_storageRangeAt | | | | debug\_getTrieFlushInterval | | | | debug\_traceBlock | | | | debug\_traceBlockByHash | | | | debug\_traceBlockByNumber | | | | debug\_traceCall | | | | debug\_traceTransaction | | | | optimism\_outputAtBlock | | | | optimism\_syncStatus | | | | optimism\_rollupConfig | | | | optimism\_version | | | | admin\_addPeer | | | | admin\_addTrustedPeer | | | | admin\_datadir | | | | admin\_exportChain | | | | admin\_importChain | | | | admin\_nodeInfo | | | | admin\_peerEvents | | | | admin\_peers | | | | admin\_removePeer | | | | admin\_removeTrustedPeer | | | | admin\_startHTTP | | | | admin\_startWS | | 
| | admin\_stopHTTP | | | | admin\_stopWS | | | # Blast tooling Source: https://docs.chainstack.com/docs/blast-tooling

## MetaMask

On [node access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**.

## Truffle

Configure [Truffle Suite](https://truffleframework.com) to deploy contracts to your Blast nodes. Install [Truffle Suite](https://truffleframework.com), [HD Wallet-enabled Web3 provider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider), and create a project. Create a new environment in `truffle-config.js`, add your mnemonic phrase generated by [a wallet](https://ethereum.org/en/wallets/find-wallet/) and the Blast endpoint instead of YOUR\_CHAINSTACK\_ENDPOINT:

```javascript Javascript
const HDWalletProvider = require("@truffle/hdwallet-provider");
const mnemonic = 'pattern enroll upgrade ...';
...
module.exports = {
  networks: {
    chainstack: {
      provider: () => new HDWalletProvider(mnemonic, "YOUR_CHAINSTACK_ENDPOINT"),
      network_id: "*"
    },
  }
};
```

## Hardhat

Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your Blast nodes. Install [Hardhat](https://hardhat.org/) and create a project. Create a new environment in `hardhat.config.js`:

```javascript Javascript
require("@nomiclabs/hardhat-waffle");
...
module.exports = {
  solidity: "0.7.3",
  networks: {
    chainstack: {
      url: "YOUR_CHAINSTACK_ENDPOINT",
      accounts: ["YOUR_PRIVATE_KEY"]
    },
  }
};
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials).
* YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract.

Run `npx hardhat run scripts/deploy.js --network chainstack` and Hardhat will deploy using Chainstack. See also [Forking EVM-compatible mainnet with Hardhat](https://support.chainstack.com/hc/en-us/articles/900004242406).
## Remix IDE

To make Remix IDE interact with the network through a Chainstack node:

Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Interacting through MetaMask](#metamask). In Remix IDE, navigate to the **Deploy** tab. Select **Injected Provider - MetaMask** in **Environment**. This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node.

For a detailed tutorial with Remix IDE, see [Trust fund account with Remix](/docs/ethereum-tutorial-trust-fund-account-with-remix).

## web3.js

Build DApps using [web3.js](https://github.com/ethereum/web3.js/) and Blast nodes deployed with Chainstack. Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP or WebSocket.

### HTTP

Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number:

```javascript Javascript
const { Web3 } = require("web3");
const node_url = "YOUR_CHAINSTACK_ENDPOINT";
const web3 = new Web3(node_url);
web3.eth.getBlockNumber().then(console.log);
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password.

### WebSocket

Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number:

```javascript Javascript
const { Web3 } = require("web3");
const web3 = new Web3(
  new Web3.providers.WebsocketProvider("YOUR_CHAINSTACK_ENDPOINT")
);
web3.eth.getBlockNumber().then(console.log);
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node WSS endpoint protected either with the key or password.

## web3.py

Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Blast nodes deployed with Chainstack. Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket).
### HTTP

Use the `HTTPProvider` to connect to your node endpoint and get the latest block number:

```python Key Protected
from web3 import Web3

web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT'))
print(web3.eth.block_number)
```

```python Password Protected
from web3 import Web3

web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s' % ("USERNAME", "PASSWORD", "HOSTNAME")))
print(web3.eth.block_number)
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password
* HOSTNAME — your node HTTPS endpoint hostname
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).

### WebSocket

Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number:

```python Key Protected
from web3 import Web3

web3 = Web3(Web3.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT'))
print(web3.eth.block_number)
```

```python Password Protected
from web3 import Web3

web3 = Web3(Web3.WebsocketProvider('wss://%s:%s@%s' % ("USERNAME", "PASSWORD", "HOSTNAME")))
print(web3.eth.block_number)
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password
* HOSTNAME — your node WSS endpoint hostname
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). See also [WebSocket connection to an EVM node](https://support.chainstack.com/hc/en-us/articles/900001918763-WebSocket-connection-to-an-Ethereum-node).

## web3.php

Build DApps using [web3.php](https://github.com/web3p/web3.php) and Blast nodes deployed with Chainstack. Install [web3.php](https://github.com/web3p/web3.php).
Connect over HTTP. The PHP snippets below were stripped of their opening `<?php` tags in the source; they are reconstructed here following standard web3.php usage:

```php Php
<?php
require_once "vendor/autoload.php";
use Web3\Web3;
$web3 = new Web3("YOUR_CHAINSTACK_ENDPOINT");
?>
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. Use [JSON-RPC methods](https://eth.wiki/json-rpc/API) to interact with the node. Example to get the latest block number:

```php Key-protected
<?php
require_once "vendor/autoload.php";
use Web3\Web3;
$web3 = new Web3("YOUR_CHAINSTACK_ENDPOINT");
$eth = $web3->eth;
$eth->blockNumber(function ($err, $data) {
    print "$data \n";
});
?>
```

## web3j

Build DApps using [web3j](https://github.com/web3j/web3j) and Blast nodes deployed with Chainstack. Use the `HttpService` object to connect to your node endpoint. Example to get the latest block number:

```java Java
package getLatestBlock;

import java.io.IOException;
import java.util.logging.Level;
import java.util.logging.Logger;

import org.web3j.protocol.Web3j;
import org.web3j.protocol.core.DefaultBlockParameterName;
import org.web3j.protocol.core.methods.response.EthBlock;
import org.web3j.protocol.exceptions.ClientConnectionException;
import org.web3j.protocol.http.HttpService;

import okhttp3.Authenticator;
import okhttp3.Credentials;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;
import okhttp3.Route;

public final class App {

    private static final String USERNAME = "USERNAME";
    private static final String PASSWORD = "PASSWORD";
    private static final String ENDPOINT = "ENDPOINT";

    public static void main(String[] args) {
        try {
            OkHttpClient.Builder clientBuilder = new OkHttpClient.Builder();
            clientBuilder.authenticator(new Authenticator() {
                @Override
                public Request authenticate(Route route, Response response) throws IOException {
                    String credential = Credentials.basic(USERNAME, PASSWORD);
                    return response.request().newBuilder().header("Authorization", credential).build();
                }
            });
            HttpService service = new HttpService(ENDPOINT, clientBuilder.build(), false);
            Web3j web3 = Web3j.build(service);
            EthBlock.Block latestBlock = web3.ethGetBlockByNumber(DefaultBlockParameterName.LATEST, false).send().getBlock();
            System.out.println("Latest Block: #" + latestBlock.getNumber());
        }
        catch (IOException | ClientConnectionException ex) {
            Logger.getLogger(App.class.getName()).log(Level.SEVERE, null, ex);
        }
    }
}
```

where

* ENDPOINT — your node HTTPS endpoint protected either with the key or password
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)

See also [the full code on GitHub](https://github.com/chainstack/web3j-getLatestBlock).

## ethers.js

Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Blast nodes deployed with Chainstack. Install [ethers.js](https://www.npmjs.com/package/ethers). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket).

### HTTP

Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number:

```javascript Key Protected
const ethers = require('ethers');
const NODE_URL = "YOUR_CHAINSTACK_ENDPOINT";
const provider = new ethers.JsonRpcProvider(NODE_URL);
provider.getBlockNumber().then(console.log);
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).

### WebSocket

Use the `WebSocketProvider` object to connect to your node WSS endpoint and get the latest block number:

```javascript Javascript
const { ethers } = require("ethers");
const provider = new ethers.WebSocketProvider(
  "YOUR_CHAINSTACK_ENDPOINT"
);
provider.getBlockNumber().then(console.log);
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).

## Brownie

Install [Brownie](https://eth-brownie.readthedocs.io/en/stable/install.html). Use the `brownie networks add` command with the node endpoint.
For example, Blast mainnet:

```shell Shell
brownie networks add Blast ID name="NETWORK_NAME" host=YOUR_CHAINSTACK_ENDPOINT chainid=NETWORK_ID
```

where

* ID — any name that you will use as the network tag to run a deployment. For example, `chainstack-mainnet`.
* NETWORK\_NAME — any name that you want to identify the network by in the list of networks. For example, **Mainnet (Chainstack)**.
* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password
* NETWORK\_ID — Blast network ID:
  * Mainnet: `81457`
  * Sepolia Testnet: `168587773`

Example to run the deployment script:

```shell Shell
brownie run deploy.py --network chainstack-mainnet
```

## Foundry

Install [Foundry](https://github.com/gakonst/foundry/). Use `--rpc-url` to run the operation through your Chainstack node.

### Forge

Use `forge` to develop, test, and deploy your smart contracts. To deploy a contract:

```shell Shell
forge create CONTRACT_NAME --contracts CONTRACT_PATH --private-key YOUR_PRIVATE_KEY --rpc-url YOUR_CHAINSTACK_ENDPOINT
```

where

* CONTRACT\_NAME — name of the contract in the Solidity source code
* CONTRACT\_PATH — path to your smart contract
* YOUR\_PRIVATE\_KEY — the private key to your funded account that you will use to deploy the contract
* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password

### Cast

Use `cast` to interact with the network and the deployed contracts. To get the latest block number:

```shell Shell
cast block-number --rpc-url YOUR_CHAINSTACK_ENDPOINT
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password.

# Blast: Tracking Automatic, Void, Claimable accounts

Source: https://docs.chainstack.com/docs/blast-tracking-automatic-void-claimable-accounts

**TLDR**

* On the Blast L2 network, ETH and stablecoins automatically rebase unless configured otherwise through one of the Blast precompiled contracts.
* Each account can set its yield mode (AUTOMATIC, VOID, or CLAIMABLE). AUTOMATIC is the default, so VOID or CLAIMABLE indicates nonstandard behavior.
* In this Python example, we filter recent blocks for addresses, then query each address’s yield mode on both the USDB and WETH contracts, flagging any VOID or CLAIMABLE.
* This helps identify accounts intentionally opting out of the default auto-yield or wishing to claim accrued yield separately.

## Main article

Blast is an EVM-compatible L2 protocol with a major modification—ETH and stablecoins bridged over to the Blast network yield natively. As an example of what this means—if you have your ETH on your address on the Blast network and do nothing, your ETH balance will grow (rebase positively, to be more technical) over time. Check out the [Blast documentation](https://docs.blast.io/) to understand where the yield is coming from. For the purposes of this tutorial, we focus purely on the technical part.

There are a few precompiled contracts on Blast that make the network tick. For the full list, see the Blast docs: [Contracts](https://docs.blast.io/building/contracts). The ones that provide native yield generation are:

* USDB: [0x4300000000000000000000000000000000000003](https://blastscan.io/address/0x4300000000000000000000000000000000000003)
* WETHRebasing: [0x4300000000000000000000000000000000000004](https://blastscan.io/address/0x4300000000000000000000000000000000000004)

Inspecting these contracts will reveal that each of them has the same `yieldMode` setting:

```solidity Solidity
enum YieldMode {
  AUTOMATIC,
  VOID,
  CLAIMABLE
}
```

You can write this setting with the `configure(YieldMode yieldMode)` function:

```solidity Solidity
function configure(YieldMode yieldMode) external returns (uint256) {
    _configure(msg.sender, yieldMode);
    // ...
}
```

This means that every USDB or WETH token on the Blast network has a yield mode assigned to it:

`AUTOMATIC` — default mode; your USDB or WETH token rebases positively.
`VOID` — your USDB or WETH token does not rebase or accrue claimable yield.

`CLAIMABLE` — your USDB or WETH token does not rebase but accrues claimable yield separately.

ETH or stablecoins bridged to the Blast network get assigned the default `AUTOMATIC` yield mode. Users can manually set any other mode (`VOID` or `CLAIMABLE`) for their assets by calling `configure(YieldMode yieldMode)` on each of the contracts.

This is all we need to know. Checking the yield mode setting of live addresses as they interact on the Blast network could be useful: a non-default setting is a flag worth noting. And this is what we are going to do in this simple tutorial.

We will write a Python script that:

* Identifies active addresses by extracting the `from` field from the transactions in each new incoming block.
* Checks each of the extracted active addresses against the USDB & WETH contracts for their yield setting and prints the result.

All of the code is in [the GitHub repository](https://github.com/chainstacklabs/blast-track-void-claimable).

## Prerequisites

* [Chainstack account](https://console.chainstack.com/) to deploy a Blast Mainnet node
* [web3.py](https://web3py.readthedocs.io/)

## Step-by-step

### Get a Blast node

Log in to your [Chainstack account](https://console.chainstack.com/) and get a node endpoint.

### Create the script

Since the USDB and WETH contracts are almost identical, we are going to use the same common ABI for both of them. Let's first write a script that identifies all active accounts in each new block and prints their yield mode, whichever of the three settings it is: `AUTOMATIC`, `VOID`, or `CLAIMABLE`. We'll do this to make sure the script works as expected.
```python Python
import time
from web3 import Web3

# Connect to Blast node
w3 = Web3(Web3.HTTPProvider('CHAINSTACK_NODE'))

# Contract addresses and the common ABI
USDB_ADDRESS = '0x4300000000000000000000000000000000000003'
WETH_ADDRESS = '0x4300000000000000000000000000000000000004'
COMMON_ABI = [{"inputs":[{"internalType":"address","name":"account","type":"address"}],"name":"getConfiguration","outputs":[{"internalType":"enum YieldMode","name":"","type":"uint8"}],"stateMutability":"view","type":"function"}]

# Create contract objects
usdb_contract = w3.eth.contract(address=USDB_ADDRESS, abi=COMMON_ABI)
weth_contract = w3.eth.contract(address=WETH_ADDRESS, abi=COMMON_ABI)

# Extract "from" addresses from each new block
def handle_block(block_number, rps=25):
    block = w3.eth.get_block(block_number, full_transactions=True)
    active_accounts = set(tx['from'] for tx in block.transactions if 'from' in tx)
    delay = 1 / rps

    # Check yield mode for each active account
    for account in active_accounts:
        check_yield_mode(account, usdb_contract, "USDB")
        check_yield_mode(account, weth_contract, "WETH")
        time.sleep(delay)

# Call the getConfiguration function to check the yield mode. 0 = AUTOMATIC, 1 = VOID, 2 = CLAIMABLE
def check_yield_mode(account, contract, contract_name):
    yield_mode = contract.functions.getConfiguration(account).call()
    if yield_mode == 0:
        print(f"{account} yieldMode on {contract_name} is AUTOMATIC")
    elif yield_mode == 1:
        print(f"{account} yieldMode on {contract_name} is VOID")
    elif yield_mode == 2:
        print(f"{account} yieldMode on {contract_name} is CLAIMABLE")

def main(rps=25):
    block_filter = w3.eth.filter('latest')
    while True:
        for block_number in block_filter.get_new_entries():
            handle_block(block_number, rps)

if __name__ == "__main__":
    main()
```

where

* CHAINSTACK\_NODE — your Blast node deployed with Chainstack
* `rps=25` — an RPS setting to make sure you stay within the [limits](/docs/limits)
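A small aside on the `if`/`elif` chain in `check_yield_mode`: since `getConfiguration` returns the enum as a `uint8`, a lookup dictionary keeps the decoding in one place and also surfaces unexpected values. This is a sketch; the `YIELD_MODES` and `describe_yield_mode` names are illustrative, and the address below is a placeholder:

```python Python
# Map the uint8 returned by getConfiguration to the YieldMode enum names.
YIELD_MODES = {0: "AUTOMATIC", 1: "VOID", 2: "CLAIMABLE"}

def describe_yield_mode(account: str, contract_name: str, yield_mode: int) -> str:
    # Fall back to an UNKNOWN label in case the contract ever returns a new value.
    mode = YIELD_MODES.get(yield_mode, f"UNKNOWN ({yield_mode})")
    return f"{account} yieldMode on {contract_name} is {mode}"

print(describe_yield_mode("0x0000000000000000000000000000000000000000", "USDB", 1))
```

You would then call `describe_yield_mode(account, contract_name, yield_mode)` with the value returned by `getConfiguration(account).call()` instead of branching on each mode.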
Running the script will reveal that an overwhelming majority of accounts interact on the network with the default automatic yield setting. Let's modify the script to only print the `VOID` and `CLAIMABLE` accounts, which is a decent enough flag for non-ordinary accounts.

```python Python
import time
from web3 import Web3

# Connect to Blast node
w3 = Web3(Web3.HTTPProvider('CHAINSTACK_NODE'))

# Contract addresses and the common ABI
USDB_ADDRESS = '0x4300000000000000000000000000000000000003'
WETH_ADDRESS = '0x4300000000000000000000000000000000000004'
COMMON_ABI = [{"inputs":[{"internalType":"address","name":"account","type":"address"}],"name":"getConfiguration","outputs":[{"internalType":"enum YieldMode","name":"","type":"uint8"}],"stateMutability":"view","type":"function"}]

# Create contract objects
usdb_contract = w3.eth.contract(address=USDB_ADDRESS, abi=COMMON_ABI)
weth_contract = w3.eth.contract(address=WETH_ADDRESS, abi=COMMON_ABI)

# Extract "from" addresses from each new block
def handle_block(block_number, rps=25):
    block = w3.eth.get_block(block_number, full_transactions=True)
    active_accounts = set(tx['from'] for tx in block.transactions if 'from' in tx)
    delay = 1 / rps

    # Check yield mode for each active account
    for account in active_accounts:
        check_yield_mode(account, usdb_contract, "USDB")
        check_yield_mode(account, weth_contract, "WETH")
        time.sleep(delay)

# Call the getConfiguration function to check the yield mode. 0 = AUTOMATIC, 1 = VOID, 2 = CLAIMABLE
def check_yield_mode(account, contract, contract_name):
    yield_mode = contract.functions.getConfiguration(account).call()
    if yield_mode == 1:
        print(f"{account} yieldMode on {contract_name} is VOID")
    elif yield_mode == 2:
        print(f"{account} yieldMode on {contract_name} is CLAIMABLE")

def main(rps=25):
    block_filter = w3.eth.filter('latest')
    while True:
        for block_number in block_filter.get_new_entries():
            handle_block(block_number, rps)

if __name__ == "__main__":
    main()
```

## Conclusion

This tutorial guided you through creating a simple Python project that tracks accounts as they become active on the network and flags those with an uncommon yield mode setting.

### About the author

Director of Developer Experience @ Chainstack

Talk to me all things Web3

20 years in technology | 8+ years in Web3 full time

Trusted advisor helping developers navigate the complexities of blockchain infrastructure

[](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake)

# Blob transactions the hard way

Source: https://docs.chainstack.com/docs/blob-transactions-the-hard-way

**TLDR:**

* Walks you through crafting and broadcasting blob (type-3) transactions on Ethereum, including creating 128 KB blobs and verifying the KZG commitment and blob versioned hash.
* Explains how blob data is stored on the consensus layer for 18 days, while the blob versioned hash lives on the execution layer.
* Dives into EIP-4788 & EIP-4844, detailing how to map execution-layer blocks to consensus-layer slots to fetch and verify blob data.
* Provides full Python scripts for a “bare bones” approach to building, sending, and verifying blob transactions from start to finish.

## Main article

Or *almost* the hard way as we are going to use a bit of Python.
The focus of this guide is on keeping the [Dencun and EIP-4844 talk](/docs/ethereum-dencun-rundown-with-examples) to a minimum and giving you hands-on exposure to blob transactions, which is a good way to supplement prior general knowledge of the topic.

### Chainstack Ethereum archive nodes store blob data

Chainstack Ethereum archive nodes store blob data beyond 18 days on the Ethereum Mainnet, Sepolia, and Holesky. [Start for free](https://console.chainstack.com/) and get your app to production levels immediately. No credit card required. You can sign up with your GitHub, X, Google, or Microsoft account.

## Introduction

### EIPs involved

The two EIPs for this guide are:

* [EIP-4788](https://eips.ethereum.org/EIPS/eip-4788) — each block on the execution layer now contains the parent beacon block root.
* [EIP-4844](https://www.eip4844.com/) — the compressed transaction data from rollups is now stored on the consensus layer for 18 days.

There's more to each of these EIPs, but we focus on what we need to know for this guide.

### Problem

A quick reminder on the problem that blob transactions solve. With the increased activity on the Ethereum mainnet, the limited throughput leads to increased transaction fees (and costs, as ETH appreciates in value). To solve this, a new chain is launched. The new chain has faster block times and a centralized unit (the sequencer) processing the transactions and blocks. To inherit some of the Ethereum network's security and decentralization, the new chain takes all of its transactions as complete transaction data, compresses them (or rolls them up, hence rollups), and posts the data in batches to the Ethereum network as smart contract call data. Posting as call data is costly, but you now have the costs split between all the transactions on the rollup chain that were compressed and posted as one transaction to the Ethereum network.
As an example, have a look at the Arbitrum batch [517392](https://arbiscan.io/batch/517392?isnitro=true) — there are 809 transactions in the batch; these transactions were paid for in ETH on the Arbitrum mainnet. The ETH from the transactions in the batch was used to fund [one transaction](https://etherscan.io/tx/0x7a5c81cbbc07c68abf0b9d91bb8c6d121aaf3b25b49cd4f047603b3fa9d39910) on the Ethereum mainnet where it posted the entire compressed data from the 809 transactions as call data to a contract. In a simplified way, you now have the cost of this one Ethereum transaction split between the actors of the 809 Arbitrum transactions, which makes each of the 809 transactions cheaper.

The two problems with this are:

* The EVM was never designed for this use case, and posting compressed transaction data in batches as call data to smart contracts is expensive.
* Call data is an unnecessarily expensive way to merely store data.

### Solution

Enter blob transactions — the simple and sane solution. Blob data is moved to be stored on the consensus layer. Blob data is no longer processed by the EVM, which means it can be much bigger in size and there are no associated EVM processing gas costs.

| | Blockspace | Blobspace |
| ----------------- | --------------- | --------------- |
| Seen by all nodes | Yes | Yes |
| Storage | Execution layer | Consensus layer |
| EVM access | Yes | No |
| Longevity | Forever | 18 days |
| Cost | Expensive | Cheap |

Table credits: [Consensys](https://consensys.io/blog/ethereum-evolved-dencun-upgrade-part-5-eip-4844).

A few quick things on blob transactions:

* Blob transactions are type-3 transactions on the execution layer.
* When submitted on the execution layer, blob transactions go through the usual lifecycle (node > mempool > block) but the actual blob data is separately gossiped to the consensus layer.
* Blob data is not validated in the rollup chain context by the Ethereum chain.
* 18 days is theoretically sufficient for all the network participants to agree on the rollup chain state. * It is ultimately the job of the respective rollup chain mechanisms to store the blob data beyond 18 days and ensure its availability (aka Data Availability). * Even though the blob data is stored for 18 days, the KZG commitment for each blob — which is basically a fancy hash of blob data — is stored forever. This means that if you recover a blob that's been lost, you can prove it's the same blob through the respective KZG commitment. ### Transaction types As mentioned, blob transactions are type-3 transactions and at the time of this post are the latest transaction type. * **Type-0** aka Legacy — original Ethereum transactions with the parameters `nonce`, `gasPrice`, `gasLimit`, `to`, `value`, `data`, `v`, `r`, `s`. * **Type-1** aka Access list — adds the `accessList` parameter to the original set. Introduced by [EIP-2930](https://eips.ethereum.org/EIPS/eip-2930). * **Type-2** aka [EIP-1559](https://eips.ethereum.org/EIPS/eip-1559) — introduces priority fees by adding `maxPriorityFeePerGas` and `maxFeePerGas`. * **Type-3** aka Blobs — adds two new parameters: * `maxFeePerBlobGas` — max fee per blob gas to store the blob * `blobVersionedHashes` — a blob pointer, which is a hash of the blob's KZG commitment As you see, Ethereum transactions add more parameters with each new type. ## Walkthrough Now let's have a hands-on walkthrough. In Python. 
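As a warm-up, the transaction types above are visible right in the raw payload: per [EIP-2718](https://eips.ethereum.org/EIPS/eip-2718), the first byte of a typed transaction is its type byte, while a legacy transaction starts with an RLP list prefix (`0xc0` or higher). A quick sketch (the `tx_type` helper here is illustrative, not a web3.py API):

```python
# Classify a raw transaction payload by its EIP-2718 envelope type.
# Legacy (type-0) transactions are bare RLP lists, so their first byte
# is an RLP list prefix (>= 0xc0); typed transactions lead with the type byte.
TX_TYPES = {
    0x01: "access list (EIP-2930)",
    0x02: "dynamic fee (EIP-1559)",
    0x03: "blob (EIP-4844)",
}

def tx_type(raw: bytes) -> str:
    first = raw[0]
    if first >= 0xC0:
        return "legacy (type-0)"
    return TX_TYPES.get(first, f"unknown type {first:#x}")

print(tx_type(bytes.fromhex("f86b")))  # legacy (type-0)
print(tx_type(bytes.fromhex("03c0")))  # blob (EIP-4844)
```

This is also why explorers and node clients can tell a blob transaction apart before decoding anything else.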
Here's what we are going to do: * Create our own blob data * Send a type-3 transaction with our blob data on Sepolia * Retrieve our blob data from the network * Compute the KZG commitment for our blob data and verify it's the same as computed by the node * Compute the blob versioned hash from the KZG commitment and verify it's the same as computed by the node * Find all type-3 transactions in a block * Put it all together and run a script that: * Extracts all type-3 transactions from a block * Retrieves the blob data from the network * Retrieves the KZG commitment from the network * Retrieves the blob versioned hash from the network * Locally computes the KZG commitment and the blob versioned hash * Checks if the locally computed KZG commitment and the blob versioned hash match the ones retrieved from the network (aka computed by the nodes) ### Prerequisites Log in to your [Chainstack account](https://console.chainstack.com/) and get an Ethereum Sepolia node. You can get by with a full node for this exercise. Install [eth-abi](https://pypi.org/project/eth-abi/). Install or update [web3.py](https://pypi.org/project/web3/). Note that it's important you have the latest version as it includes support for type-3 transactions. Install or update [eth-account](https://pypi.org/project/eth-account/). Same here — it's important you have the latest version as it includes support for type-3 transactions. Install [ckzg](https://pypi.org/project/ckzg/) — this computes the KZG commitments. Fund your account with SepoliaETH. You can do this through the [Chainstack faucet](https://faucet.chainstack.com/). Create a `.env` file with the following variables: ``` EXECUTION_LAYER_URL= CONSENSUS_LAYER_URL= PRIVATE_KEY= ``` [GitHub repository](https://github.com/chainstacklabs/blob-transactions-the-hard-way) for all the scripts. ### Create blob data Each blob is always a fixed 128 KB in size. You are always paying for a 128 KB slot space.
For our hands-on exercise, this means that if our blob data is less than 128 KB, we need to pad it to 128 KB. We are sending a blob that says `Chainstack`, so we pad it to the fixed size. ```python Python import os from eth_abi import abi def create_blob_data(text): # Encode the text using Ethereum ABI encoding for a string encoded_text = abi.encode(["string"], [text]) # Calculate the required padding to make the blob size exactly 131072 bytes or 128 KB required_padding = 131072 - (len(encoded_text) % 131072) # Create the BLOB_DATA with the correct padding BLOB_DATA = (b"\x00" * required_padding) + encoded_text return BLOB_DATA def main(): text = "Chainstack" # The required padding is computed from the encoded length # Create blob data blob_data = create_blob_data(text) # Print the blob data in hexadecimal format print("Blob Data (Hex):") print(blob_data.hex()) if __name__ == "__main__": main() ``` For convenience, redirect the output to a file as it's going to be huge: ```shell Shell python create_blob_data.py > blob.txt ``` What you have now is your blob data. As a reminder, it's completely arbitrary what we put into it — it can be [2837 transactions](https://arbiscan.io/batch/617502?isnitro=true) rolled into one object or it can be the string `Chainstack` in hex and padded to 128 KB (as in our case). ### Send a type-3 transaction Now that we know what blob data is and how to create one, let's actually put it on-chain. Let's first run the script and then have a look at the transaction committed to the chain. ```python Python import os from eth_abi import abi from eth_account import Account from eth_utils import to_hex from web3 import Web3, HTTPProvider from dotenv import load_dotenv load_dotenv() w3 = Web3(HTTPProvider(os.getenv("EXECUTION_LAYER_URL"))) """ Add the web3py middleware to ensure compatibility with Erigon 2 as the eth_estimateGas only expects one argument.
The block number or block hash was dropped in https://github.com/ledgerwatch/erigon/releases/tag/v2.60.1 """ def erigon_compatibility_middleware(make_request, w3): def middleware(method, params): if method == 'eth_estimateGas' and len(params) > 1: # Modify the params to include only the transaction object params = params[:1] return make_request(method, params) return middleware w3.middleware_onion.add(erigon_compatibility_middleware) text = "Chainstack" encoded_text = abi.encode(["string"], [text]) # Calculate the required padding to make the blob size exactly 131072 bytes required_padding = 131072 - (len(encoded_text) % 131072) # Create the BLOB_DATA with the correct padding BLOB_DATA = (b"\x00" * required_padding) + encoded_text pkey = os.environ.get("PRIVATE_KEY") acct = w3.eth.account.from_key(pkey) tx = { "type": 3, # Type-3 transaction "chainId": 11155111, # Sepolia 11155111; Holesky 17000 "from": acct.address, "to": "0x0000000000000000000000000000000000000000", # Does not matter what account you send it to "value": 0, "maxFeePerGas": 10**12, "maxPriorityFeePerGas": 10**12, "maxFeePerBlobGas": to_hex(10**12), # Note the new type-3 parameter for blobs "nonce": w3.eth.get_transaction_count(acct.address), } # Now you can estimate gas as usual gas_estimate = w3.eth.estimate_gas(tx) tx["gas"] = gas_estimate # Proceed with the rest of your script signed = acct.sign_transaction(tx, blobs=[BLOB_DATA]) tx_hash = w3.eth.send_raw_transaction(signed.raw_transaction) tx_receipt = w3.eth.wait_for_transaction_receipt(tx_hash) print(f"TX receipt: {tx_receipt}") ``` Note the comments in the script. Especially note how we are handling the Erigon compatibility here. Erigon introduced a breaking change in [v2.60.1](https://github.com/ledgerwatch/erigon/releases/tag/v2.60.1) — unlike Geth, the `eth_estimateGas` call in Erigon moving forward does not take in the block number or block hash as an argument. Running this will print out the transaction receipt. 
Here's an example of this transaction on Sepolia Etherscan: [0x5a74bd72aeeb99e874e58b927f9a5c96665278a36b61bed69a4b09597b02edce](https://sepolia.etherscan.io/tx/0x5a74bd72aeeb99e874e58b927f9a5c96665278a36b61bed69a4b09597b02edce) On Etherscan, when viewing the transaction, hit **Blobs**. You will see: * Commitment — the KZG commitment of the blob data. We are going to touch on this later in this article. * Blob Versioned Hash — basically a hash of the blob, serving as a pointer on the execution layer to the blob on the consensus layer. Clicking the Blob Versioned Hash value will show the blob data. You can check if it's the same as the one we created previously and saved as `blob.txt`. It should be the same. Etherscan, being a great explorer, provides you with the data from both the consensus layer and the execution layer. What you need to remember here is: * Blob data — only stored on the consensus layer. * KZG commitment — only stored on the consensus layer. * Blob Versioned Hash — only stored on the execution layer. Here's the same blob transaction from our example as `Index 0` in the [Sepolia Beacon chain explorer](https://sepolia.beaconcha.in/block/6090748#blobs). A look at the actual cycle of our type-3 transaction: 1. We create the blob data and submit it as a type-3 transaction to a node. The `to` address of the transaction doesn't really matter as our goal is to put the blob data on the consensus layer. 2. The node picks up our transaction in its entirety, including the complete blob data, and computes the KZG commitment for the blob data and the blob versioned hash from the KZG commitment. 3. The node then propagates the transaction (without the blob data) with the blob versioned hash value to the mempool on the execution layer. The node also gossips the actual blob data over P2P to other nodes. 4. The transaction gets picked up from the mempool as usual and committed in a block on the execution layer. The blob data is stored on the consensus layer.
The blob versioned hash on the execution layer acts as a pointer to the blob on the consensus layer. Now we have the type-3 transaction with the blob versioned hash forever living on the execution layer and the actual blob data stored for 18 days on the consensus layer. (Unless you run Chainstack archive nodes, in which case it's also forever). ### Retrieve blob data from the network Let's retrieve the blob data from the network. Since we are working with the consensus layer now, make sure you use the consensus client endpoint from your Chainstack node details. Our example transaction is included in [block 6090748](https://sepolia.etherscan.io/txs?block=6090748\&isblob=true) on the execution layer. The call we are going to use is [Retrieve blob sidecar](/reference/ethereum_beacon_chain_api/state/getBlobSidecarByRoot). To provide a specific pointer for the blob data on the consensus layer, we can use one of the following: `head`, `genesis`, `finalized`, or `slot_number`. The original [Beacon chain API spec](https://ethereum.github.io/beacon-APIs/#/Beacon/getBlobSidecars) also includes `hex encoded blockRoot with 0x prefix`, but Nimbus nodes (that Chainstack uses) do not support this for finalized blocks for efficiency. And we don't need this anyway as you will see later. Note that the call does not support the execution layer block number; you can only provide the respective slot number on the consensus layer. Programmatically, we'll deal with this later. For now, we'll just pick the respective slot number from the Sepolia Beacon chain explorer: [slot 5203463](https://sepolia.beaconcha.in/slot/5203463). Here's our call: ```shell Shell curl --request GET \ --url CONSENSUS_LAYER_URL/eth/v1/beacon/blob_sidecars/5203463 \ --header 'accept: application/json' | jq "." ``` This will retrieve all four blobs referenced by the four type-3 transactions in [block 6090748](https://sepolia.etherscan.io/txs?block=6090748\&isblob=true) on the execution layer.
One of them is our blob data. ### Compute the KZG commitment Let's compute the KZG commitment for our blob data and make sure it's the same one as computed by the node. When we did the `sidecars` call in the previous section, we received 4 different blobs with respective KZG commitments. Let's now verify that one of them is ours and computed correctly by independently taking our own created blob data and getting its KZG commitment. #### What is a KZG commitment But first — what is a KZG commitment? In simple words, this is a fancy (and extremely secure) way of creating a hash of a blob. You might have been one of the 141,416 contributors who were a part of the [KZG ceremony](https://ceremony.ethereum.org/) by providing your input. What was created as a result of this ceremony was the file `trusted_setup.txt` that the KZG commitment takes as one of the inputs. Examples: * In [Geth](https://github.com/ethereum/go-ethereum/blob/3687c34cfc9eba1b2c29209d27d272a72c4de3af/crypto/kzg4844/trusted_setup.json) * In [Nimbus](https://github.com/status-im/nimbus-eth2/blob/d2a07514541ffe6ee02a2ec7272ce7a315131e04/beacon_chain/conf.nim#L1487) * In [Erigon](https://github.com/ledgerwatch/erigon/blob/ff0da3dd47eb691dc5aefddb114a261779288778/cmd/utils/flags.go#L117) * In [Nethermind](https://github.com/NethermindEth/nethermind/blob/4fee4bfe1a72433cff37d62319e718462f78ca9a/src/Nethermind/Nethermind.Crypto/kzg_trusted_setup.txt) * In [c-kzg-4844](https://github.com/ethereum/c-kzg-4844/blob/main/src/trusted_setup.txt) The last one is the KZG implementation that we are actually going to use. #### Compute the KZG commitment The Python library for [c-kzg-4844](https://github.com/ethereum/c-kzg-4844/blob/main/src/trusted_setup.txt) is [ckzg](https://pypi.org/project/ckzg/).
We are going to use the library, take our blob data that we created with `create_blob_data.py` earlier and saved as `blob.txt`, and take the [trusted\_setup.txt](https://github.com/ethereum/c-kzg-4844/blob/main/src/trusted_setup.txt) file. ```python Python import ckzg def bytes_from_hex(hexstring): return bytes.fromhex(hexstring.replace("0x", "")) if __name__ == "__main__": ts = ckzg.load_trusted_setup("trusted_setup.txt") with open("blob.txt", "r") as file: blob_hex = file.read().strip() blob = bytes_from_hex(blob_hex) # Compute KZG commitment commitment = ckzg.blob_to_kzg_commitment(blob, ts) # Print the commitment in hexadecimal format print("KZG Commitment:", commitment.hex()) ``` This will print the KZG commitment for our blob. For our example, it should be `9493a713dd89eb7fe295efd62545bb93bca395a84d18ecfa2c6c650cddc844ad4c1935cbe7d6830967df9d33c5a2e230`. If you look up the KZG commitment in the data you retrieved (or look it up in the explorer), you will see that our independently computed KZG commitment for our independently created blob data matches the on-chain one. ### Compute blob versioned hash from the KZG commitment This one is less fancy: let's hash our KZG commitment and see if it matches the versioned hash from the transaction on the execution layer.
```python Python import hashlib # Given KZG commitment kzg_commitment = "9493a713dd89eb7fe295efd62545bb93bca395a84d18ecfa2c6c650cddc844ad4c1935cbe7d6830967df9d33c5a2e230" # Remove the '0x' prefix if present if kzg_commitment.startswith("0x"): kzg_commitment = kzg_commitment[2:] # Convert the KZG commitment to bytes kzg_commitment_bytes = bytes.fromhex(kzg_commitment) # Compute the SHA-256 hash of the KZG commitment sha256_hash = hashlib.sha256(kzg_commitment_bytes).digest() # Prepend the version byte (0x01) to the last 31 bytes of the SHA-256 hash version_byte = b'\x01' blob_versioned_hash = version_byte + sha256_hash[1:] # Convert to hexadecimal for display blob_versioned_hash_hex = blob_versioned_hash.hex() # Print the result print(f"Blob versioned hash: 0x{blob_versioned_hash_hex}") ``` Remember that the blob versioned hash is stored on the execution layer. To retrieve it, you can do a simple [eth\_getTransactionByHash](/reference/ethereum_node_api/transaction_info/eth_getTransactionByHash) call. Example for our transaction on Sepolia: ```shell Shell curl --request POST \ --url EXECUTION_LAYER_URL \ --header 'accept: application/json' \ --header 'content-type: application/json' \ --data ' { "id": 1, "jsonrpc": "2.0", "method": "eth_getTransactionByHash", "params": [ "0x5a74bd72aeeb99e874e58b927f9a5c96665278a36b61bed69a4b09597b02edce" ] }' | jq '.result.blobVersionedHashes' ``` Check if the retrieved one matches the locally computed one. It should match. ### Find all type-3 transactions in a block This one is simple with the latest version of [web3py](https://pypi.org/project/web3/). Make sure you have it updated.
```python Python import os from web3 import Web3, HTTPProvider from dotenv import load_dotenv load_dotenv() w3 = Web3(HTTPProvider(os.getenv("EXECUTION_LAYER_URL"))) # Specify the block number you want to check block_number = 6090748 block = w3.eth.get_block(block_number, full_transactions=True) # Iterate through transactions and check for type-3 transactions for tx in block.transactions: if tx.type == 3: # Type 3 refers to blob transactions print("Transaction Hash:", tx.hash.hex()) ``` ### Putting it all together As the final step, let's put all the parts together and have a script that does the following: * Extracts all type-3 transactions from a block * Retrieves the blob data from the network * Retrieves the KZG commitment from the network * Retrieves the blob versioned hash from the network * Locally computes the KZG commitment and the blob versioned hash * Checks if the locally computed KZG commitment and the blob versioned hash match the ones retrieved from the network (aka computed by the nodes) What you may notice is that there's no direct association of the block number on the execution layer with the respective slot number on the consensus layer. We do need the slot number on the consensus layer, however, to be able to do a `sidecar` call and retrieve the blobs associated with the type-3 transactions that we detect on the execution layer. This is where [EIP-4788](https://eips.ethereum.org/EIPS/eip-4788) helps us. The EIP introduced the first real way for the execution layer to have access to the consensus layer by having the parent block root of the consensus layer. Each block on the execution layer now has the parent (previous) slot root of the consensus layer stored in it as `parentBeaconBlockRoot`.
We'll retrieve this parent slot root, get the slot number associated with that root via the `eth/v1/beacon/headers/{parent_beacon_block_root_hex}` call on the consensus layer, and then do a simple `+ 1` to get the slot number that we need. ```python Python import os import requests from web3 import Web3, HTTPProvider from dotenv import load_dotenv import ckzg import hashlib load_dotenv() # Connect to the Ethereum Execution Layer w3 = Web3(HTTPProvider(os.getenv("EXECUTION_LAYER_URL"))) # Specify the block number you want to check block_number = 6090748 block = w3.eth.get_block(block_number, full_transactions=True) # Find type-3 transactions type_3_tx_hashes = [tx.hash.hex() for tx in block.transactions if tx.type == 3] # Store blob versioned hashes in a dictionary blob_versioned_hashes_dict = {} for tx_hash in type_3_tx_hashes: tx_details = w3.eth.get_transaction(tx_hash) blob_versioned_hashes = tx_details.get('blobVersionedHashes', []) if blob_versioned_hashes: blob_versioned_hashes_dict[tx_hash] = blob_versioned_hashes[0].hex() # Extract the parentBeaconBlockRoot from the block data parent_beacon_block_root = block['parentBeaconBlockRoot'] # Convert byte string to hexadecimal string parent_beacon_block_root_hex = parent_beacon_block_root.hex() # Ensure it starts with '0x' if not parent_beacon_block_root_hex.startswith('0x'): parent_beacon_block_root_hex = '0x' + parent_beacon_block_root_hex # Print the parentBeaconBlockRoot for visibility print("parentBeaconBlockRoot being queried:", parent_beacon_block_root_hex) # Use parentBeaconBlockRoot for further queries headers_url = f"{os.getenv('CONSENSUS_LAYER_URL')}/eth/v1/beacon/headers/{parent_beacon_block_root_hex}" header_response = requests.get(headers_url) if header_response.status_code != 200: print("Failed to fetch data:", header_response.status_code) print(header_response.text) exit() header_data = header_response.json() if 'data' not in header_data: print("Unexpected response format:", header_data) exit()
slot_number = int(header_data['data']['header']['message']['slot']) + 1 # Retrieve blobs blobs_url = f"{os.getenv('CONSENSUS_LAYER_URL')}/eth/v1/beacon/blob_sidecars/{slot_number}" blobs_response = requests.get(blobs_url).json() blobs = blobs_response['data'] # Process each blob results = [] for i, tx_hash in enumerate(type_3_tx_hashes): blob = blobs[i] print(f"Retrieved KZG commitment for transaction {tx_hash}: {blob['kzg_commitment']}") blob_data_hex = blob['blob'] # Save blob data to a file with open(f"blob{i}.txt", "w") as file: file.write(blob_data_hex) # Load blob data from the file and ensure it's correct with open(f"blob{i}.txt", "r") as file: blob_hex = file.read().strip() blob_data = bytes.fromhex(blob_hex.replace("0x", "")) # Ensure consistent handling print(f"Blob data file for transaction {tx_hash}: blob{i}.txt") # Load trusted setup ts = ckzg.load_trusted_setup("trusted_setup.txt") # Compute KZG commitment commitment = ckzg.blob_to_kzg_commitment(blob_data, ts) print(f"Locally computed KZG commitment for transaction {tx_hash}: {commitment.hex()}") # Compute versioned hash sha256_hash = hashlib.sha256(commitment).digest() versioned_hash = b'\x01' + sha256_hash[1:] # Compare with network data, ignoring the '0x' prefix network_commitment = blob['kzg_commitment'] local_commitment_hex = '0x' + commitment.hex() commitment_match = local_commitment_hex == network_commitment print(f"KZG commitment match for transaction {tx_hash}: {commitment_match}") # Use the stored blob versioned hashes during blob processing network_versioned_hash = blob_versioned_hashes_dict.get(tx_hash, "No blob versioned hash found") print(f"Network versioned hash for transaction {tx_hash}: {network_versioned_hash}") print(f"Blob data file for transaction {tx_hash}: blob{i}.txt") print(f"Locally computed KZG commitment for transaction {tx_hash}: {commitment.hex()}") print(f"Locally computed versioned hash for transaction {tx_hash}: {versioned_hash.hex()}") print() results.append({ 
'transaction_hash': tx_hash, 'commitment': commitment.hex(), 'versioned_hash': versioned_hash.hex(), 'commitment_match': commitment_match, 'versioned_hash_match': versioned_hash.hex() == network_versioned_hash }) print("### SUMMARY ###") print(f"Block {block_number}, Slot {slot_number}") print("Type-3 transactions:", type_3_tx_hashes) print() for result in results: print(f"TX:{result['transaction_hash']}:") print(f"KZG: {result['commitment']}") print(f"Versioned hash: {result['versioned_hash']}") print(f"Locally computed match for the retrieved blob:") print(f"KZG commitment: {result['commitment_match']}") print(f"Versioned hash: {result['versioned_hash_match']}") print() ``` Make sure you have `trusted_setup.txt` in the same directory as the script. This will print the results and tell you if the independently computed KZG commitment and blob versioned hash match the ones retrieved off the network. Example for our transaction: ```shell Shell TX:5a74bd72aeeb99e874e58b927f9a5c96665278a36b61bed69a4b09597b02edce: KZG: 9493a713dd89eb7fe295efd62545bb93bca395a84d18ecfa2c6c650cddc844ad4c1935cbe7d6830967df9d33c5a2e230 Versioned hash: 01afb6777db05b1376ccb3d0c1e842d437d2a8ad9c0acfb8247edab425fee61c Locally computed match for the retrieved blob: KZG commitment: True Versioned hash: True ``` ## Conclusion This hands-on, the-hard-way guide walked you through creating, detecting, and verifying type-3 transactions (aka blob transactions). It also showed you some of the pitfalls and how to navigate them, such as getting the slot number associated with an execution layer blob transaction, retrieving the blobs through the Nimbus client, blob data being stored for 18 days (unless you run archive nodes with Chainstack), and so on.
### About the author Director of Developer Experience @ Chainstack Talk to me all things Web3 20 years in technology | 8+ years in Web3 full time Trusted advisor helping developers navigate the complexities of blockchain infrastructure [](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake) # BNB Smart Chain: Lorentz hardfork Source: https://docs.chainstack.com/docs/bnb-lorentz-hardfork ## Main article Lorentz is an upgrade hardfork on the BNB Smart Chain mainnet that activates on April 29, 2025. Lorentz introduces a single change: [BEP-520: Short Block Interval Phase One: 1.5 seconds](https://github.com/bnb-chain/BEPs/blob/master/BEPs/BEP-520.md) ### Chainstack nodes are Lorentz-ready Chainstack nodes are prepared for the Lorentz network upgrade. Here's what you need to know: * Block time halved from 3 seconds to 1.5 seconds * Gas limit per block halved from 140 million to 70 million * Epoch extended from 200 blocks to 500 blocks * Consecutive block validation per validator increased from 4 blocks to 8 blocks The most important changes that may affect you as a dApp developer are the gas limit per block and the faster block times. The gas limit per block is an obvious one—slashing 140 million gas to 70 million gas means you can fit fewer heavy compute transactions in a block. Now let's focus on the faster block times. ## 1.5 second block time Going from 3 seconds to 1.5 seconds means you will need to ingest blocks at 2x the pre-fork speed. Chainstack infrastructure handles the nodes, so just make sure your client-side service is ready for that. Now let's have a look at the block structure.
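As a quick sanity check of the throughput math: 70 million gas every 1.5 seconds is the same gas per second as 140 million gas every 3 seconds, so overall capacity is unchanged.

```python
# Lorentz halves both the block gas limit and the block interval,
# so gas throughput per second stays the same.
pre_gas, pre_interval_ms = 140_000_000, 3_000
post_gas, post_interval_ms = 70_000_000, 1_500

# Compare gas per second exactly with integer cross-multiplication
assert pre_gas * post_interval_ms == post_gas * pre_interval_ms

gas_per_second = post_gas * 1_000 // post_interval_ms
print(gas_per_second)  # 46666666
```

What does change is latency: confirmations land twice as fast, and your ingestion pipeline sees twice as many (smaller) blocks.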
### Lorentz block structure Compare post-fork and post-node-upgrade: ```javascript "baseFeePerGas": "0x0", "blobGasUsed": "0x0", "difficulty": "0x2", "excessBlobGas": "0x0", "extraData": "0xd883010509846765746888676f312e32332e37856c696e7578000000c011f0d2f8b27fb860a72b39a61672bf9410fecd9432d18ab0285d4475cc22583b71cf485b0c35d4109d83576db2dd1244484e36a6d9133f46159499e3db3db964b9dd4b9b2c7e185dbc2e3716df7b77befbd89ee2df8dd3d66c54f4a1fad622e1480dc019e35373c8f84c840303fa92a0db41c2b264c7bf23960962bc8e3d92f3538671c28f705760f0d700b17982d0fc840303fa93a0be106c47b19b9cc0341e36e6635202c3ed9b8548511f5f6c9bff56e9168f41b5801c415589572c27ab7d9ba60074e2148896ec1c10881891a52f43ead9084310fa58d5d44846a51c87a42d0f94194df1dc6e226747a693e64694bf9dce51610cd600", "gasLimit": "0x42c1d80", "gasUsed": "0xe464f9", "hash": "0xe58794d2ee429567b8107cbad4a67eb70929616748c4a0f4d76df436e5c56f6b", "logsBloom": "0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000", "milliTimestamp": "0x1965c14398c", "miner": "0xd447b49cd040d20bc21e49ffea6487f5638e4346", "mixHash": "0x00000000000000000000000000000000000000000000000000000000000001f4", "nonce": "0x0000000000000000", "number": "0x303fa94", "parentBeaconBlockRoot": "0x0000000000000000000000000000000000000000000000000000000000000000", "parentHash": "0xbe106c47b19b9cc0341e36e6635202c3ed9b8548511f5f6c9bff56e9168f41b5", "receiptsRoot": "0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421", "requestsHash": "0xe3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855", "sha3Uncles": 
"0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347", "size": "0x37f", "stateRoot": "0x7c7abdcf4b9b0e660e5a3002f9946be0be54b1e99e3208e621598b443f254e37", "timestamp": "0x6807302f", "totalDifficulty": "0x6043378", "transactions": [], "transactionsRoot": "0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421", "uncles": [], "withdrawals": [], "withdrawalsRoot": "0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421" ``` And pre-fork and pre-node-upgrade: ```javascript "baseFeePerGas": "0x0", "blobGasUsed": "0x0", "difficulty": "0x2", "excessBlobGas": "0x0", "extraData": "0xd883010508846765746888676f312e32342e30856c696e7578000000ce18f5d3f8b5831effffb8609291fcd043ec871f0f8c00a11ef5a4e74164698519f3b391ed741010d736594d024b346d36092761a54899cdee613aa5063c028128f9a26884426ae237ea09ff38735645774849ccaf3e644655dcecf65e723ea971cb7aeeab8a25dd1185fa52f84c8402e52cffa04a0bda48a3d029c68c6663306c341213aad0ba3ff5d16f8c3251a59895641b4a8402e52d00a05e4a10dd4dcaa30a1eb6c194ae42abbda4f49632a0bbd37e4d6b48b45c9093f48030faf0179c11484aa23403d9619926beda9cb93ea4fc6e40e5e03ab46b64839b2474871bc03cd01bb57497c3fd7c97af75a42f7095dda366761ec21014bdb6df01", "gasLimit": "0x8583b00", "gasUsed": "0xe464f9", "hash": "0xe58794d2ee429567b8107cbad4a67eb70929616748c4a0f4d76df436e5c56f6b", "logsBloom": "0x00000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000", "miner": "0xd447b49cd040d20bc21e49ffea6487f5638e4346", "mixHash": "0x0000000000000000000000000000000000000000000000000000000000000000", "nonce": 
"0x0000000000000000", "number": "0x303fa94", "parentBeaconBlockRoot": "0x0000000000000000000000000000000000000000000000000000000000000000", "parentHash": "0xbe106c47b19b9cc0341e36e6635202c3ed9b8548511f5f6c9bff56e9168f41b5", "receiptsRoot": "0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421", "sha3Uncles": "0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347", "size": "0x37f", "stateRoot": "0x7c7abdcf4b9b0e660e5a3002f9946be0be54b1e99e3208e621598b443f254e37", "timestamp": "0x6807302f", "totalDifficulty": "0x6043378", "transactions": [], "transactionsRoot": "0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421", "uncles": [], "withdrawalsRoot": "0x56e81f171bcc55a6ff8345e692c0f86e5b48e01b996cadc001622fb5e363b421" ``` | Field | Pre-fork (v1.5.7) | Post-fork (v1.5.11 / Lorentz) | Comment | | ---------------- | ------------------------------------------------- | ----------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | `gasLimit ` | 140,000,000 gas (0x8583b00) | 70,000,000 gas (0x42c1d80) | Block gas limit is halved, but blocks are produced twice as often (every 1.5s instead of 3s). This maintains the same overall throughput while improving transaction confirmation times. | | `milliTimestamp` | Not present | Added (e.g., 0x1965c14398c) | Introduced by BEP-520 to support 1.5-second block spacing without breaking the EVM's second-based timestamp system. | | `mixHash` | All zeros (unused in PoSA; legacy field from PoW) | Low 2 bytes contain sub-second remainder (0x01f4 = 500ms, 0x0000 = 0ms, etc.) | Allows clients to reconstruct the full millisecond timestamp while maintaining the same header size. 
| | `requestsHash` | Not present | Present (e3b0...b855 = empty SHA-256) | Introduced in [EIP-7685](https://github.com/ethereum/EIPs/blob/master/EIPS/eip-7685.md) and implemented on BSC as part of [BEP-466](https://github.com/bnb-chain/BEPs/blob/master/BEPs/BEP-466.md) and exposed with the upgrade node version. | Let's have a look at `milliTimestamp`. The `milliTimestamp` field is introduced to keep the original `timestamp` in integer seconds. With the switch from 3 seconds to 1.5 seconds, the value becomes more granular, allowing for precise millisecond-level timing while maintaining backward compatibility with the existing second-based timestamp system. The original `timestamp` value is preserved post-fork, and the previously unused `mixHash` value now contains the last bytes of the sub-second timestamp to complement the original `timestamp` value. This means you can retrieve the post-fork block timestamp either via the `milliTimestamp` field or via the `timestamp` + `mixHash` calculation. The `timestamp` + `mixHash` calculation algorithm: 1. Convert timestamp to decimal seconds: `0x6807302f` > 1745301551 s 2. Multiply by 1000 to get milliseconds: 1745301551 × 1000 > 1745301551000 ms 3. Extract the ms remainder from `mixHash`: `0x…01f4` > `0x1f4` > 500 ms 4. Add them together: 1745301551000 + 500 > 1745301551500 ms 5. Convert back to hex if desired `0x1965c14398c` Or in Python: ```python timestamp = int("0x6807302f", 16) # 1745301551 s ms = int("0x1f4", 16) # 500 ms milli_ts = timestamp * 1000 + ms # 1745301551500 hex(milli_ts) # '0x1965c14398c' ``` And there you have it. Unless your service heavily relies on block numbers or block timestamps, you don't have much to worry about. 
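The same calculation can be wrapped into a small helper that takes the header fields directly, extracting the millisecond remainder from the low two bytes of the full 32-byte `mixHash` value (a sketch using the example block's values from above):

```python
def milli_timestamp(timestamp_hex: str, mix_hash_hex: str) -> int:
    # Seconds come from the regular `timestamp` field; the sub-second
    # millisecond remainder lives in the low 2 bytes of `mixHash` (BEP-520).
    seconds = int(timestamp_hex, 16)
    ms_remainder = int(mix_hash_hex, 16) & 0xFFFF
    return seconds * 1000 + ms_remainder

block_timestamp = "0x6807302f"
block_mix_hash = "0x00000000000000000000000000000000000000000000000000000000000001f4"
print(hex(milli_timestamp(block_timestamp, block_mix_hash)))  # 0x1965c14398c
```

This matches the `milliTimestamp` field of the post-fork block shown earlier, so either retrieval path gives you the same value.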
### About the author Director of Developer Experience @ Chainstack Talk to me all things Web3 20 years in technology | 8+ years in Web3 full time years experience Trusted advisor helping developers navigate the complexities of blockchain infrastructure [](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake) # BNB Smart Chain methods Source: https://docs.chainstack.com/docs/bnb-smart-chain-methods See also [interactive BNB Smart Chain API call examples](/reference/getting-started-bnb-chain). | Method | Availability | Comment | | ---------------------------------------- | --------------------------------------------- | ------- | | eth\_accounts | | | | eth\_blockNumber | | | | eth\_call | | | | eth\_chainId | | | | eth\_estimateGas | | | | eth\_feeHistory | | | | eth\_gasPrice | | | | eth\_getAccount | | | | eth\_getBalance | | | | eth\_getBlockByHash | | | | eth\_getBlockByNumber | | | | eth\_getBlockReceipts | | | | eth\_getBlockTransactionCountByHash | | | | eth\_getBlockTransactionCountByNumber | | | | eth\_getCode | | | | eth\_getFilterChanges | | | | eth\_getFilterLogs | | | | eth\_getLogs | | | | eth\_getProof | | | | eth\_getStorageAt | | | | eth\_getTransactionByBlockHashAndIndex | | | | eth\_getTransactionByBlockNumberAndIndex | | | | eth\_getTransactionByHash | | | | eth\_getTransactionCount | | | | eth\_getTransactionReceipt | | | | eth\_getUncleCountByBlockHash | | | | eth\_getUncleCountByBlockNumber | | | | eth\_maxPriorityFeePerGas | | | | eth\_newBlockFilter | | | | eth\_newFilter | | | | eth\_newPendingTransactionFilter | | | | eth\_signTransaction | | | | eth\_subscribe | | | | eth\_syncing | | | | eth\_uninstallFilter | | | | eth\_unsubscribe | | | | eth\_sendRawTransaction | | | | net\_listening | | | | net\_peerCount | | | | net\_version | | | | txpool\_content | | | | txpool\_inspect | | | | txpool\_contentFrom | | | | txpool\_status | | | | web3\_clientVersion | | | | 
web3\_sha3 | | | | erigon\_blockNumber | | | | erigon\_forks | | | | erigon\_getBlockByTimestamp | | | | erigon\_getBlockReceiptsByBlockHash | | | | erigon\_getHeaderByHash | | | | erigon\_getHeaderByNumber | | | | erigon\_getLatestLogs | | | | erigon\_getLogsByHash | | | | debug\_getBadBlocks | | | | debug\_storageRangeAt | | | | debug\_getTrieFlushInterval | | | | debug\_traceBlock | | | | debug\_traceBlockByHash | | | | debug\_traceBlockByNumber | | | | debug\_traceCall | | | | debug\_traceTransaction | | | | trace\_block | | | | trace\_call | | | | trace\_callMany | | | | trace\_filter | | | | trace\_rawTransaction | | | | trace\_replayBlockTransactions | | | | trace\_replayTransaction | | | | trace\_transaction | | | | parlia\_getValidators | | | | parlia\_getSnapshot | | | | admin\_addPeer | | | | admin\_addTrustedPeer | | | | admin\_datadir | | | | admin\_exportChain | | | | admin\_importChain | | | | admin\_nodeInfo | | | | admin\_peerEvents | | | | admin\_peers | | | | admin\_removePeer | | | | admin\_removeTrustedPeer | | | | admin\_startHTTP | | | | admin\_startWS | | | | admin\_stopHTTP | | | | admin\_stopWS | | | # BNB Smart Chain Trader Nodes with Warp transactions Source: https://docs.chainstack.com/docs/bnb-trader-nodes Warp transactions propagate through [bloXroute's high-speed transaction relay network](https://docs.bloxroute.com/bdn-architecture). This makes your transactions available for validators to pick up and include in blocks much faster than regular propagation. Each extra transaction (not request, just the transaction) consumed is billed separately. For details, see [Pricing](https://chainstack.com/pricing/) . ## Benefits with Chainstack Chainstack Trader nodes with Warp transactions combine reliable nodes for pre-transaction operations with bloXroute's high-speed propagation for the transaction itself. Switching to a Chainstack Trader node is as simple as changing the endpoint in your code. 
All calls go through Chainstack's infrastructure, except for `eth_sendRawTransaction`, which is routed directly to bloXroute. ## Sample use case 1. Token liquidity is deployed on BNB Smart Chain. 2. You receive a signal through your BNB Smart Chain node by monitoring the PancakeSwap contracts. 3. Your bot acts on the signal, sending a transaction via the Chainstack Trader node with Warp transactions. 4. The transaction is quickly propagated, increasing your chances of winning the race. ## Availability and usage * Available from the [paid plans](https://chainstack.com/pricing). * Warp transactions (not requests, just the transactions) consumed are billed separately. For details, see [Pricing](https://chainstack.com/pricing/) with the [pay-as-you-go](/docs/manage-your-billing) setting enabled. * Only `eth_sendRawTransaction` calls are consumed as Warp transactions. * Deploy nodes close to your bot for best performance. See also [Sending Trader node warp transaction with web3.js, ethers.js, web3.py, and ethClient.go](/docs/sending-warp-transaction-with-web3js-ethersjs-web3py-and-ethclientgo). # BNB Smart Chain tooling Source: https://docs.chainstack.com/docs/bsc-tooling ## BNB Smart Chain client Interact with your BNB Smart Chain node using the [BNB Smart Chain client](https://docs.bnbchain.org/docs/validator/fullnode). Install the [BNB Smart Chain client](https://docs.bnbchain.org/docs/validator/fullnode). Use the `geth attach` command with the node endpoint. ```bash Shell geth attach YOUR_CHAINSTACK_ENDPOINT ``` where YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). Invoke any methods from the [Web3 JavaScript API](https://web3js.readthedocs.io/).
The example below demonstrates how to get the balance of an address in wei and convert it to ether: ```javascript Javascript > web3.fromWei(web3.eth.getBalance("0xde0b295669a9fd93d5f28d9ec85e40f4cb697bae")) 642538.078574759898951277 ``` ## GraphQL You can use GraphQL on a dedicated node on the [paid plans](https://chainstack.com/pricing/). ### UI You can query data using the graphical interface. On Chainstack, navigate to your dedicated BNB Smart Chain node. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). Hover over **GraphQL IDE URL** and click **Open**. In the graphical interface that opens, run a GraphQL query. Example to get the latest block number: ```graphql GraphQL { block { number } } ``` ### Node.js You can build a web app to query data using Node.js and [axios](https://www.npmjs.com/package/axios): ```javascript Javascript const axios = require('axios'); const main = async () => { try { const result = await axios.post( 'YOUR_CHAINSTACK_ENDPOINT', { query: ` { block { number } } ` } ); console.log(result.data); } catch(error) { console.error(error); } } main(); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node GraphQL endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). * `query` — your GraphQL query. In this case, to get the latest block number. See also [Using GraphQL with EVM-compatible nodes](https://support.chainstack.com/hc/en-us/articles/4409604331161-Using-GraphQL-with-EVM-compatible-nodes). ## MetaMask On [node access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**. ## Truffle Configure [Truffle Suite](https://truffleframework.com) to deploy contracts to your BNB Smart Chain nodes. Install [Truffle Suite](https://truffleframework.com), [HD Wallet-enabled Web3 provider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider), and create a project.
Create a new environment in `truffle-config.js`, add your mnemonic phrase generated by [a wallet](https://ethereum.org/en/wallets/find-wallet/) and the BNB Smart Chain endpoint instead of YOUR\_CHAINSTACK\_ENDPOINT: ```javascript Javascript const HDWalletProvider = require("@truffle/hdwallet-provider"); const mnemonic = 'pattern enroll upgrade ...'; ... module.exports = { networks: { chainstack: { provider: () => new HDWalletProvider(mnemonic, "YOUR_CHAINSTACK_ENDPOINT"), network_id: "*" }, } }; ``` ## Hardhat Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your BNB Smart Chain nodes. Install [Hardhat](https://hardhat.org/) and create a project. Create a new environment in `hardhat.config.js`: ```javascript Javascript require("@nomiclabs/hardhat-waffle"); ... module.exports = { solidity: "0.7.3", networks: { chainstack: { url: "YOUR_CHAINSTACK_ENDPOINT", accounts: ["YOUR_PRIVATE_KEY"] }, } }; ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). * YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract. Run `npx hardhat run scripts/deploy.js --network chainstack` and Hardhat will deploy using Chainstack. See also [Forking EVM-compatible mainnet with Hardhat](https://support.chainstack.com/hc/en-us/articles/900004242406). ## Remix IDE To make Remix IDE interact with the network through a Chainstack node: Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Interacting through MetaMask](#metamask). In Remix IDE, navigate to the **Deploy** tab. Select **Injected Provider - MetaMask** in **Environment**. This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node.
## web3.js Build DApps using [web3.js](https://github.com/ethereum/web3.js/) and BNB Smart Chain nodes deployed with Chainstack. Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP or WebSocket. ### HTTP Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password ### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node WSS endpoint protected either with the key or password ## web3.py Build DApps using [web3.py](https://github.com/ethereum/web3.py) and BNB Smart Chain nodes deployed with Chainstack. Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). 
### HTTP Use the `HTTPProvider` to connect to your node endpoint and get the latest block number: ```python Key Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.blockNumber) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.blockNumber) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * HOSTNAME — your node HTTPS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). ### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```python Key Protected from web3 import Web3 web3 = Web3(Web3.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.blockNumber) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.WebsocketProvider('wss://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.blockNumber) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password * HOSTNAME — your node WSS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). See also [WebSocket connection to an EVM node](https://support.chainstack.com/hc/en-us/articles/900001918763-WebSocket-connection-to-an-Ethereum-node). ## web3.php Build DApps using [web3.php](https://github.com/web3p/web3.php) and BNB Smart Chain nodes deployed with Chainstack. Install [web3.php](https://github.com/web3p/web3.php). 
Connect over HTTP: ```php Php <?php require_once "vendor/autoload.php"; use Web3\Web3; $web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT'); ?> ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. Use [JSON-RPC methods](https://eth.wiki/json-rpc/API) to interact with the node. Example to get the latest block number: ```php Key Protected <?php require_once "vendor/autoload.php"; use Web3\Web3; $web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT'); $eth = $web3->eth; $eth->blockNumber(function ($err, $data) { print "$data \n"; }); ?> ``` ## web3j Build DApps using [web3j](https://github.com/web3j/web3j) and BNB Smart Chain nodes deployed with Chainstack. Use the `HttpService` object to connect to your node endpoint. Example to get the latest block number: ```java Java package getLatestBlock; import java.io.IOException; import java.util.logging.Level; import java.util.logging.Logger; import org.web3j.protocol.Web3j; import org.web3j.protocol.core.DefaultBlockParameterName; import org.web3j.protocol.core.methods.response.EthBlock; import org.web3j.protocol.exceptions.ClientConnectionException; import org.web3j.protocol.http.HttpService; import okhttp3.Authenticator; import okhttp3.Credentials; import okhttp3.OkHttpClient; import okhttp3.Request; import okhttp3.Response; import okhttp3.Route; public final class App { private static final String USERNAME = "USERNAME"; private static final String PASSWORD = "PASSWORD"; private static final String ENDPOINT = "ENDPOINT"; public static void main(String[] args) { try { OkHttpClient.Builder clientBuilder = new OkHttpClient.Builder(); clientBuilder.authenticator(new Authenticator() { @Override public Request authenticate(Route route, Response response) throws IOException { String credential = Credentials.basic(USERNAME, PASSWORD); return response.request().newBuilder().header("Authorization", credential).build(); } }); HttpService service = new HttpService(ENDPOINT, clientBuilder.build(), false); Web3j web3 = Web3j.build(service); EthBlock.Block latestBlock = web3.ethGetBlockByNumber(DefaultBlockParameterName.LATEST, false).send().getBlock(); System.out.println("Latest Block: #" + latestBlock.getNumber());
} catch (IOException | ClientConnectionException ex) { Logger.getLogger(App.class.getName()).log(Level.SEVERE, null, ex); } } } ``` where * ENDPOINT — your node HTTPS endpoint protected either with the key or password * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [the full code on GitHub](https://github.com/chainstack/web3j-getLatestBlock). ## ethers.js Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and BNB Smart Chain nodes deployed with Chainstack. Install [ethers.js](https://www.npmjs.com/package/ethers). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number: ```javascript Key Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` ```javascript Password Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT', user: 'USERNAME', password: 'PASSWORD' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) * NETWORK\_ID — BNB Smart Chain network ID: * Mainnet: `56` * Testnet: `97` See [node access details](/docs/manage-your-node#view-node-access-and-credentials). 
### WebSocket Use the `WebSocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const { ethers } = require("ethers"); const provider = new ethers.providers.WebSocketProvider('YOUR_CHAINSTACK_ENDPOINT', NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password * NETWORK\_ID — BNB Smart Chain network ID: * Mainnet: `56` * Testnet: `97` See [node access details](/docs/manage-your-node#view-node-access-and-credentials). ## Brownie Install [Brownie](https://eth-brownie.readthedocs.io/en/stable/install.html). Use the `brownie networks add` command with the node endpoint: ```shell Shell brownie networks add "BNB Smart Chain" ID name="NETWORK_NAME" host=YOUR_CHAINSTACK_ENDPOINT chainid=NETWORK_ID ``` where * ID — any name that you will use as the network tag to run a deployment. For example, `bsc-mainnet`. * NETWORK\_NAME — any name that you want to identify the network by in the list of networks. For example, **Mainnet (Chainstack)**. * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint * NETWORK\_ID — BNB Smart Chain network ID: * Mainnet: `56` * Testnet: `97` Example to run the deployment script: ```shell Shell brownie run deploy.py --network bsc-mainnet ``` ## Foundry Install [Foundry](https://github.com/gakonst/foundry/). Use `--rpc-url` to run the operation through your Chainstack node. ### Forge Use [forge](https://github.com/gakonst/foundry/tree/master/forge) to develop, test, and deploy your smart contracts.
To deploy a contract: ```shell Shell forge create CONTRACT_NAME --contracts CONTRACT_PATH --private-key PRIVATE_KEY --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where * CONTRACT\_NAME — name of the contract in the Solidity source code * CONTRACT\_PATH — path to your smart contract * PRIVATE\_KEY — the private key to your funded account that you will use to deploy the contract * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password ### Cast Use [cast](https://github.com/gakonst/foundry/tree/master/cast) to interact with the network and the deployed contracts. To get the latest block number: ```shell Shell cast block-number --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password # BNB Smart Chain: BEP-1155 contract with Truffle & OpenZeppelin Source: https://docs.chainstack.com/docs/bsc-tutorial-bep-1155-contract-with-truffle-and-openzeppelin * You’ll create a BEP-1155 contract combining fungible and non-fungible tokens in one deployment. * You’ll rely on OpenZeppelin libraries and Truffle to compile, deploy, and verify your tokens on BNB Smart Chain testnet via Chainstack. * You’ll flatten your code for a smooth verification on BscScan. * By the end, you’ll have fungible tokens (0) and NFT tokens (1) all in a single smart contract. ## Main article BEP-1155 is the multi-token standard for smart contracts that combines the fungibility of BEP-20 and the non-fungibility of BEP-721 in one contract. With a single BEP-1155 contract, you can deploy an ecosystem that has both fungible tokens (currency) and non-fungible tokens (NFTs). In this tutorial, you will: * Create a BEP-1155 contract that has a supply of fungible tokens and one non-fungible token. * Deploy the contract on the BNB Smart Chain testnet through a node deployed with Chainstack. * Interact with the deployed contract. 
## Prerequisites * [Chainstack account](https://console.chainstack.com/) to deploy a BNB Smart Chain node. * [Truffle Suite](https://trufflesuite.com/) to create and deploy contracts. * [OpenZeppelin Contracts](https://docs.openzeppelin.com/contracts/4.x/) to use the audited [ERC-1155 libraries](https://docs.openzeppelin.com/contracts/4.x/erc1155) to create your BEP-1155 contract. ## Overview To get from zero to a deployed BEP-1155 contract on the BNB Smart Chain testnet, do the following: With Chainstack, create a public chain project. With Chainstack, join the BNB Smart Chain testnet. With Chainstack, access your BNB Smart Chain node credentials. With OpenZeppelin, create a BEP-1155 contract. With Truffle, compile and deploy the contract through your BNB Smart Chain node. ## Step-by-step ### Create a public chain project See [Create a project](/docs/manage-your-project#create-a-project). ### Join the BNB Smart Chain testnet See [Join a public network](/docs/manage-your-networks#join-a-public-network). ### Get your BNB Smart Chain node access and credentials See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials). ### Install OpenZeppelin Contracts See [OpenZeppelin Contracts](https://docs.openzeppelin.com/contracts/4.x/). ### Install Truffle Suite See [Truffle Suite: Installation](https://trufflesuite.com/docs/truffle/how-to/install/). ### Create the contract In your contract directory, initialize Truffle: ```bash Shell truffle init ``` This will generate the Truffle boilerplate structure: ```bash Shell . ├── contracts │ └── .gitkeep ├── migrations │ └── .gitkeep ├── test │ └── .gitkeep └── truffle-config.js ``` Go to the `contracts` directory. In the directory, create your BEP-1155 contract: `BNBSmartChain1155.sol`. 
```solidity solidity //SPDX-License-Identifier: MIT pragma solidity ^0.8; import "@openzeppelin/contracts/token/ERC1155/ERC1155.sol"; contract BNBSmartChain1155 is ERC1155 { uint256 public constant FUNGIBLE = 0; uint256 public constant NON_FUNGIBLE = 1; constructor() ERC1155("JSON_URI") { _mint(msg.sender, FUNGIBLE, 100, ""); _mint(msg.sender, NON_FUNGIBLE, 1, ""); } } ``` The contract implementation is as follows: * The contract uses the OpenZeppelin audited [ERC-1155 contract templates](https://docs.openzeppelin.com/contracts/4.x/erc1155). * The contract creates two tokens: 100 fungible units of the currency called `FUNGIBLE` and 1 non-fungible unit called `NON_FUNGIBLE`. In the BEP-1155 standard, setting a token issuance to `1` makes it non-fungible. * The contract also has `JSON_URI`, which is a locator for the metadata of your tokens hosted externally. For example, ``. See [EIP-1155](https://github.com/ethereum/eips/issues/1155) for details. Create `2_deploy_contracts.js` in the `migrations` directory. ```js JavaScript module.exports = function(deployer) { var BNBSmartChain1155 = artifacts.require("./BNBSmartChain1155.sol"); deployer.deploy(BNBSmartChain1155); }; ``` This will create the contract deployment instructions for Truffle. ### Compile and deploy the contract Install `HDWalletProvider`. [HDWalletProvider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider) is Truffle's separate npm package used to sign transactions.
Run: ```bash Shell npm install @truffle/hdwallet-provider ``` Edit `truffle-config.js` to add: * `HDWalletProvider` * Your BNB Smart Chain node access and credentials * Your BNB Smart Chain account that you will use to deploy the contract ```js JavaScript const HDWalletProvider = require("@truffle/hdwallet-provider"); const private_key = 'PRIVATE_KEY'; module.exports = { networks: { testnet: { provider: () => new HDWalletProvider(private_key, "YOUR_CHAINSTACK_ENDPOINT"), network_id: 97, confirmations: 3, timeoutBlocks: 200, skipDryRun: true } }, compilers: { solc: { version: "0.8.2", } } }; ``` where * `testnet` — any network name that you will pass to the `truffle migrate --network` command. * `HDWalletProvider` — Truffle's custom provider to sign transactions. * PRIVATE\_KEY — the private key of your BNB Smart Chain account that will deploy the contract. The account must have enough BNB funds to run the deployment. See also [BNB Smart Chain Faucet](https://faucet.chainstack.com/bnb-testnet-faucet). * YOUR\_CHAINSTACK\_ENDPOINT — your Chainstack node HTTPS endpoint. See also [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials) and [BNB Smart Chain tooling](/docs/bsc-tooling). * `network_id` — the network ID of the BNB Smart Chain network: mainnet is `56`, testnet is `97`. * `solc` — the Solidity compiler version that Truffle must use. OpenZeppelin contracts have a higher version Solidity compiler requirement than the default Truffle installation, hence you must provide a specific compiler version. Run: ```bash Shell truffle migrate --network testnet ``` This will engage `2_deploy_contracts.js` and deploy the contract to the BNB Smart Chain testnet as specified in `truffle-config.js`. ### Interact with the contract Once your contract is deployed, you can view it online at [BscScan testnet](https://testnet.bscscan.com/). 
Network explorers, including BscScan, do not display the NFT standards by default, so you will have to perform additional steps to check the balance of your issued tokens. Namely, you must verify the contract on BscScan to interact with it online. ### Flatten your contract code Since your BEP-1155 contract uses imported OpenZeppelin libraries, you must put all the imports into one `.sol` file so that BscScan can verify it. Install [Truffle Flattener](https://www.npmjs.com/package/truffle-flattener). Run: ```bash Shell npm install truffle-flattener ``` Flatten the contract. In the `contracts` directory, run: ```bash Shell npx truffle-flattener BNBSmartChain1155.sol > FlatBNBSmartChain1155.sol ``` Clean up the licensing information. The flattened contract will have the same licensing note imported from each of the files. Multiple licensing notes in one file break the BscScan verification, so you have to leave one licensing note for the entirety of the flattened contract. The easiest way to clean up is to search for the `SPDX` mentions in the file and remove all of them except for the very first one. ### Verify the deployed contract on BscScan At this point, you have your flattened and cleaned-up contract ready for the BscScan verification. Go to [BscScan testnet](https://testnet.bscscan.com/). Find your deployed contract. The address of your contract should have been printed by Truffle at the end of the deployment in the `contract address` field. On the contract page on BscScan, click **Contract** > **Verify and Publish**. In **Compiler Type**, select **Solidity (Single file)**. In **Compiler Version**, select **v0.8.2**. This is the version this tutorial used to compile the contract. Click **Continue**. Keep the **Optimization** option set to **No** as Truffle does not use optimization by default. Paste the entirety of your flattened `.sol` contract in the **Enter the Solidity Contract Code below** field. Click **Verify and Publish**.
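As an aside to the flattening step above, the SPDX clean-up can be scripted instead of done by hand. A minimal sketch in Python; the file name assumes the `truffle-flattener` output above, so adjust it to your own:

```python
# Keep only the first SPDX-License-Identifier line in a flattened contract,
# since repeated license notes break the BscScan verification.
def dedupe_spdx(source: str) -> str:
    kept_first = False
    result = []
    for line in source.splitlines():
        if "SPDX-License-Identifier" in line:
            if kept_first:
                continue  # drop every repeated license note
            kept_first = True
        result.append(line)
    return "\n".join(result)

if __name__ == "__main__":
    # File produced by the truffle-flattener step above.
    path = "FlatBNBSmartChain1155.sol"
    with open(path) as f:
        flattened = f.read()
    with open(path, "w") as f:
        f.write(dedupe_spdx(flattened))
```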
BscScan will take a few seconds to compile your contract, verify, and publish it. ### Check the balance Now that your BEP-1155 contract is verified, you can check your balance of the issued tokens on BscScan. On BscScan, on your contract, click **Contract**. Click **Read Contract**. Scroll to the **balanceOf** field. In the **account** field, provide the address of the account you used to deploy the contract. In the **id** field, put `0` to check your fungible balance and put `1` to check your non-fungible balance. ## Conclusion This tutorial guided you through the basics of creating and deploying a contract in the BEP-1155 multi-token standard. The BEP-1155 is useful in that it can deploy an ecosystem of currencies and NFTs in one go—you can add however many fungible and non-fungible tokens to your contract. You also verified your deployed contract online and interacted with it. This tutorial uses testnet, however, the exact same instructions and sequence will work on the mainnet as well. ### About the author Director of Developer Experience @ Chainstack Talk to me all things Web3 20 years in technology | 8+ years in Web3 full time years experience Trusted advisor helping developers navigate the complexities of blockchain infrastructure [](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake) # Celo: Build a simple voting dApp with Foundry, Next.js, and Web3.js Source: https://docs.chainstack.com/docs/celo-build-a-simple-voting-dapp-with-foundry-nextjs-and-web3js ### `TLDR` * You’ll build a simple voting DApp on Celo: a Solidity contract (Voting.sol) deployed with Foundry, and a React/Next.js front-end with web3.js to interact via MetaMask. * The Solidity contract defines two initial candidates (Luna, Orion) and basic voting logic, tested with Foundry’s test suite. * The front end fetches candidate data, checks the chain/network, and allows users to cast votes. 
* This serves as a baseline for deeper improvements (security, Sybil protection, etc.) before production. ## Introduction to Celo Celo is an open-source blockchain ecosystem that makes decentralized financial (DeFi) tools and services accessible to anyone with a smartphone. It is designed to support financial inclusion and provide a platform for decentralized applications (DApps) with a particular emphasis on mobile usability. ### NFP Not for production (NFP), obviously. Feel free to take the source and modify it to your needs. We assume no responsibility for the code. Moreover, this is a very rough, unaudited contract. ## Project overview: Simple voting DApp In this tutorial, we'll build a simple voting DApp on the Celo blockchain. The project involves deploying a smart contract using Foundry and creating a simple user interface with Next.js and web3.js to interact with MetaMask. ## Prerequisites * [Chainstack account](https://console.chainstack.com/) to deploy a Celo node * [Foundry](https://book.getfoundry.sh/) for smart contract development * [Next.js](https://nextjs.org/) for the front-end framework * [web3.js](https://docs.web3js.org/) for interacting with the blockchain * Some Celo tokens ## Step-by-step ### Get a Celo node ### Get your own node endpoint today [Start for free](https://console.chainstack.com/) and get your app to production levels immediately. No credit card required. You can sign up with your GitHub, X, Google, or Microsoft account. ### Smart contract development Use Foundry to develop, compile, and deploy a simple voting smart contract. Install Foundry on your machine; you can [follow the instructions in the Foundry book](https://book.getfoundry.sh/getting-started/installation). Once installed, create a new directory for your project and initialize a Foundry project. ``` forge init foundry ``` This will create a Foundry project with the following layout: ``` .
├── README.md ├── foundry.toml ├── lib │   └── forge-std │   ├── LICENSE-APACHE │   ├── LICENSE-MIT │   ├── README.md │   ├── foundry.toml │   ├── package.json │   ├── scripts │   ├── src │   └── test ├── script │   └── Counter.s.sol ├── src │   └── Counter.sol └── test └── Counter.t.sol ``` In the `src` directory, rename the sample smart contract to `Voting.sol` and paste the following contract: ### Disclaimer This Solidity smart contract is provided as an educational example and is not intended for production use. The code is simplified for clarity and lacks several critical features required for a secure and efficient production-grade application. ```sol Voting.sol // SPDX-License-Identifier: MIT // Celo only supports up to 0.8.19 due to the new Push0 opcode pragma solidity >=0.8.0 <=0.8.19; /// @title A simple voting contract /// @notice This contract allows users to vote for pre-defined candidates /// @dev This contract uses mappings to store candidate and voter information contract Voting { struct Candidate { string name; uint voteCount; } mapping(uint => Candidate) public candidates; mapping(address => bool) public voters; uint public candidatesCount; event CandidateAdded(uint indexed candidateId, string name); event Voted(address indexed voter, uint indexed candidateId); /// @notice Constructor to initialize the contract with default candidates constructor() { addCandidate("Luna"); addCandidate("Orion"); } /// @notice Adds a new candidate to the candidates list /// @dev This function is private and can only be called within the contract /// @param _name The name of the candidate to add function addCandidate(string memory _name) private { candidatesCount++; candidates[candidatesCount] = Candidate(_name, 0); emit CandidateAdded(candidatesCount, _name); } /// @notice Allows a user to vote for a candidate /// @dev This function checks if the voter has already voted and if the candidate ID is valid /// @param _candidateId The ID of the candidate to vote for 
function vote(uint _candidateId) public { require(!voters[msg.sender], "Already voted."); require( _candidateId > 0 && _candidateId <= candidatesCount, "Invalid candidate." ); voters[msg.sender] = true; candidates[_candidateId].voteCount++; emit Voted(msg.sender, _candidateId); } /// @notice Returns the name and vote count of a candidate /// @dev This function retrieves candidate information based on the candidate ID /// @param _candidateId The ID of the candidate to retrieve /// @return name The name of the candidate /// @return voteCount The vote count of the candidate function getCandidate( uint _candidateId ) public view returns (string memory name, uint voteCount) { Candidate memory candidate = candidates[_candidateId]; return (candidate.name, candidate.voteCount); } } ``` Follow the comments in the smart contract to understand the implementation; here is a quick breakdown. #### TL;DR smart contract breakdown This Solidity smart contract implements a simple voting system. Here's a concise breakdown: 1. **Contract Name**: `Voting` * Implements a basic voting mechanism. 2. **Key Components**: * **Candidate Struct**: Represents a candidate with a `name` and `voteCount`. * **Mappings**: * `candidates`: Maps a candidate ID to a `Candidate` struct. * `voters`: Maps an address to a boolean indicating if the address has voted. 3. **State Variables**: * `candidatesCount`: Tracks the number of candidates. 4. **Events**: * `CandidateAdded`: Emitted when a new candidate is added. * `Voted`: Emitted when a vote is cast. 5. **Constructor**: * Initializes the contract by adding two default candidates: "Luna" and "Orion". 6. **Functions**: * **`addCandidate`**: * Private function to add a new candidate. * Increments `candidatesCount` and updates the `candidates` mapping. * Emits the `CandidateAdded` event. * **`vote`**: * Allows a user to vote for a candidate. * Checks that the voter has not already voted and that the candidate's ID is valid. 
* Updates the `voters` mapping to mark the address as having voted. * Increments the candidate's `voteCount`. * Emits the `Voted` event. * **`getCandidate`**: * Returns the name and vote count of a candidate by ID. #### Functionality summary * **Adding Candidates**: Candidates can only be added internally via the `addCandidate` function, which is called in the constructor to add initial candidates. * **Voting**: Users can vote once for a candidate by providing the candidate's ID. The contract ensures each user votes only once and only for valid candidates. * **Retrieving Candidate Info**: Users can get a candidate's name and vote count by providing the candidate's ID. This smart contract is designed for a simple voting scenario where users vote for pre-defined candidates, and the results are publicly accessible. #### Test smart contract Now that we have a contract let's implement a simple test script within Foundry. In the `test` directory, rename the current sample script to `Voting.t.sol` and paste the following script: ```sol Voting.t.sol // SPDX-License-Identifier: MIT pragma solidity ^0.8.19; import {Test} from "../lib/forge-std/src/Test.sol"; import {Voting} from "../src/Voting.sol"; contract VotingTest is Test { Voting voting; // The setUp function runs before each test function // It creates a new instance of the Voting contract to ensure each test starts with a fresh contract function setUp() public { voting = new Voting(); } // This test checks that a voter cannot vote more than once function testCannotVoteTwice() public { // Cast the first vote for candidate with ID 1 voting.vote(1); // Expect the next vote attempt from the same voter to revert with "Already voted." 
message vm.expectRevert(bytes("Already voted.")); // Attempt to vote again for the same candidate, which should fail voting.vote(1); } // This test checks that a voter cannot vote for an invalid candidate function testCannotVoteInvalidCandidate() public { // Expect the vote attempt for an invalid candidate ID (3) to revert with "Invalid candidate." message // The contract is deployed with 2 candidates vm.expectRevert(bytes("Invalid candidate.")); // Attempt to vote for a non-existent candidate, which should fail voting.vote(3); } } ``` This basic test script checks that an address cannot vote twice and that a user cannot vote for a candidate not on the list. Note how this contract lacks any real Sybil protection; this is an improvement you can add. Move your terminal into the Foundry project directory and run the test command. ``` forge test ``` This will run the test script and you should see all the tests pass. ``` [⠘] Compiling... [⠔] Compiling 1 files with 0.8.19 [⠑] Solc 0.8.19 finished in 1.69s Compiler run successful! Running 2 tests for test/Voting.t.sol:VotingTest [PASS] testCannotVoteInvalidCandidate() (gas: 13027) [PASS] testCannotVoteTwice() (gas: 57875) Test result: ok. 2 passed; 0 failed; 0 skipped; finished in 5.57ms Ran 1 test suites: 2 tests passed, 0 failed, 0 skipped (2 total tests) ``` #### Deploy the smart contract We have a tested smart contract; let's deploy it on Celo using Foundry and your Chainstack node. If you haven't yet, ensure you have some Celo tokens. Let's compile the smart contract: ``` forge compile ``` Then, we can deploy the contract in a single command using `forge create`: ``` forge create --rpc-url YOUR_CELO_CHAINSTACK_RPC --private-key YOUR_PRIVATE_KEY src/Voting.sol:Voting ``` Make sure to add your RPC URL and private key to the command by replacing `YOUR_CELO_CHAINSTACK_RPC` and `YOUR_PRIVATE_KEY`. 
Also note that this is a quick way to deploy and test, but exposing endpoints and private keys in your terminal is not a good security practice; ensure the wallet is used for testing only. This will deploy the smart contract on Celo. ``` [⠆] Compiling... No files changed, compilation skipped Deployer: 0x8f8e7012F8F974707A8F11C7cfFC5d45EfF5c2Ae Deployed to: 0x5564C4fC4842898Cf78B59373D822A32431d9f46 Transaction hash: 0x383821697a1019f36fe2ab4206463d58bf6bea4e226a4a8db8846a8a48d1eac4 ``` The transaction can be seen on the [Celo explorer](https://explorer.celo.org/alfajores/tx/0x383821697a1019f36fe2ab4206463d58bf6bea4e226a4a8db8846a8a48d1eac4); the link will show you an example of this contract deployment. We'll need the contract ABI and the address where it was deployed for the front end. You can find the ABI in the Foundry project in `out/Voting.sol/Voting.json`. If you didn't make any edits to the contract, you'll find the proper ABI already implemented in the front-end code we'll review in the next section. ### Developing the front end Now that we have deployed the smart contract, we can create a simple front end so that users can interact with it and vote. Let's initiate a Next.js project. You can do this in a different directory. ``` npx create-next-app@latest celo-voting-dapp ``` You can initialize the project with the following options: ``` ✔ Would you like to use TypeScript? … No ✔ Would you like to use ESLint? … Yes ✔ Would you like to use Tailwind CSS? … Yes ✔ Would you like to use `src/` directory? … Yes ✔ Would you like to use App Router? (recommended) … Yes ✔ Would you like to customize the default import alias (@/*)? 
… No ``` Then, move into that directory: ``` cd celo-voting-dapp ``` And install the web3.js package: ``` npm i web3@4.9.0 ``` Then, in the Next.js project, go to `src/app/page.js` and paste the following: Remember to add your node URL and smart contract address in: ``` > const nodeUrl = "YOUR_CHAINSTACK_CELO_URL"; > > // Add the address of your smart contract, you can use this if you don't have one 0x5564C4fC4842898Cf78B59373D822A32431d9f46 > const contractAddress = "YOUR_DEPLOYED_SMART_CONTRACT"; ``` Also note that exposing your endpoint in the front end like this is not good security practice, but it works for a prototype. ```javascript page.js "use client"; import { useEffect, useState } from "react"; import { Web3 } from "web3"; // Your node URL and chain ID const nodeUrl = "YOUR_CHAINSTACK_CELO_URL"; const chainId = "0xa4ec"; // 42220 Chain ID for Celo // Add the address of your smart contract const contractAddress = "YOUR_DEPLOYED_SMART_CONTRACT"; const contractAbi = [ { type: "constructor", inputs: [], stateMutability: "nonpayable" }, { type: "function", name: "candidates", inputs: [{ name: "", type: "uint256", internalType: "uint256" }], outputs: [ { name: "name", type: "string", internalType: "string" }, { name: "voteCount", type: "uint256", internalType: "uint256" }, ], stateMutability: "view", }, { type: "function", name: "candidatesCount", inputs: [], outputs: [{ name: "", type: "uint256", internalType: "uint256" }], stateMutability: "view", }, { type: "function", name: "getCandidate", inputs: [ { name: "_candidateId", type: "uint256", internalType: "uint256" }, ], outputs: [ { name: "name", type: "string", internalType: "string" }, { name: "voteCount", type: "uint256", internalType: "uint256" }, ], stateMutability: "view", }, { type: "function", name: "vote", inputs: [ { name: "_candidateId", type: "uint256", internalType: "uint256" }, ], outputs: [], stateMutability: "nonpayable", }, { type: "function", name: "voters", inputs: [{ name: "", type: "address", 
internalType: "address" }], outputs: [{ name: "", type: "bool", internalType: "bool" }], stateMutability: "view", }, { type: "event", name: "CandidateAdded", inputs: [ { name: "candidateId", type: "uint256", indexed: true, internalType: "uint256", }, { name: "name", type: "string", indexed: false, internalType: "string" }, ], anonymous: false, }, { type: "event", name: "Voted", inputs: [ { name: "voter", type: "address", indexed: true, internalType: "address", }, { name: "candidateId", type: "uint256", indexed: true, internalType: "uint256", }, ], anonymous: false, }, ]; export default function Home() { const [candidates, setCandidates] = useState([]); const [account, setAccount] = useState(""); const [isCorrectNetwork, setIsCorrectNetwork] = useState(false); const [loading, setLoading] = useState(true); const [error, setError] = useState(""); useEffect(() => { // Load the user's account when the component mounts loadAccount(); }, []); useEffect(() => { // Load candidates if the user is connected to the correct network if (isCorrectNetwork) { loadCandidates(); } }, [isCorrectNetwork]); const checkNetwork = async (web3) => { // Get the current chain ID const currentChainId = await web3.eth.getChainId(); // Check if the user is connected to the correct network if (currentChainId !== parseInt(chainId, 16)) { // Switch to the correct network if not await switchNetwork(); } else { setIsCorrectNetwork(true); } }; const switchNetwork = async () => { try { // Request to switch the network in MetaMask await window.ethereum.request({ method: "wallet_switchEthereumChain", params: [{ chainId: chainId }], }); setIsCorrectNetwork(true); } catch (switchError) { // Handle the error if the network is not available in MetaMask if (switchError.code === 4902) { setError( "This network is not available in your MetaMask, please add it manually" ); } } }; const loadAccount = async () => { if (window.ethereum) { const web3 = new Web3(window.ethereum); try { // Request the user's accounts 
from MetaMask await window.ethereum.request({ method: "eth_requestAccounts" }); const accounts = await web3.eth.getAccounts(); // Set the user's account setAccount(accounts[0]); // Check if the user is connected to the correct network await checkNetwork(web3); return accounts[0]; } catch (error) { // Handle the error if the user denies account access setError("User denied account access"); return null; } } else { // Handle the error if MetaMask is not detected setError("MetaMask not detected"); return null; } }; const disconnectAccount = () => { // Reset the user's account and network status setAccount(""); setIsCorrectNetwork(false); }; const loadCandidates = async () => { const web3 = new Web3(nodeUrl); const contract = new web3.eth.Contract(contractAbi, contractAddress); // Get the total number of candidates from the smart contract const candidatesCount = await contract.methods.candidatesCount().call(); const candidatesArray = []; // Fetch each candidate's details from the smart contract for (let i = 1; i <= candidatesCount; i++) { const candidate = await contract.methods.getCandidate(i).call(); candidatesArray.push({ id: i, name: candidate[0], voteCount: parseInt(candidate[1], 10), }); } // Update the state with the list of candidates setCandidates(candidatesArray); setLoading(false); }; const vote = async (candidateId) => { // Load the user's account if not already loaded const account = await loadAccount(); if (!account) return; const web3 = new Web3(window.ethereum); const contract = new web3.eth.Contract(contractAbi, contractAddress); try { // Call the vote function in the smart contract await contract.methods.vote(candidateId).send({ from: account }); // Reload the candidates to update the vote counts loadCandidates(); } catch (error) { // Handle the error if voting fails setError("Error voting: " + error.message); } }; return (
    // Basic UI; the markup and Tailwind classes below are a minimal example, adjust them to your liking
    <main className="flex min-h-screen flex-col items-center p-24">
      <h1 className="text-3xl font-bold mb-4">
        Celo Voting App | Dashboard powered by Chainstack
      </h1>
      <p className="mb-2">Connect your account and vote for your candidate!</p>
      <p className="mb-8">Each account can only vote once.</p>
      {error && <p className="text-red-500 mb-4">{error}</p>}
      {!account ? (
        <button onClick={loadAccount}>Login with MetaMask</button>
      ) : (
        <>
          <p>Connected: {account}</p>
          <button onClick={disconnectAccount}>Log out</button>
          {isCorrectNetwork ? (
            loading ? (
              <p>Loading candidates...</p>
            ) : (
              <ul>
                {candidates.map((candidate) => (
                  <li key={candidate.id}>
                    {candidate.name}: {candidate.voteCount} votes{" "}
                    <button onClick={() => vote(candidate.id)}>Vote</button>
                  </li>
                ))}
              </ul>
            )
          ) : (
            <p>Please switch to the Celo Alfajores Testnet to vote.</p>
          )}
        </>
      )}
    </main>
  );
}
```
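If you did modify the contract, you'll need the fresh ABI from Foundry's build artifact: `out/Voting.sol/Voting.json` is a JSON file whose `abi` field is exactly what `contractAbi` expects. A minimal sketch of extracting it; the artifact string below stands in for the real file and is truncated to a single entry for illustration:

```python
import json

# Stand-in for the contents of out/Voting.sol/Voting.json,
# truncated to a single ABI entry for illustration
artifact = """
{
  "abi": [
    {
      "type": "function",
      "name": "vote",
      "inputs": [{"name": "_candidateId", "type": "uint256", "internalType": "uint256"}],
      "outputs": [],
      "stateMutability": "nonpayable"
    }
  ]
}
"""

# Parse the artifact and keep only the ABI; this is what contractAbi expects
abi = json.loads(artifact)["abi"]
print(json.dumps(abi, indent=2))
```

In practice, replace the embedded string with `json.load(open("out/Voting.sol/Voting.json"))` and paste the printed array into `contractAbi` in `page.js`.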
#### TL;DR front-end code breakdown This React component implements a frontend interface for a simple voting application that interacts with a Celo blockchain smart contract. Here's a concise breakdown: 1. **Dependencies**: * `Web3` library for blockchain interaction. * React hooks (`useEffect`, `useState`) for managing state and side effects. 2. **Key Variables**: * `nodeUrl`: URL of the Celo blockchain node. * `chainId`: Chain ID for the Celo Mainnet. * `contractAddress`: Address of the deployed voting smart contract. * `contractAbi`: ABI (Application Binary Interface) of the smart contract. 3. **State Variables**: * `candidates`: Array to store candidate details. * `account`: Stores the user's blockchain account address. * `isCorrectNetwork`: Boolean indicating if the user is connected to the correct blockchain network. * `loading`: Boolean indicating if candidate data is being loaded. * `error`: String for storing error messages. 4. **Lifecycle Hooks**: * `useEffect`: Loads the user's account on component mount and checks the network status. If the user is connected to the correct network, it loads candidates. 5. **Functions**: * **`checkNetwork`**: Checks if the user is connected to the correct blockchain network and switches networks if necessary. * **`switchNetwork`**: Switches the user's MetaMask network to the Celo Mainnet. * **`loadAccount`**: Requests the user's account address from MetaMask and checks the network status. * **`disconnectAccount`**: Resets the account and network status. * **`loadCandidates`**: Loads candidate details from the smart contract and updates the state. * **`vote`**: Sends the user's vote for a candidate to the smart contract and reloads the results. 6. **UI Elements**: * Displays the application title and instructions. * Connect button to log in with MetaMask. * Displays the connected account address and log-out button. * Displays a list of candidates with their names, vote counts, and vote buttons. * Error messages and network status messages. 
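The network check hinges on one detail: MetaMask's `wallet_switchEthereumChain` takes the chain ID as a hex string, while `getChainId` returns a number, which is why the code compares against `parseInt(chainId, 16)`. The two forms encode the same value, which is quick to sanity-check:

```python
# Celo mainnet chain ID: hex string form used by wallet_switchEthereumChain,
# decimal form used for the comparison in checkNetwork
chain_id_hex = "0xa4ec"
chain_id_dec = int(chain_id_hex, 16)

print(chain_id_dec)       # 42220
print(hex(chain_id_dec))  # 0xa4ec
```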
This frontend code provides a user interface for interacting with the voting smart contract, enabling users to connect their MetaMask accounts, vote for candidates, and view current voting results. ### Run the DApp All the pieces are together now. If you did not change the contract, the ABI in the front end will work. Otherwise, you'll need to add the updated ABI. Run the project with: ``` npm run dev ``` Your front end will be available at `http://localhost:3000`: ``` > celo-tutorial@0.1.0 dev > next dev ▲ Next.js 14.2.3 - Local: http://localhost:3000 ``` Click on the **Login with MetaMask** button to connect your wallet to the DApp. Then, you can vote for a candidate by signing a transaction. ## Conclusion Congratulations! You have successfully built a simple voting DApp on the Celo blockchain. Following this tutorial, you've learned how to develop, test, and deploy a smart contract using Foundry and create a frontend interface with Next.js and web3.js to interact with the contract. This project is a foundational example to get you started with blockchain development on Celo. Remember to enhance and secure your DApp before deploying it to a production environment. 
# Celo methods Source: https://docs.chainstack.com/docs/celo-methods | Method | Availability | Comment | | ---------------------------------------- | --------------------------------------------- | ------- | | eth\_accounts | | | | eth\_blockNumber | | | | eth\_call | | | | eth\_chainId | | | | eth\_estimateGas | | | | eth\_feeHistory | | | | eth\_gasPrice | | | | eth\_simulateV1 | | | | eth\_getAccount | | | | eth\_getBalance | | | | eth\_getBlockByHash | | | | eth\_getBlockByNumber | | | | eth\_getBlockReceipts | | | | eth\_getBlockTransactionCountByHash | | | | eth\_getBlockTransactionCountByNumber | | | | eth\_getCode | | | | eth\_getFilterChanges | | | | eth\_getFilterLogs | | | | eth\_getLogs | | | | eth\_getProof | | | | eth\_getStorageAt | | | | eth\_getTransactionByBlockHashAndIndex | | | | eth\_getTransactionByBlockNumberAndIndex | | | | eth\_getTransactionByHash | | | | eth\_getTransactionCount | | | | eth\_getTransactionReceipt | | | | eth\_getUncleCountByBlockHash | | | | eth\_getUncleCountByBlockNumber | | | | eth\_maxPriorityFeePerGas | | | | eth\_newBlockFilter | | | | eth\_newFilter | | | | eth\_newPendingTransactionFilter | | | | eth\_signTransaction | | | | eth\_subscribe | | | | eth\_syncing | | | | eth\_uninstallFilter | | | | eth\_unsubscribe | | | | eth\_sendRawTransaction | | | | net\_listening | | | | net\_peerCount | | | | net\_version | | | | web3\_clientVersion | | | | web3\_sha3 | | | | debug\_getBadBlocks | | | | debug\_storageRangeAt | | | | debug\_traceBlock | | | | debug\_traceBlockByHash | | | | debug\_traceBlockByNumber | | | | debug\_traceCall | | | | debug\_traceTransaction | | | | admin\_addPeer | | | | admin\_addTrustedPeer | | | | admin\_datadir | | | | admin\_exportChain | | | | admin\_importChain | | | | admin\_nodeInfo | | | | admin\_peerEvents | | | | admin\_peers | | | | admin\_removePeer | | | | admin\_removeTrustedPeer | | | | admin\_startHTTP | | | | admin\_startWS | | | | admin\_stopHTTP | | | | admin\_stopWS | | | # 
Celo tooling Source: https://docs.chainstack.com/docs/celo-tooling ## MetaMask On [node access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**. ## web3.js Build DApps using [web3.js](https://github.com/web3/web3.js) and Celo nodes deployed with Chainstack. Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP. ### HTTP Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number: ```javascript Javascript const {Web3} = require('web3'); const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. ## web3.py Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Celo nodes deployed with Chainstack. Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `HTTPProvider` to connect to your node endpoint and get the latest block number. ```python Key Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.block_number) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s' % ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.block_number) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * HOSTNAME — your node HTTPS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). 
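One caveat with the password-protected form above: if the password contains characters that are special in URLs (`@`, `/`, `:`), percent-encode the credentials before building the endpoint. A small sketch with placeholder credentials:

```python
from urllib.parse import quote

# Placeholder credentials; substitute your actual node access details
username = "USERNAME"
password = "p@ss/word"   # contains URL-special characters
hostname = "HOSTNAME"

# Percent-encode so the special characters survive inside the URL
endpoint = "https://%s:%s@%s" % (quote(username, safe=""), quote(password, safe=""), hostname)
print(endpoint)  # https://USERNAME:p%40ss%2Fword@HOSTNAME
```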
## ethers.js Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Celo nodes deployed with Chainstack. Install [ethers.js](https://www.npmjs.com/package/ethers). Connect over HTTP. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number: ```javascript Key Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT' }; var provider = new ethers.JsonRpcProvider(urlInfo.url, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) * NETWORK\_ID — Celo network ID: * Celo Mainnet: `42220` See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). ## Hardhat Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your Celo nodes. Install [Hardhat](https://hardhat.org/) and create a project. Create a new environment in `hardhat.config.js`: ```javascript Javascript require("@nomiclabs/hardhat-waffle"); ... module.exports = { solidity: "0.7.3", networks: { chainstack: { url: "YOUR_CHAINSTACK_ENDPOINT", accounts: ["YOUR_PRIVATE_KEY"] }, } }; ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). * YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract Run `npx hardhat run scripts/deploy.js --network chainstack` and Hardhat will deploy using Chainstack. 
See also [Forking EVM-compatible mainnet with Hardhat](https://support.chainstack.com/hc/en-us/articles/900004242406). ## Remix IDE To make Remix IDE interact with the network through a Chainstack node: Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Interacting through MetaMask](#metamask). In Remix IDE, navigate to the **Deploy** tab. Select **Injected Provider - MetaMask** in **Environment**. This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node. ## web3.php Build DApps using [web3.php](https://github.com/web3p/web3.php) and Celo nodes deployed with Chainstack. Install [web3.php](https://github.com/web3p/web3.php). Connect over HTTP:

```php Php
<?php
require_once "vendor/autoload.php";
use Web3\Web3;
$web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT');
?>
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. Use [JSON-RPC methods](https://eth.wiki/json-rpc/API) to interact with the node. Example to get the latest block number:

```php Php
<?php
require_once "vendor/autoload.php";
use Web3\Web3;
$web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT');
$eth = $web3->eth;
$eth->blockNumber(function ($err, $data) { print "$data \n"; });
?>
```

## Foundry Install [Foundry](https://github.com/gakonst/foundry/). Use `--rpc-url` to run the operation through your Chainstack node. ### Forge Use `forge` to develop, test, and deploy your smart contracts. To deploy a contract: ```shell Shell forge create CONTRACT_NAME --contracts CONTRACT_PATH --private-key YOUR_PRIVATE_KEY --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where * CONTRACT\_NAME — name of the contract in the Solidity source code * CONTRACT\_PATH — path to your smart contract * YOUR\_PRIVATE\_KEY — the private key to your funded account that you will use to deploy the contract * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password ### Cast Use `cast` to interact with the network and the deployed contracts. 
To get the latest block number: ```shell Shell cast block-number --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. # Chainlink: Estimating the price of a call Source: https://docs.chainstack.com/docs/chainlink-estimating-the-price-of-a-call TLDR: * A script in JavaScript using the [Chainlink functions toolkit library](https://www.npmjs.com/package/@chainlink/functions-toolkit) to estimate the call cost before making the actual smart contract call to the DON network, so you don't get frustrated when it fails. ### Get your own node endpoint today [Start for free](https://console.chainstack.com/) and get your app to production levels immediately. No credit card required. You can sign up with your GitHub, X, Google, or Microsoft account. ## Overview [Chainlink Functions](https://chain.link/functions) is no small feat in Web3, and it's part of more than one onboarding guide for developers getting serious about the industry. After all, deploying a simple ERC20 token is no longer enough. AI agents do it all the time. Chainlink has a very good functions tutorial: [Chainlink Functions Getting Started](https://docs.chain.link/chainlink-functions/getting-started). It's designed to be very easy and informative. It explains every step and there are videos to follow along too. One step, however, is where a number of onboarding developers fail: sending the actual request. A lot of the time, it fails. This extremely specific article fixes that. Why? Because people get confused at this step: there's no logging or explanation, so they don't know whether the problem is the RPC endpoint or something else. And it involves quite a bit of digging to understand what is actually going on. Since I was involved in the troubleshooting (or basically had to troubleshoot), I'm putting this quick script out in public in case someone else finds it useful. 
The script prints what the network *thinks* the call needs to cost to be fulfilled and not fail. I put the word *thinks* here in italics because it's not what the actual computation ends up costing, but it's what prevents the call from going through and gets you messages like: * The request will most likely fail. Do you want to continue? * `sendRequest` failed * `sendRequest` execution reverted It can be very frustrating if you are following a tutorial to a ***t*** and still can't get through for reasons you can't understand. ## Details It costs \$LINK to fulfill the requests on the Chainlink oracle network, but the examples you are doing are on Sepolia, where the dynamic oracle call price can fluctuate wildly because it's testnet infrastructure. While the example in the tutorial might show that having 2 \$LINK on your funded subscription account is more than enough, more often than not the actual price estimate can be otherworldly, like 16 \$LINK or even hundreds of LINK. And not having enough $LINK on your subscription account is what actually makes the call fail without explaining it to you. Note that the actual fulfillment price will still be something reasonable within the $LINK decimals range (not even one LINK). But an insufficient amount of \$LINK on your subscription account will make your transaction revert and fail. ## Call estimation script Make sure you have the [Chainlink functions toolkit library](https://www.npmjs.com/package/@chainlink/functions-toolkit) installed. So here's the script: Note it's for Sepolia. You'll need other values for different networks, but you can find them easily in the Chainlink documentation, which has always been pretty well maintained: [Chainlink Functions: Supported Networks](https://docs.chain.link/chainlink-functions/supported-networks). You can also change `subscriptionId` to your own subscription, but for the default Chainlink tutorial it doesn't really matter. 
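A note on units: the toolkit returns the estimate in juels, the base unit of LINK (1 LINK = 10^18 juels, the same scale as wei to ether), which is why the script converts the result with `ethers.utils.formatEther`. The conversion itself is simple, sketched here in Python:

```python
from decimal import Decimal

JUELS_PER_LINK = 10**18  # same scale as wei per ether

def juels_to_link(juels: int) -> Decimal:
    """Convert a cost in juels to LINK without losing precision."""
    return Decimal(juels) / JUELS_PER_LINK

# The estimate from the sample run later in this article
print(juels_to_link(14358219259338930318))  # 14.358219259338930318
```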
Also, provide an actual endpoint for the query to go through by replacing `CHAINSTACK_SEPOLIA_RPC`. ```node estimateCost.js import { SubscriptionManager } from '@chainlink/functions-toolkit' import { ethers } from 'ethers' async function estimateRequestCost() { try { const provider = new ethers.providers.JsonRpcProvider('CHAINSTACK_SEPOLIA_RPC') const signer = ethers.Wallet.createRandom().connect(provider) const subscriptionManager = new SubscriptionManager({ signer, linkTokenAddress: '0x779877A7B0D9E8603169DdbD7836e478b4624789', // Sepolia LINK address functionsRouterAddress: '0xb83E47C2bC239B3bf370bc41e1459A34b41238D0', // Sepolia Router address }) await subscriptionManager.initialize() const estimatedCostInJuels = await subscriptionManager.estimateFunctionsRequestCost({ subscriptionId: 4062, callbackGasLimit: 300000, gasPriceWei: BigInt(20000000000), // 20 gwei donId: 'fun-ethereum-sepolia-1' }) const estimatedCostInLink = ethers.utils.formatEther(estimatedCostInJuels.toString()) console.log(`Estimated cost: ${estimatedCostInLink} LINK`) } catch (error) { console.error('Error:', error) } } estimateRequestCost() ``` For example, at the time of posting this article: ```shell Shell % node estimateCost.js Estimated cost: 14.358219259338930318 LINK ``` Which would mean I'd need to have 14+ \$LINK on my subscription account for the [sendRequest](https://docs.chain.link/chainlink-functions/getting-started#run-the-example) to be successful. ## Conclusion Cost estimation is a crucial step in working with Chainlink Functions. This script helps you understand the \$LINK token costs associated with your Functions requests before deployment, allowing for better resource planning and optimization. It may be flaky on Sepolia, but it's still a must-have survival skill for prod. Happy learning! 
### About the author Director of Developer Experience @ Chainstack Talk to me all things Web3 20 years in technology | 8+ years in Web3 full time Trusted advisor helping developers navigate the complexities of blockchain infrastructure [](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake) # Blockchain APIs guides Source: https://docs.chainstack.com/docs/chainstack-blockchain-apis-guides The following guides provide valuable resources for those looking to harness the power of Blockchain APIs to build decentralized applications. They offer not only theoretical knowledge but also showcase practical examples, enabling developers to deepen their understanding of these concepts and apply them effectively in their projects. By mastering these tools and techniques, developers can build innovative blockchain-based applications, streamline their development process, and make the most of the decentralized ecosystem. ## Guides list # Overview Source: https://docs.chainstack.com/docs/chainstack-chat-gpt-plugin-introduction ChatGPT plugins are an advancement in the sphere of AI interactions, acting as a bridge between ChatGPT and a diverse range of third-party applications. These plugins empower ChatGPT to interface with APIs designed by developers, thereby substantially expanding its functionalities and enabling it to execute a wide array of tasks. ### The new Chainstack GPT The Chainstack GPT is now available on ChatGPT using GPT-4. Try it out here: [Chainstack GPT](https://chat.openai.com/g/g-ueMUdZAzr-chainstack-gpt) ## Natural language Web3 infrastructure management The Chainstack ChatGPT plugin takes this a step further by providing seamless integration with the Web3 infrastructure workflow. 
This integration enables developers to manage their projects, establish networks, deploy or remove nodes, and even request funds directly through the Chainstack Faucet API, all within the ChatGPT environment. This not only enhances the capabilities of ChatGPT but also simplifies and streamlines the process of interacting with the Chainstack platform, offering a more efficient and intuitive user experience. ## Install the Chainstack ChatGPT plugin To install the Chainstack plugin, simply search for **Chainstack** in the plugin store in ChatGPT. ## What can the Chainstack ChatGPT plugin do? The Chainstack plugin provides various functionalities to interact with different EVM (Ethereum Virtual Machine) based blockchains. Below you can find a breakdown of its capabilities. ### 1. Querying blockchain data * **Get the latest block number** — retrieve the latest block number from the specified chain. * **Get balance** — fetch the balance of a given account at the latest block. * **Get base gas fee** — retrieve the base gas fee from an EVM chain. * **Blockchain call** — make a specific method call on a chosen protocol with user-specified parameters. * **Get transaction summary** — retrieve earliest and latest transactions, among other information, by address. ### 2. Conversion utilities * **Hex to decimal** — convert a hexadecimal value to a decimal value. * **Decimal to hex** — convert a decimal value to a hexadecimal value. * **From wei** — convert a wei value to an ether value. ### 3. ENS (Ethereum Name Service) resolution * **ENS to address** — resolve an ENS name to an Ethereum address. ### 4. Token and NFT management * **Get token balances** — retrieve token balances for a given chain name and wallet address. * **Get NFTs** — retrieve all NFTs for a given chain name and wallet address. * **Get single NFT** — retrieve a single NFT for a given chain name, wallet address, and token ID. ### 5. 
Retrieve docs information * **Chainstack docs** — scrape Chainstack docs for platform info and blockchain development guides. ## How to use the Chainstack ChatGPT plugin Once you have installed the plugin from the store, simply ask ChatGPT to interact with the Chainstack platform, the faucet, or a blockchain. The blockchains available out of the box are: * Ethereum Mainnet * Polygon Mainnet * Avalanche Mainnet * Arbitrum Mainnet * Gnosis Chain Mainnet You can also provide an endpoint for any EVM chain, and ChatGPT will use it. ### Interaction examples Here are some interaction examples showing how you can leverage the power of ChatGPT with Chainstack and the blockchain. #### Check account balances You can ask to check balances by specifying accounts and a blockchain. The plugin supports ENS as well. **User**: What is the balance of jaredfromsubway.eth on Ethereum? **ChatGPT**: The Ethereum balance for the ENS address jaredfromsubway.eth (resolved to 0xae2Fc483527B8EF99EB5D9B44875F005ba1FaE13) is 180.158912073898873256 ETH. #### Blockchain calls You can also ask for general blockchain calls. For example, retrieve a list of transactions in a block. ChatGPT will attempt to make conversions on its own, which are not always accurate. **User**: Can you convert block 43184850 to hexadecimal and show me the transaction hashes of this block on Polygon? Please keep the transaction details flag off. **ChatGPT**: The block number 43184850 in decimal is represented as 0x292f2d2 in hexadecimal. Here is the list of transaction hashes from that block on the Polygon network: 0x33e4e2167bd8787d480543022b5a9dd577e3dd48d05a958521e7295d88f47021 0x092709a880a8df3fa10e60dfd124e0da81310e4e3b588e1231ddb5aefd42f0d4 0x6f2726aff25220395816ca9058c165bdcd03c15f9109a643355cd0b5c01ccba8 0x8cba7ca23f86113b8820c7c3ca2339968326834dadc7a983118273bd309bfdae 0xa4f9fd3f88403cc2e4e41a337b7ddc1034cb9cabcb76ceea78f038fe8545e811 ...
In this case, ChatGPT will first convert the value to hexadecimal, then call `eth_getBlockByNumber` with the correct parameters. #### Check the chain ID for an endpoint **User**: Can you check the chain ID using this endpoint? [https://nd-123-456-789.p2pify.com/API\_KEY](https://nd-123-456-789.p2pify.com/API_KEY) **ChatGPT**: The chain ID for the specified endpoint [https://nd-123-456-789.p2pify.com/API\_KEY](https://nd-123-456-789.p2pify.com/API_KEY) is 0x1. #### Get the base gas fee **User**: What is the base gas fee on Ethereum? **ChatGPT**: The base gas fee on the Ethereum network is currently 38.28 Gwei. Note that while ChatGPT is highly skilled at understanding requests and interacting with plugins, it is recommended to provide comprehensive prompts. If you encounter an error, feel free to ask again, giving more details. ## Try the Chainstack plugin Head over to [ChatGPT](https://chat.openai.com/) and install the Chainstack plugin from the OpenAI store. # Chainstack Compare dashboard Source: https://docs.chainstack.com/docs/chainstack-compare-dashboard Free and open-source dashboard for monitoring RPC node provider performance **TLDR**: * The Chainstack Compare Dashboard provides real-time performance metrics (response times, success rates, transaction confirmation) for major RPC providers across multiple blockchains and regions. * Built with Vercel (data collection) and Grafana Cloud (dashboards), it keeps data refreshed every minute with 14 days of history. * The ranking system prioritizes fast, reliable providers, while special metrics (e.g., transaction landing on Solana) help fine-tune your choice of provider. * Fork the GitHub repository for full technical details and customization of this open dashboard solution. ## Problem Choosing the right RPC provider for your blockchain or trading project can be challenging. Performance varies by region and protocol, and it changes over time. How do you make an informed decision?
[Chainstack Compare Dashboard](https://chainstack.grafana.net/public-dashboards/65c0fcb02f994faf845d4ec095771bd0) solves this problem by providing real-time performance data for major RPC providers across different regions and blockchains. The dashboard has been built with the help of Vercel (Pro plan) and Grafana Cloud (Free plan). You can either use the dashboard hosted by Chainstack or fork the [GitHub repository](https://github.com/chainstacklabs/chainstack-rpc-dashboard-functions) and customize it to your needs. ### Run nodes on Chainstack [Start for free](https://console.chainstack.com/) and get your app to production levels immediately. No credit card required. You can sign up with your GitHub, X, Google, or Microsoft account. ## Solution overview The solution consists of two main components: data collection services and a dashboard. The data collection service sends API calls to all providers, measures response times, and pushes collected data to the dashboard every minute. The data collection service only records response times and marks each request as successful or failed. All calculations, including averages and other aggregated values, are performed by the dashboard. ### Quick start 1. Visit the [dashboard](https://chainstack.grafana.net/public-dashboards/65c0fcb02f994faf845d4ec095771bd0?orgId=1) 2. Select your blockchain network 3. Choose your region of interest 4. Review performance metrics ### What you can do With Chainstack Compare Dashboard, you can: 1. Compare RPC provider performance across regions 2. Monitor response times and success rates 3. Monitor transaction confirmation metrics (only for Solana) The dashboard currently supports: * Blockchains: Ethereum, Base, Solana, TON * Regions: US West, Germany, Singapore * Providers: Alchemy, Chainstack, QuickNode, Helius, TonCenter ### Data and paid tiers All metrics, except for the transaction landing metrics, are updated every minute, with 14 days of historical data available.
The transaction landing metrics are updated every 15 minutes. For the dashboard, we used paid tiers of the mentioned providers, except for TonCenter where we used the free tier. ### Special features #### Provider ranking We evaluate RPC providers based on their speed (response time) and reliability (success rate) across three regions to rank them in each region and globally. Transaction landing metrics are not considered for this ranking. 1. The lower the score, the better the provider 2. Even small success rate drops (99% vs 97%) significantly impact the final score 3. Performance is measured equally across all regions The formula used for ranking is presented below. Score = 1 ÷ ((1/ResponseTime) × SuccessRate³) ### Solana: transaction landing We send standardized test transactions every 15 minutes across the Solana network to measure two metrics: * **Slot Latency** The time it takes for a transaction to be confirmed on the network, measured in slots. Lower slot latency means faster transaction finality. * **Landing Rate** The percentage of transactions that successfully confirm within our testing parameters. This metric helps identify reliable providers for your specific region. Each test transaction: * Uses the Solana Memo Program for consistent, lightweight operations * Includes fixed priority fees (200,000 microlamports/CU) * Has a 55-second timeout For more technical details on the transaction landing metric, please check the code [here](https://github.com/chainstacklabs/chainstack-rpc-dashboard-functions/blob/master/metrics/solana_landing_rate.py). We added a few enhanced endpoints for providers that offer such services. Enhanced endpoints are used only for measuring the transaction confirmation speed.
| Provider name | Default endpoint | Enhanced endpoint |
| ------------- | ---------------- | ----------------- |
| Alchemy | Yes | No |
| Chainstack | Yes | Yes, [enhanced endpoint](/docs/trader-node) with bloXroute technology |
| Helius | Yes | Yes, staked connection |
| QuickNode | Yes | No |

## How it works ### Vercel: data collection services We chose Vercel as our hosting solution due to its simplicity and time-to-production. The [data collection services](https://github.com/chainstacklabs/chainstack-rpc-dashboard-functions) are Vercel serverless functions deployed to multiple regions. Metrics for each blockchain are collected by a dedicated function that is triggered by Vercel cron jobs every minute. Once data is collected, it is pushed to a Grafana Cloud Prometheus instance. The service has the following performance thresholds: * Response timeout: 55 seconds * Block delay threshold: 55 seconds Failures include: * Timeouts and excessive block delays * Non-200 HTTP responses * JSON-RPC error responses ### Grafana Cloud: dashboards Grafana Cloud with its hosted Prometheus instance provides hassle-free services. Thanks to it, we can focus on the dashboard quality, rather than on supporting a Prometheus instance. Grafana Cloud stores dashboard configurations as JSON files, which makes it easier to support and improve them. Data transformation required for some charts is performed with the help of the Grafana [data transformation feature](https://grafana.com/docs/grafana/latest/panels-visualizations/query-transform-data/transform-data/) and [PromQL](https://prometheus.io/docs/prometheus/latest/querying/basics/). Historical data is available for the last 14 days.
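The provider ranking formula described earlier, Score = 1 ÷ ((1/ResponseTime) × SuccessRate³), boils down to response time divided by the cubed success rate. A minimal Python illustration with hypothetical numbers (not the dashboard's actual code) shows why even small reliability drops weigh heavily:

```python
def provider_score(response_time: float, success_rate: float) -> float:
    """Score = 1 / ((1/ResponseTime) * SuccessRate^3); lower is better."""
    return 1.0 / ((1.0 / response_time) * success_rate ** 3)

# Two hypothetical providers with the same 100 ms response time:
fast_reliable = provider_score(0.1, 0.99)
fast_flaky = provider_score(0.1, 0.97)

# Cubing the success rate amplifies small reliability gaps:
penalty = fast_flaky / fast_reliable - 1
print(f"99% success rate score: {fast_reliable:.5f}")
print(f"97% success rate score: {fast_flaky:.5f}")
print(f"The 97% provider scores {penalty:.1%} worse")
```

With identical response times, a two-point drop in success rate makes the score roughly 6% worse, which is enough to change a provider's rank.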
## For developers For developers interested in the technical implementation of Chainstack Compare Dashboard and in contributing to the project, please refer to our [GitHub repository](https://github.com/chainstacklabs/chainstack-rpc-dashboard-functions), which contains: 1. Complete technical documentation 2. Setup instructions 3. Configuration guidelines ## FAQ Alchemy, Chainstack, QuickNode, Helius, and TonCenter. Paid tiers (except for TonCenter). It helps choose the right RPC provider based on real data and monitor their performance across regions. Every minute. Transaction landing metrics are updated every 15 minutes. Any response slower than 55 seconds, non-200 status codes, or responses containing error messages (as per JSON-RPC specification). For blocks, delays over 55 seconds count as failures. Yes, dashboards keep 14 days of performance history. We use serverless functions in three regions, measuring response times and success rates from each location. We focus on commonly used methods like balance checks, transaction simulation, and block queries. Each blockchain has its specific set of tested methods. Global shows aggregated performance across all regions, while regional views provide detailed metrics for specific locations. ## Conclusion Chainstack Compare Dashboard provides Web3 developers and users with a comprehensive tool for monitoring RPC provider performance across different regions. By offering real-time metrics on response times, success rates, block delays, and transaction landing, the dashboard enables its users to make data-driven infrastructure decisions. ### About the author Developer Advocate @ Chainstack Multiple years of software development and Web3 expertise. Creator of the open-source Compare Dashboard for RPC provider performance benchmarking. Core contributor to the DevEx team’s pump.fun trading bot. Author of technical tutorials on EVM blockchains, Solana, TON and Subgraphs.
[](https://github.com/smypmsa) [](https://x.com/sensuniama) [](https://www.linkedin.com/in/anton-sauchyk/) # Chainstack Compare: Revolutionizing RPC node performance analysis in blockchains Source: https://docs.chainstack.com/docs/chainstack-compare-rpc-node-performance **TLDR:** * Chainstack Compare emphasizes data-fetching efficiency (blocks processed per second) to give developers a deeper insight into RPC node performance. * By leveraging Python’s concurrency features, it surpasses traditional metrics that mainly focus on latency and requests per second. * Benchmarks demonstrate dramatic gains in fetching large amounts of data – crucial for data-intensive Web3 applications. * The tool’s focus on real-world conditions, rate limits, and hardware considerations makes it a practical and more accurate solution for evaluating and selecting RPC providers. ## Main article In the evolving world of blockchain, the efficiency of RPC nodes plays an important role in powering data-hungry decentralized applications (DApps). However, traditional metrics for evaluating these nodes often fail to provide a comprehensive view of their capabilities, especially regarding data fetching efficiency—a critical aspect for any blockchain application. This gap in assessment tools led to the development of Chainstack Compare, a cutting-edge Node Performance Tool designed to address this challenge. Chainstack Compare offers a unique approach to measuring RPC node performance, focusing on the efficiency of data fetching from blockchain networks. This tool analyzes how EVM-compatible RPC nodes handle real-time blockchain data retrieval, going beyond the standard metrics of latency and requests per second. In this article, we’ll jump into the world of RPC nodes, uncover the limitations of traditional performance metrics, and introduce the innovative methodology behind Chainstack Compare. Try [Chainstack Compare](https://compare.chainstack.com/) now. 
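Chainstack Compare's headline metric, blocks processed per second (BPS), is simply the number of blocks fetched divided by elapsed wall-clock time. A quick sketch of the calculation, checked against two benchmark figures reported later in this article:

```python
def blocks_per_second(blocks_fetched: int, elapsed_seconds: float) -> float:
    """BPS: how many blocks the client fetches and processes per second."""
    return blocks_fetched / elapsed_seconds

# Figures from the Python concurrency benchmarks later in this article:
print(round(blocks_per_second(10, 0.764), 2))    # 13.09
print(round(blocks_per_second(1000, 5.682), 2))  # 175.99
```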
## The need for a specialized tool—Chainstack Compare Traditional methods to test RPC nodes' performance primarily focus on measuring latency and the number of requests a node can handle per second. While these metrics are useful, they fail to fully capture the node's ability to handle complex data-fetching tasks essential for blockchain applications. This limitation becomes particularly evident in applications where the goal is fetching and handling large amounts of data. ### Importance of data fetching performance The true test of an RPC node's performance lies not only in its ability to handle requests efficiently but also in its capacity to manage and fetch substantial volumes of data. This capability is particularly crucial for DApps that act as data aggregators or API providers, and for institutional users that need to ingest data directly from the source, where the sheer volume and complexity of the data being processed are significantly higher. Traditional performance metrics fall short for such applications. They do not adequately assess how well a node can handle data-fetching operations. ### Introducing Chainstack Compare Chainstack Compare innovatively bridges the gap in traditional performance analysis by introducing a pivotal metric: the rate of blocks processed per second, **BPS**. This metric goes beyond surface-level assessments, offering a deeper insight into a node's operational efficiency in real-world conditions. Chainstack Compare stands as a crucial asset for developers. It empowers them with a more refined and accurate evaluation of RPC node performance, specifically tailored to the needs of data-intensive DApps. ## How Chainstack Compare works In this section, we will get into the technical workings of Chainstack Compare, focusing on the underlying methodology and logic that powers its node performance tests. This exploration will explain Chainstack Compare's approach to evaluating RPC nodes, particularly in fetching and processing blockchain data.
### Overview of the testing approach behind Chainstack Compare Chainstack Compare uses a comprehensive testing approach to assess the performance of EVM-compatible RPC nodes. The core of this evaluation lies in its ability to fetch and process large amounts of blockchain data, specifically focusing on retrieving the latest 127 blocks. This task is not just about quantity; it also replicates realistic scenarios that an RPC node might encounter in a live blockchain environment. ### Why we chose the path of Chainstack Compare In the journey to create a better way of testing RPC nodes for data ingestion performance, we quickly learned that there’s more to it than just the nodes themselves. The architecture and programming language play a big role, too. This led to a couple of key decisions: * **Using Python for better data fetching:** We all know fetching data loads can be tedious, especially with those endless `for` loops. That’s why we decided to use Python’s concurrency features. It’s not just about speeding things up; it’s about making the process smoother and more manageable. Sure, languages like Rust might offer more efficiency, but many of us are familiar with Python. It’s user-friendly and doesn’t scare away beginners. * **Going big on data collection:** We wanted to mimic a real scenario by fetching a hefty amount of data across various blocks. It’s like a test drive for data ingestion apps that need to pull info from blockchains. This way, we get a clearer picture of how well an RPC node performs under pressure, similar to what developers face in the real world. * **Looking beyond just fetching data:** It's not only about how fast or how much data you can fetch. We also looked into the limitations, like rate limits providers set up. Chainstack Compare doesn’t just tell you how effective an RPC node is; it also shows if any rate limits were triggered and how it affected the fetching process, including the percentage of failed attempts.
By taking this route, we'd like to shed some light on the whole process, making it easier for developers to understand what works best when dealing with blockchain data. It's about finding that sweet spot between technical efficiency and practical usability. ### Technical logic behind Chainstack Compare At the heart of Chainstack Compare's testing process is a Python-based system that leverages asynchronous programming and multithreading to optimize data retrieval and processing. This approach allows the tool to efficiently handle multiple tasks concurrently, significantly enhancing the speed and accuracy of data fetching. 1. **Asynchronous and multithreaded processing**: Chainstack Compare employs asynchronous programming alongside a `ThreadPoolExecutor` for multithreading. This combination enables the tool to initiate and manage multiple block data fetch operations simultaneously, minimizing wait times and maximizing the throughput of data processing. 2. **Testing various RPC methods**: Chainstack Compare tests how an RPC node performs with various RPC methods. At launch, it will include `eth_getBlockByNumber`, `eth_call`, and `debug_traceBlockByNumber`. 3. **Performance metrics calculation**: Post data retrieval, Chainstack Compare calculates key metrics such as blocks per second, whether all the data was processed correctly, and if any rate limit was triggered. These metrics offer insights into the node's efficiency in processing blockchain data, responsiveness, and capability to handle high-demand scenarios typical in blockchain applications. Read the following guide to understand the asynchronous and multithreaded Python processing Chainstack Compare uses: **[Mastering multithreading in Python for Web3 requests: A comprehensive guide](/docs/mastering-multithreading-in-python-for-web3-requests-a-comprehensive-guide)** ### Comparative analysis of fetching methods: Python concurrency vs.
traditional JavaScript To highlight the effectiveness of Chainstack Compare's methodology, we conducted benchmarks using [Viem](https://viem.sh/docs/introduction.html), a JavaScript Web3 library. We explored traditional JavaScript methods versus Python's concurrency approach, which is foundational to Chainstack Compare. ### Traditional methods in JavaScript with Viem * **Regular looping:** Using Viem with a standard `for` loop in JavaScript, fetching 100 blocks showed a blocks per second (BPS) rate of approximately 3.1; in this case, the RPC node used does not make any difference. This approach clearly shows the limitations of synchronous processing in data-intensive tasks. * **Asynchronous concurrency:** Implementing an asynchronous approach with Viem yielded a significant improvement, peaking at 47.68 BPS when fetching 100 blocks. Despite this enhancement, JavaScript's model of async concurrency, lacking true multi-threading, revealed an upper limit in performance; with this approach, the RPC node makes a difference, but not a substantial one, as it would not exceed 47–48 BPS. Due to those results, we decided to use concurrency in Python as it could yield better performance. ### Python concurrency: Elevating performance After exploring traditional JavaScript fetching methods with Viem, we turned our focus to the concurrency features of Python to assess their impact on performance. The results shed light on Python's significant advantages in efficiently managing large-scale data fetching tasks. ### Benchmark results using Python Our benchmarks were specifically designed to fetch varying numbers of blocks, aiming to understand how the volume of data and the configuration of concurrent workers affect performance. For these tests, we utilized 60 workers to optimize task execution. Here’s what we discovered: * **Fetching 10 blocks:** Completed in 0.764 seconds, resulting in a rate of 13.09 blocks per second (BPS).
This baseline test demonstrates Python's capability to quickly process smaller data sets, benefiting from the efficient management of concurrent workers. * **Fetching 100 blocks:** The processing time dropped to 1.367 seconds, pushing the rate to 73.14 BPS. This marked improvement over the slower JavaScript approach underscores Python's efficiency in scaling with data volume and worker allocation. * **Fetching 250 blocks:** This test achieved a performance of 108.76 BPS within just 2.299 seconds, highlighting Python's exceptional ability to manage intermediate data sets efficiently, leveraging the optimal number of workers. * **Fetching 500 blocks:** As the data volume increased, the processing time of 3.143 seconds for a rate of 159.07 BPS exemplified Python's robustness in handling large-scale data-fetching tasks, with worker count playing a key role in maximizing efficiency. * **Fetching 1000 blocks:** In our most comprehensive test, the process was completed in 5.682 seconds, achieving a rate of 175.99 BPS. This outcome showcased Python's scalability and performance, even with massive data sets, attributed to the strategic use of 60 workers. The previous benchmarks were conducted on an M2 MacBook Pro 12-Core CPU with 16GB of memory, using a Chainstack Ethereum Global endpoint. [Start for free](https://console.chainstack.com/) and get your app to production levels immediately. ### Benchmark overview Our benchmark tests clearly show that two main factors impact the efficiency of data processing tasks: the volume of data being processed (measured in blocks) and how many workers are running concurrently. The hardware used in these tests—specifically, the CPU's multi-core design and fast memory access—significantly reduces the overhead of managing and switching between multiple concurrent tasks, enhancing the throughput of block-fetching operations.
This finding highlights the need to consider hardware when optimizing data processing tasks. Solutions like Chainstack Compare can illuminate how RPC nodes distribute data and the varying performance levels across providers by considering the setup of concurrent workers and making the most of certain hardware features. ## Overview of Chainstack Compare The deployed version of Chainstack Compare runs on more basic hardware. Currently, that is a machine with 2 vCPUs and up to 2 GB of RAM. It highlights the performance difference between RPC nodes but displays less impressive BPS figures. ### Chainstack Compare current specs At this moment, Chainstack Compare is deployed on the following specs: * A machine with 2 vCPUs * 2 GB of RAM * 8 workers configured In evaluating the performance of an RPC node, the ability to rapidly fetch data is crucial but not the only consideration. This becomes particularly relevant in scenarios where a provider imposes stringent rate-limiting. To address this, Chainstack Compare adopts a comprehensive approach to testing RPC nodes that is both realistic and practical, focusing on several key aspects: * Ensuring users understand the tool's architecture and how its integration with specific hardware impacts data retrieval capabilities. * Fetching a specific quantity of actual data to provide tangible performance metrics, in this case, a certain number of blocks in the past. * Measuring how fast an RPC node can retrieve data from the blockchain, quantified by the number of blocks per second it can process and return to the user. * Managing rate-limiting errors and presenting users with data that includes the proportion of unsuccessful requests. This approach allows Chainstack Compare to offer nuanced insights into RPC node performance, considering the complex interplay between software capabilities, hardware limitations, and provider restrictions.
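The approach described throughout this article, asynchronous multithreaded fetching with explicit failure tracking, can be sketched as follows. This is a simplified illustration: the fetch function is a stub standing in for real `eth_getBlockByNumber` calls, and the worker count and block numbers are arbitrary placeholders, not the tool's actual code:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch_block(block_number: int) -> dict:
    """Stub for a real eth_getBlockByNumber RPC call."""
    time.sleep(0.005)  # simulate network latency
    if block_number % 50 == 0:
        return {"error": "rate limit exceeded"}  # simulated rate-limit failure
    return {"number": block_number}

def benchmark(start: int, count: int, workers: int = 8) -> dict:
    """Fetch `count` blocks concurrently; report BPS and failed-request share."""
    t0 = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(fetch_block, range(start, start + count)))
    elapsed = time.perf_counter() - t0
    failed = sum(1 for r in results if "error" in r)
    return {
        "bps": round(count / elapsed, 2),
        "failed_pct": round(100 * failed / count, 2),
    }

stats = benchmark(start=19_000_001, count=100, workers=8)
print(stats)
```

Reporting the failure percentage alongside BPS mirrors how Chainstack Compare surfaces rate-limit effects rather than hiding them in an averaged throughput figure.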
## Conclusion Chainstack Compare offers a deep dive into RPC node performance that traditional metrics often overlook. Throughout this article, we've explored its unique approach to evaluating data fetching efficiency, a critical component for the smooth operation of DApps. By integrating Python concurrency features, Chainstack Compare highlights the limitations of conventional methods and sets a new standard in node performance analysis. This tool is a testament to the importance of precision and adaptability in the blockchain domain, empowering developers with the insights needed to optimize DApps for the challenges of tomorrow. # Overview Source: https://docs.chainstack.com/docs/chainstack-data-leak-protection-tool-introduction In the digital age, data privacy and security have become paramount. As we increasingly rely on artificial intelligence (AI) models for various tasks, we must ensure that our interactions with these models do not inadvertently expose sensitive information. This is particularly relevant when using AI-powered chatbots like ChatGPT, where users often input data that could be personal or sensitive. The Chainstack DLP browser extension is designed to enhance the privacy and security of your interactions with ChatGPT. This extension works by redacting potentially sensitive information before it's sent to ChatGPT for processing, including names, addresses, API keys, JWTs, etc. Currently, **Chainstack DLP** is available either locally from the source code or through installation on the [Chrome Web Store](https://chrome.google.com/webstore/detail/chainstack-dlp/miihnlobablmmphiogghcpeebijgpgce?hl=en\&authuser=0). ## Processing sensitive data To function as a genuine data loss prevention (DLP) tool, Chainstack DLP performs all processing locally. This means the extension does not rely on any external APIs, ensuring that your data never leaves your local environment.
Trust is a critical factor in data security, and to uphold this trust, we've made our tool 100% open-source. This transparency allows you to verify the security measures we've implemented. You can install the extension directly from the store, or, for those who prefer, you can also install it locally using this repository. The Chainstack DLP tool employs regular expression patterns to detect potentially sensitive data, such as API keys, credit card numbers, JWTs, etc. Additionally, it utilizes the [compromise package](https://github.com/spencermountain/compromise) v14.9.0, a robust JavaScript library for natural language processing (NLP). This library aids in the identification of personal and business identifiers, including names, addresses, and company names. ## Source code The source code of the Chainstack DLP tool is 100% open source and is available on the Chainstack Labs GitHub account. ## Usage The Chainstack DLP tool introduces two buttons and a small preview window above the ChatGPT input bar. The `↕` button serves two functions: expanding and minimizing the preview window. Meanwhile, the clear button deletes all content within the ChatGPT input bar. A popup window is provided to manage the redaction tool. Users can enable or disable the tool and select or unselect specific types of data to redact. However, redaction of particularly sensitive data, such as credit card patterns, JSON Web Tokens, Ethereum private keys, and phone numbers, cannot be disabled. Any changes made within the popup window will update live. However, you must empty the ChatGPT input bar and re-enter the content to see these changes. A preview is available to check the content before submitting it to ChatGPT. If something appears to be malfunctioning, reloading the page should resolve the issue.
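The regex-based detection described above can be illustrated with a minimal sketch. Python is used here for brevity (the extension itself is JavaScript), and these patterns are simplified stand-ins for illustration, not the extension's actual ones:

```python
import re

# Simplified stand-in patterns; the real extension uses more robust ones.
PATTERNS = {
    "jwt": re.compile(r"\beyJ[\w-]+\.[\w-]+\.[\w-]+\b"),
    "eth_private_key": re.compile(r"\b0x[0-9a-fA-F]{64}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def redact(text: str) -> str:
    """Replace anything matching a sensitive pattern with a placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED {label.upper()}]", text)
    return text

prompt = "Debug this: my key is 0x" + "a" * 64 + " and token eyJhbGciOi.eyJzdWIi.c2ln"
print(redact(prompt))
```

Running every pattern locally before the text leaves the page is what makes this a genuine DLP approach: nothing sensitive is ever sent to an external service for classification.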
# Chainstack Marketplace Source: https://docs.chainstack.com/docs/chainstack-marketplace-tutorials This set of guides provides instructions on how to seamlessly integrate with the Chainstack Marketplace, as well as how to effectively utilize the wide range of services from third-party vendors available on the platform. Whether you're a seasoned user or new to the platform, these guides will offer valuable insights and help you get the most out of your experience with the Chainstack Marketplace. ## Guides list # Chainstack Subgraphs Source: https://docs.chainstack.com/docs/chainstack-subgraphs-tutorials This collection of tutorials will help you use your subgraphs to the fullest. Before you start, make sure you learn the basics of [Chainstack Subgraphs](/docs/subgraphs-introduction). Explore how you can interact with subgraphs by following our series: # Best practices handbook Source: https://docs.chainstack.com/docs/chainstack-web3-development-best-practices The following guides focus on implementing best practices for building secure and efficient decentralized applications (DApps). They cover a range of essential topics, and by following these best practices, developers can create DApps that not only meet their functional requirements but also provide a reliable and secure experience for end-users, contributing to a more robust decentralized ecosystem. ## Guides list # Creating a subgraph for upgradeable proxy contracts: developer's guide Source: https://docs.chainstack.com/docs/creating-a-subgraph-for-upgradeable-proxy-contracts-a-developers-guide In this article, we explore the intricacies of developing a subgraph for upgradeable proxy contracts, a crucial component in modern smart contract development. This guide aims to provide a clear understanding and methodology for handling such contracts, ensuring a seamless update process.
### **Understanding Upgradeable Proxy Contracts** Upgradeable Proxy Contracts, as explained in resources like OpenZeppelin, offer a dynamic solution for updating smart contracts. They allow for the modification of the contract's logic over time, while the user interface remains unchanged. This separation of core functionality from the user interaction layer is pivotal in maintaining a stable yet adaptable contract environment. ### **The Challenge with Contract Events** A significant challenge in updating proxy contracts arises when altering contract events, especially when these events need to accommodate different data types or structures. A common issue is the limitation in indexing the same event name with varied parameter structures, a frequent obstacle for subgraph developers. ### **The Strategy: Renaming Events for Legacy and New Data** The key strategy here is to rename events when they are updated. This allows for indexing both the legacy data and the new data. It involves modifying the contract's ABI (Application Binary Interface) to include the legacy event with a new name. ### **Example: Updating Events in a Sample Contract** Let's consider a hypothetical smart contract, MarketExchange.sol. We'll demonstrate how to transition from a legacy event to a revised version. Legacy Event (Version 1):

```solidity
event LegacyTrade(
    bytes32 indexed tradeId,
    bytes32 indexed tradePair,
    address indexed initiator,
    ERC20 assetSent,
    ERC20 assetReceived,
    uint128 sentAmount,
    uint128 receivedAmount,
    uint64 tradeTimestamp
);
```

Updated Event (Version 2):

```solidity
event NewTrade(
    bytes32 indexed tradeId,
    bytes32 indexed tradePair,
    address indexed initiator,
    ERC20 assetSent,
    ERC20 assetReceived,
    uint128 sentAmount,
    uint128 receivedAmount
);
```

In this update, the `tradeTimestamp` field is removed, considering its retrievability from the transaction context.
### **Modifying Contract ABIs for Subgraph Compatibility**

The next essential step is to update the ABI in our subgraph to recognize the legacy event under its new name.

**Configuring the subgraph.yaml file**

The `subgraph.yaml` file is then adjusted to index events from both the legacy and the current versions of the contract:

```yaml YAML
abis:
  - name: MarketExchange
    file: ./abis/MarketExchangeOptimism.json
eventHandlers:
  - event: NewTrade(...)
    handler: handleNewTrade
    ...
  - event: LegacyTrade(...)
    handler: handleLegacyTrade
    ...
```

### **Conclusion**

Through the use of proxy contracts and adherence to event-driven development principles, we can navigate contract version updates seamlessly. This method ensures data preservation and maintains the ecosystem built around a consistent access address. Adapting our subgraphs to index both legacy and current events exemplifies the flexibility and robustness of blockchain applications.

### About the author

Product Lead @ Chainstack BUIDLs on Ethereum and Graph protocol Majored in Data Science and Product Management [](https://github.com/balakhonoff) [](https://twitter.com/balakhonoff) [](https://www.linkedin.com/in/kirill-balakhonov/)

# Cronos methods

Source: https://docs.chainstack.com/docs/cronos-methods

See also [interactive Cronos API call examples](/reference/getting-started-cronos).
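Every method in the table that follows is invoked over JSON-RPC against your node endpoint. As a quick illustration, here is a sketch of how such a request payload is constructed in Python; the helper name is ours, and `YOUR_CHAINSTACK_ENDPOINT` is a placeholder, so actually POSTing it requires a live node:

```python
import json

def build_rpc_request(method, params=None, request_id=1):
    """Construct a JSON-RPC 2.0 payload as sent to a node endpoint."""
    return json.dumps({
        "jsonrpc": "2.0",
        "method": method,
        "params": params or [],
        "id": request_id,
    })

payload = build_rpc_request("eth_blockNumber")
# POST the payload to YOUR_CHAINSTACK_ENDPOINT, for example with the
# requests library:
# requests.post("YOUR_CHAINSTACK_ENDPOINT", data=payload,
#               headers={"Content-Type": "application/json"})
print(payload)
```

The same envelope works for any method in the table; only `method` and `params` change.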
| Method | Availability | Comment | | ---------------------------------------- | --------------------------------------------- | ------- | | eth\_accounts | | | | eth\_blockNumber | | | | eth\_call | | | | eth\_chainId | | | | eth\_estimateGas | | | | eth\_feeHistory | | | | eth\_gasPrice | | | | eth\_getAccount | | | | eth\_getBalance | | | | eth\_getBlockByHash | | | | eth\_getBlockByNumber | | | | eth\_getBlockReceipts | | | | eth\_getBlockTransactionCountByHash | | | | eth\_getBlockTransactionCountByNumber | | | | eth\_getCode | | | | eth\_getFilterChanges | | | | eth\_getFilterLogs | | | | eth\_getLogs | | | | eth\_getProof | | | | eth\_getStorageAt | | | | eth\_getTransactionByBlockHashAndIndex | | | | eth\_getTransactionByBlockNumberAndIndex | | | | eth\_getTransactionByHash | | | | eth\_getTransactionCount | | | | eth\_getTransactionReceipt | | | | eth\_getUncleCountByBlockHash | | | | eth\_getUncleCountByBlockNumber | | | | eth\_maxPriorityFeePerGas | | | | eth\_newBlockFilter | | | | eth\_newFilter | | | | eth\_newPendingTransactionFilter | | | | eth\_signTransaction | | | | eth\_subscribe | | | | eth\_syncing | | | | eth\_uninstallFilter | | | | eth\_unsubscribe | | | | eth\_sendRawTransaction | | | | net\_listening | | | | net\_peerCount | | | | net\_version | | | | web3\_clientVersion | | | | web3\_sha3 | | | | debug\_getBadBlocks | | | | debug\_storageRangeAt | | | | debug\_traceBlock | | | | debug\_traceBlockByHash | | | | debug\_traceBlockByNumber | | | | debug\_traceCall | | | | debug\_traceTransaction | | | | admin\_addPeer | | | | admin\_addTrustedPeer | | | | admin\_datadir | | | | admin\_exportChain | | | | admin\_importChain | | | | admin\_nodeInfo | | | | admin\_peerEvents | | | | admin\_peers | | | | admin\_removePeer | | | | admin\_removeTrustedPeer | | | | admin\_startHTTP | | | | admin\_startWS | | | | admin\_stopHTTP | | | | admin\_stopWS | | | # Cronos tooling Source: https://docs.chainstack.com/docs/cronos-tooling ## MetaMask On [node 
access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**.

## Truffle

Configure [Truffle Suite](https://truffleframework.com) to deploy contracts to your Cronos nodes.

Install [Truffle Suite](https://truffleframework.com), [HD Wallet-enabled Web3 provider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider), and create a project.

Create a new environment in `truffle-config.js`, add your mnemonic phrase generated by [a wallet](https://ethereum.org/en/wallets/find-wallet/), and use the Cronos endpoint instead of YOUR\_CHAINSTACK\_ENDPOINT:

```javascript Javascript
const HDWalletProvider = require("@truffle/hdwallet-provider");
const mnemonic = 'pattern enroll upgrade ...';
...
module.exports = {
  networks: {
    chainstack: {
      provider: () => new HDWalletProvider(mnemonic, "YOUR_CHAINSTACK_ENDPOINT"),
      network_id: "*"
    },
  }
};
```

## Hardhat

Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your Cronos nodes.

Install [Hardhat](https://hardhat.org/) and create a project.

Create a new environment in `hardhat.config.js`:

```javascript Javascript
require("@nomiclabs/hardhat-waffle");
...
module.exports = {
  solidity: "0.7.3",
  networks: {
    chainstack: {
      url: "YOUR_CHAINSTACK_ENDPOINT",
      accounts: ["YOUR_PRIVATE_KEY"]
    },
  }
};
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials).
* YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract.

Run `npx hardhat run scripts/deploy.js --network chainstack` and Hardhat will deploy using Chainstack.

See also [Forking EVM-compatible mainnet with Hardhat](https://support.chainstack.com/hc/en-us/articles/900004242406).
## Remix IDE To make Remix IDE interact with the network through a Chainstack node: Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Interacting through MetaMask](#metamask). In Remix IDE, navigate to the **Deploy** tab. Select **Injected Provider - MetaMask** in **Environment**. This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node. ## web3.js Build DApps using [web3.js](https://github.com/ethereum/web3.js/) and Cronos nodes deployed with Chainstack. Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP or WebSocket. ### HTTP Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. ### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node WSS endpoint protected either with the key or password. ## web3.py Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Cronos nodes deployed with Chainstack. Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `HTTPProvider` to connect to your node endpoint and get the latest block number. 
```python Key Protected
from web3 import Web3

web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT'))
print(web3.eth.blockNumber)
```

```python Password Protected
from web3 import Web3

web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s' % ("USERNAME", "PASSWORD", "HOSTNAME")))
print(web3.eth.blockNumber)
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password
* HOSTNAME — your node HTTPS endpoint hostname
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).

### WebSocket

Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number.

```python Key Protected
from web3 import Web3

web3 = Web3(Web3.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT'))
print(web3.eth.blockNumber)
```

```python Password Protected
from web3 import Web3

web3 = Web3(Web3.WebsocketProvider('wss://%s:%s@%s' % ("USERNAME", "PASSWORD", "HOSTNAME")))
print(web3.eth.blockNumber)
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password
* HOSTNAME — your node WSS endpoint hostname
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)

See also [WebSocket connection to an EVM node](https://support.chainstack.com/hc/en-us/articles/900001918763-WebSocket-connection-to-an-Ethereum-node).

## web3.php

Build DApps using [web3.php](https://github.com/web3p/web3.php) and Cronos nodes deployed with Chainstack.

Install [web3.php](https://github.com/web3p/web3.php). Connect over HTTP:

```php Php
<?php
require_once 'vendor/autoload.php';

use Web3\Web3;

$web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT');
?>
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password.

Use [JSON-RPC methods](https://eth.wiki/json-rpc/API) to interact with the node.
Example to get the latest block number:

```php Php
<?php
require_once 'vendor/autoload.php';

use Web3\Web3;

$web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT');
$eth = $web3->eth;

$eth->blockNumber(function ($err, $data) {
    print "$data \n";
});
?>
```

## web3j

Build DApps using [web3j](https://github.com/web3j/web3j) and Cronos nodes deployed with Chainstack.

Use the `HttpService` object to connect to your node endpoint. Example to get the latest block number:

```java Java
package getLatestBlock;

import java.io.IOException;
import java.util.logging.Level;
import java.util.logging.Logger;

import org.web3j.protocol.Web3j;
import org.web3j.protocol.core.DefaultBlockParameterName;
import org.web3j.protocol.core.methods.response.EthBlock;
import org.web3j.protocol.exceptions.ClientConnectionException;
import org.web3j.protocol.http.HttpService;

import okhttp3.Authenticator;
import okhttp3.Credentials;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;
import okhttp3.Route;

public final class App {

    private static final String USERNAME = "USERNAME";
    private static final String PASSWORD = "PASSWORD";
    private static final String ENDPOINT = "ENDPOINT";

    public static void main(String[] args) {
        try {
            OkHttpClient.Builder clientBuilder = new OkHttpClient.Builder();
            clientBuilder.authenticator(new Authenticator() {
                @Override
                public Request authenticate(Route route, Response response) throws IOException {
                    String credential = Credentials.basic(USERNAME, PASSWORD);
                    return response.request().newBuilder().header("Authorization", credential).build();
                }
            });
            HttpService service = new HttpService(ENDPOINT, clientBuilder.build(), false);
            Web3j web3 = Web3j.build(service);

            EthBlock.Block latestBlock = web3.ethGetBlockByNumber(DefaultBlockParameterName.LATEST, false).send().getBlock();
            System.out.println("Latest Block: #" + latestBlock.getNumber());
        } catch (IOException | ClientConnectionException ex) {
            Logger.getLogger(App.class.getName()).log(Level.SEVERE, null, ex);
        }
    }
}
```

where

* ENDPOINT — your node HTTPS endpoint
* USERNAME — your node access username
* PASSWORD — your
node access password.

See also [the full code on GitHub](https://github.com/chainstack/web3j-getLatestBlock).

## ethers.js

Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Cronos nodes deployed with Chainstack.

Install [ethers.js](https://www.npmjs.com/package/ethers). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket).

### HTTP

Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number:

```javascript Key Protected
const { ethers } = require("ethers");

var urlInfo = {
  url: 'YOUR_CHAINSTACK_ENDPOINT'
};
var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID);

provider.getBlockNumber().then(console.log);
```

```javascript Password Protected
const { ethers } = require("ethers");

var urlInfo = {
  url: 'YOUR_CHAINSTACK_ENDPOINT',
  user: 'USERNAME',
  password: 'PASSWORD'
};
var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID);

provider.getBlockNumber().then(console.log);
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)
* NETWORK\_ID — Cronos network ID:
  * Mainnet: `25`
  * Testnet: `338`

See [node access details](/docs/manage-your-node#view-node-access-and-credentials).
### WebSocket

Use the `WebSocketProvider` object to connect to your node WSS endpoint and get the latest block number:

```javascript Javascript
const { ethers } = require("ethers");

const provider = new ethers.providers.WebSocketProvider('YOUR_CHAINSTACK_ENDPOINT', NETWORK_ID);

provider.getBlockNumber().then(console.log);
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password
* NETWORK\_ID — Cronos network ID:
  * Mainnet: `25`
  * Testnet: `338`

## Brownie

Install [Brownie](https://eth-brownie.readthedocs.io/en/stable/install.html).

Use the `brownie networks add` command with the node endpoint:

```shell Shell
brownie networks add Cronos ID name="NETWORK_NAME" host=YOUR_CHAINSTACK_ENDPOINT chainid=NETWORK_ID
```

where

* ID — any name that you will use as the network tag to run a deployment. For example, `chainstack-mainnet`.
* NETWORK\_NAME — any name that you want to identify the network by in the list of networks. For example, **Mainnet (Chainstack)**.
* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password
* NETWORK\_ID — Cronos network ID:
  * Mainnet: `25`
  * Testnet: `338`

Example to run the deployment script:

```shell Shell
brownie run deploy.py --network chainstack-mainnet
```

## Foundry

Install [Foundry](https://github.com/gakonst/foundry/). Use `--rpc-url` to run the operation through your Chainstack node.

### Forge

Use forge to develop, test, and deploy your smart contracts.
To deploy a contract:

```shell Shell
forge create CONTRACT_NAME --contracts CONTRACT_PATH --private-key YOUR_PRIVATE_KEY --rpc-url YOUR_CHAINSTACK_ENDPOINT
```

where

* CONTRACT\_NAME — name of the contract in the Solidity source code
* CONTRACT\_PATH — path to your smart contract
* YOUR\_PRIVATE\_KEY — the private key to your funded account that you will use to deploy the contract
* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password

### Cast

Use cast to interact with the network and the deployed contracts. To get the latest block number:

```shell Shell
cast block-number --rpc-url YOUR_CHAINSTACK_ENDPOINT
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password.

# Cronos: Dutch auction smart contract with Hardhat

Source: https://docs.chainstack.com/docs/cronos-tutorial-dutch-auction-smart-contracts-on-cronos-with-hardhat

**TLDR**

* This tutorial shows how to create and deploy a Dutch auction NFT contract on the Cronos testnet using Hardhat.
* The price starts high and lowers at fixed intervals; buyers can mint an NFT once the price is acceptable.
* You'll set up Hardhat, code and compile the contract, then verify and interact with it using Cronoscan.
* The same approach works for mainnet once you're comfortable testing everything on the Cronos testnet.

## Main article

A Dutch auction is a type of auction in which the initial price of an NFT starts at its ceiling and lowers by a small amount at set intervals. Buyers make bids at reduced prices until the end of the auction. A Dutch auction continues until either all assets are sold out or the auction time ends.

The objective of this tutorial is to familiarize you with the Cronos network, Hardhat, and Dutch auction smart contracts. By the end of the tutorial, you will be able to create a simple Dutch auction smart contract to help you sell your NFTs to the highest bidder.
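The linear price decay described above can be sketched numerically before touching any Solidity. This is a back-of-the-envelope Python model, not part of the tutorial's code; the parameters mirror the contract built later in this tutorial (10 ether start price, 5 ether floor, a 0.01 ether discount per minute over 500 minutes), and prices are tracked as integer wei the way Solidity itself does:

```python
# Track prices in integer wei to avoid float rounding, as Solidity does.
ETHER = 10**18
START_PRICE = 10 * ETHER     # ceiling at auction start
END_PRICE = 5 * ETHER        # floor once the duration has elapsed
DISCOUNT_RATE = ETHER // 100 # 0.01 ether shaved off per elapsed minute
DURATION = 500               # minutes until the floor is reached

def auction_price(minutes_elapsed: int) -> int:
    """Linear Dutch auction price: starts high, decays to a fixed floor."""
    if minutes_elapsed >= DURATION:
        return END_PRICE
    return START_PRICE - minutes_elapsed * DISCOUNT_RATE

print(auction_price(0) / ETHER)    # 10.0
print(auction_price(250) / ETHER)  # 7.5
print(auction_price(600) / ETHER)  # 5.0
```

A buyer simply waits until `auction_price` drops to a value they are willing to pay, then mints; that is the whole mechanic the contract implements on-chain.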
Specifically, in this tutorial, you will: * Create a Dutch auction smart contract. * Deploy and verify the contract on the Cronos testnet through a node deployed with Chainstack. * Interact with the deployed contract. ## Prerequisites * [Chainstack account](https://console.chainstack.com/) to deploy a Cronos testnet node. * [Hardhat](https://hardhat.org/docs) to compile and deploy the contract. * [MetaMask](https://docs.metamask.io/guide/) to interact with the contract through your Chainstack node. ## Overview To get from zero to deploying your own Dutch auction smart contract on the Cronos testnet, do the following: With Chainstack, create a public chain project. With Chainstack, join the Cronos testnet. With Chainstack, access your Cronos node endpoint. With Hardhat, set up your development environment. With Hardhat, create and compile your Dutch auction contract. With MetaMask, fund your wallet with Cronos test tokens (CRO) from the [Cronos official faucet](https://cronos.org/faucet). With Hardhat, deploy your Dutch auction contract. With [Cronos testnet explorer](https://testnet.cronoscan.com/), interact with your Dutch auction contract. ## Step-by-step ### Create a public chain project See [Create a project](/docs/manage-your-project#create-a-project). ### Join the Cronos testnet See [Join a public network](/docs/manage-your-networks#join-a-public-network). ### Get your Cronos node endpoint See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials). ### Install Hardhat See [Hardhat documentation](https://hardhat.org/hardhat-runner/docs/getting-started#installation). ### Initialize a Hardhat project In your project directory, run `npx hardhat`. Select **Create a JavaScript project** and agree to install the sample project's dependencies. This will create a sample project directory with a smart contract draft. ### Install additional dependencies To complete the project, we need to install several additional dependencies. 
The [OpenZeppelin Contract library](https://docs.openzeppelin.com/) allows you to inherit smart contracts. To install it, run: ```bash Shell npm i @openzeppelin/contracts ``` The [dotenv library](https://github.com/motdotla/dotenv) allows you to export and keep sensitive data securely. To install it, run: ```bash Shell npm i dotenv ``` The [hardhat-etherscan](https://hardhat.org/hardhat-runner/plugins/nomiclabs-hardhat-etherscan) and [hardhat-cronoscan](https://docs.cronos.org/for-dapp-developers/cronos-smart-contract/contract-verification) plugins allow you to verify your contracts on the Cronos testnet. To install them, run: ```bash Shell npm i --save-dev @nomiclabs/hardhat-etherscan@^3.1.0 @cronos-labs/hardhat-cronoscan ``` You need to create an environment file to store your sensitive data with the project. To create it, in your project directory, run: ```bash Shell touch .env ``` ### Create and compile the contract 1. Navigate to your previously created project directory and go to the `contracts` directory. In the directory, create your Dutch auction smart contract: `DutchAuction.sol`. 
```solidity solidity // SPDX-License-Identifier: MIT pragma solidity ^0.8.0; import "@openzeppelin/contracts/token/ERC721/ERC721.sol"; import "@openzeppelin/contracts/access/Ownable.sol"; import "@openzeppelin/contracts/utils/Counters.sol"; contract MyToken is ERC721, Ownable { using Counters for Counters.Counter; Counters.Counter private _tokenIdCounter; uint256 public immutable startPrice = 10 ether; uint256 public immutable startAt; uint256 public immutable endsAt; uint256 public immutable endPrice = 5 ether; uint256 public immutable discountRate = 0.01 ether; uint256 public duration = 500 minutes; uint256 public immutable MAX_SUPPLY = 100; mapping (address => bool) public onlyOne; constructor() ERC721("CronosNFT", "CroNFT") { startAt = block.timestamp; endsAt = block.timestamp + duration; } function price() public view returns (uint256) { if (block.timestamp > endsAt ) { return endPrice; } uint256 minutesElapsed = (block.timestamp - startAt) / 60; return startPrice - (minutesElapsed * discountRate); } function safeMint(address to) public payable { require(msg.value >= price(), "Not enough ether sent"); require(!onlyOne[msg.sender], "Sorry, Address already has 1 NFT!"); uint256 tokenId = _tokenIdCounter.current(); require(tokenId < MAX_SUPPLY, "No more items left."); _safeMint(to, tokenId + 1); _tokenIdCounter.increment(); onlyOne[msg.sender]=true; } function withdraw() public onlyOwner { payable(owner()).transfer(address(this).balance); } } ``` The contract implementation is the following: * The price of an NFT starts at 10 ether and decreases to 5 ether over 500 minutes. A maximum of 100 NFTs can be minted. * The contract uses the `onlyOne` mapping to ensure that each address can only own one NFT from your collection. * The contract sets the values of the start and end price, which cannot be changed later on. * The function `price()` uses `block.timestamp` to calculate the price of an NFT each time the `safeMint` function is called. 
* If all the required conditions are satisfied, the wallet address gets an NFT minted. 2. To compile the contract, in your project directory, run: ```bash Shell npx hardhat compile ``` ### Fund your account To deploy your smart contract on the Cronos testnet, you will need some test CRO. See the [Cronos faucet](https://cronos.org/faucet). ### Set up the environment and configuration files 1. In your project directory, navigate to a previously created environment file and edit it to add the following data: * `RPC_URL` — your Cronos node HTTPS endpoint deployed with Chainstack. See also [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials). * `PRIVATE_KEY` — the private key of your MetaMask wallet that has a sufficient amount of TCRO. See also the [Cronos faucet](https://cronos.org/faucet). * `API_KEY` — a Cronos API key to verify the deployed smart contract using the Cronoscan and Etherscan plugins. Create an API key in the [Cronos blockchain explorer](https://docs.cronos.org/block-explorers/block-explorer-and-api-keys#creating-api-keys-on-cronoscan). Example of the environment file data: ```env env RPC_URL=YOUR_CHAINSTACK_ENDPOINT PRIVATE_KEY=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXu7 API_KEY=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXG5 ``` 2. 
In your project directory, navigate to the Hardhat configuration file and replace the current data with the following data: ```js JavaScript require("@nomicfoundation/hardhat-toolbox"); require('dotenv').config(); require("@nomiclabs/hardhat-etherscan"); require("@cronos-labs/hardhat-cronoscan"); module.exports = { solidity: "0.8.10", networks: { Cronos_testnet: { url: `${process.env.RPC_URL}`, accounts: [process.env.PRIVATE_KEY] }, }, etherscan: { apiKey: { cronosTestnet: `${process.env.API_KEY}`, }, }, }; ``` where * `dotenv` — library to import your sensitive data from the environment file securely * `Cronos_testnet` — the testnet to deploy the contract to * `url` — your Cronos node HTTPS endpoint imported from the environment file * `accounts` — your MetaMask wallet private key imported from the environment file * `etherscan` — your Cronos API key imported from the environment file to verify your contract ### Deploy the Dutch auction contract 1. In your project directory, navigate to the scripts directory and create the `DeployDutch.js` file. 2. Edit the `DeployDutch.js` file to add the basic deployment script of Hardhat: ```js JavaScript const hre = require("hardhat"); async function main() { const CronosToken = await hre.ethers.getContractFactory("MyToken"); console.log("Deploying your contract, please Wait....."); const cronosToken = await CronosToken.deploy(); await cronosToken.deployed(); console.log("CronosToken deployed to:", cronosToken.address); } main() .then(() => process.exit(0)) .catch((error) => { console.error(error); process.exit(1); }); ``` 3. In your project directory, run the following script: ```bash Shell npx hardhat run scripts/DeployDutch.js --network Cronos_testnet ``` The contract will deploy and the terminal will return the contract address. Use this address to verify and interact with your contract. ### Verify your contract on the Cronos testnet Once your contract is deployed, verify it on the Cronos testnet. 
In your terminal, run:

```bash Shell
npx hardhat verify --network Cronos_testnet CONTRACT_ADDRESS
```

where CONTRACT\_ADDRESS is your contract address returned in the previous step.

### Interact with the contract

Now that your contract is verified, Cronoscan is effectively a front-end instance for your contract.

## Conclusion

This tutorial guided you through the basics of using Hardhat to deploy a Dutch auction smart contract to the Cronos testnet and verify it using the Etherscan and Cronoscan plugins. This tutorial uses the testnet; however, the exact same instructions and sequence work on the mainnet.

### About the author

Developer Advocate @ Chainstack BUIDLs on Ethereum, zkEVMs, The Graph protocol, and IPFS Part-time Rust aficionado [](https://github.com/Genesis3800) [](https://twitter.com/PriyankGupta03) [](https://www.linkedin.com/in/priyank-gupta-0308/)

# cryo + Chainstack: Developer's guide to blockchain data mastery with Python

Source: https://docs.chainstack.com/docs/cryo-with-chainstack-and-python

**TLDR**

* Integrate **cryo** (a Rust-based CLI for blockchain data extraction) with Python via **cryo\_python** for streamlined data collection and analysis.
* **cryo.collect()** fetches data into Python-friendly objects (Pandas, Polars, etc.) for real-time analysis, while **cryo.freeze()** saves data to files in JSON/Parquet/CSV.
* Combine Chainstack's high-performance Global Nodes with Python's libraries (pandas, matplotlib, etc.) to query, process, and visualize blockchain data with speed and precision.

## Main article

In blockchain data exploration, we previously introduced you to `cryo`, Paradigm's powerful command-line interface tool. As you might recall, this tool is a beacon for developers, researchers, and blockchain enthusiasts, optimizing the process of extracting data from various blockchain networks. Our initial journey through `cryo` revealed its data formatting efficiency and seamless integration with Chainstack [Global Nodes](/docs/global-elastic-node).
Learn how to use `cryo` and how it works with **[cryo: Your gateway to blockchain data](/docs/cryo-your-gateway-to-blockchain-data#basic-usage-of-cryo)**.

Now, we embark on a sequel, bridging `cryo` with the world of Python. This guide will show you how to use the Python wrapper made for the `cryo` CLI by covering setup, basic usage, and data extraction and manipulation using common Python libraries.

## Python and cryo for blockchain data manipulation

Python is known for its simplicity and its data manipulation and analysis capabilities. The Python wrapper allows you to couple `cryo`'s Rust-based efficiency for data extraction with Python's data manipulation capabilities. This integration enhances the analytical power at your fingertips, allowing you to leverage Python's rich library ecosystem for in-depth data analysis, visualization, and machine learning.

## Prerequisites and setup

This section lays the groundwork for integrating the `cryo` tool with Python: ensuring that your system has the necessary tools and libraries, and installing the Python wrapper for `cryo`.

### Prerequisites

Before diving into the installation process, ensure your environment is primed for the task. The following prerequisites are essential:

* **Chainstack Global Node RPC**: Get a high-performance Chainstack [Global Node](/docs/global-elastic-node) RPC before starting. Follow these steps to deploy an Ethereum node:
  1. [Sign up with Chainstack](https://console.chainstack.com/user/account/create).
  2. [Deploy a node](/docs/manage-your-networks#join-a-public-network).
  3. [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials).

To follow this guide, deploy a **Standard** Ethereum node, which will default to a Global Node.
Once you deploy the node, you'll have access to an RPC endpoint, which will look like this:

```
https://ethereum-mainnet.core.chainstack.com/YOUR_CREDENTIALS
```

Create a `.env` file in your root directory and place the endpoint in it:

```python Python
ETH_RPC="https://ethereum-mainnet.core.chainstack.com/YOUR_CREDENTIALS"
```

* **Rust**: Rust must be installed on your system for `cryo` to work. The Python integration is a lightweight wrapper for the `cryo` CLI, so you'll still need to meet the app's requirements. Install Rust following the [rustup](https://rustup.rs/) instructions.
* **Python environment**: Ensure that you have Python installed on your system and create a new virtual environment in your project's directory; you can run the following:

```bash Bash
python3 -m venv cryo-and-python
```

Then activate the virtual environment with:

```bash Bash
source cryo-and-python/bin/activate
```

* **Required libraries**: `cryo_python` depends on several libraries. Make sure to install the following:

```bash Bash
pip install maturin pandas polars pyarrow python-dotenv web3 matplotlib
```

Note that the `python-dotenv`, `web3`, and `matplotlib` libraries are not strictly required to run `cryo_python`, but we'll use them along the guide.

### Installation and Setup

With the prerequisites in place, let's move on to the installation steps:

**Clone the cryo repository**: Use git to clone the `cryo` repository from GitHub. If you don't have git installed, you can download it from [git](https://git-scm.com/downloads).

```bash Bash
git clone https://github.com/paradigmxyz/cryo
```

**Navigate to the Python directory**:

```bash Bash
cd cryo/crates/python
```

**Build `cryo_python`**: Run the `maturin` build command:

```bash Bash
maturin build --release
```

This command will compile the Rust code and create a wheel file (.whl) for the Python package.

**Install the Python wrapper**: Find the `.whl` file generated by maturin.
It will be located in the `target/wheels` directory. Install the wheel file using pip:

```bash Bash
pip install --force-reinstall PATH_TO_WHEEL_FILE.whl
```

Replace `PATH_TO_WHEEL_FILE.whl` with the actual path to the `.whl` file generated; it will look like this:

```bash Bash
/YOUR_PATH/cryo/target/wheels/cryo_python-0.3.0-cp310-cp310-macosx_11_0_arm64.whl
```

## Basic Usage of `cryo_python`

`cryo_python` is a lightweight wrapper for the `cryo` CLI that offers a seamless Python interface to the powerful CLI commands. With `cryo_python`, users can access two principal functions that mirror their CLI counterparts:

* `cryo.collect()` extracts blockchain data and returns it as a Python-friendly data frame, enabling direct use within scripts for real-time analysis and manipulation.
* `cryo.freeze()` fetches data and saves it to a file, facilitating subsequent use or long-term storage.

Explore the source files for [cryo.collect()](https://github.com/paradigmxyz/cryo/blob/main/crates/python/python/cryo/_collect.py) and [cryo.freeze()](https://github.com/paradigmxyz/cryo/blob/main/crates/python/python/cryo/_freeze.py) in the GitHub repository.

### `cryo.collect()` Main Aspects

1. **Asynchronous Support**: `cryo.collect()` includes both `async_collect` and `collect` methods, designed to operate asynchronously. This feature is vital for efficiently handling large datasets or high-throughput tasks, ensuring optimal resource utilization and performance.
2. **Multiple Output Formats**: `cryo.collect()` allows you to organize data in various Python-friendly formats for diverse scenarios:
   * **Polars DataFrame**: Ideal for high-performance data manipulation, leveraging its fast, efficient data handling capabilities.
* **Pandas DataFrame**: Provides broad compatibility with Python's extensive data analysis ecosystem. * **List of Dictionaries**: Facilitates easy handling of JSON-like data structures, simplifying serialization. * **Dictionary of Lists**: Offers an alternative structured data format suitable for specific data processing requirements. ### `cryo.freeze()` Main Aspects 1. **Data Type Flexibility**: `cryo.freeze()` can handle single and multiple data types, showcasing its versatility in accommodating various data collection needs. 2. **Argument Parsing**: Echoing `cryo.collect()`, `cryo.freeze()` also parses additional keyword arguments (`**kwargs`), enhancing the customization possibilities in data collection and storage. ### Usage examples Having grasped the basics of **`cryo_python`**, let's get into practical examples to demonstrate its usage. Throughout this guide, we'll consistently retrieve the RPC endpoint from a **`.env`** file. Ensure you have your RPC endpoint details in a `.env` file for these examples. ### `cryo.collect` basic example Start by creating a file named **`main.py`** and paste the following code: ```python Python import os import cryo from dotenv import load_dotenv # Load environment variables from .env file load_dotenv() # Retrieve the Ethereum RPC URL from environment variables eth_rpc = os.getenv("ETH_RPC") # Collect blockchain data using the cryo library and return it as a pandas DataFrame # Specifying blocks range and output format data = cryo.collect( "blocks", blocks=["18734050:18735050"], rpc=eth_rpc, output_format="pandas", hex=True ) # Displaying the column names of the DataFrame print("Columns in the DataFrame:") for column in data.columns: print(column) # Print the entire DataFrame print(data) ``` Here's an explanation of how it works and what it does: 1.
**Environment Setup**: * The code starts by importing the necessary modules: `os` for environment variable management, `cryo` for accessing blockchain data, and `load_dotenv` from the `dotenv` package to load environment variables from a `.env` file. * It then loads the environment variables using `load_dotenv()`, which reads the `.env` file and sets the variables. 2. **Accessing Ethereum RPC Endpoint**: * The `ETH_RPC` variable, which contains the URL to an Ethereum RPC endpoint, is fetched from the environment variables using `os.getenv("ETH_RPC")`. 3. **Data Collection with `cryo.collect`**: * The `cryo.collect` function is called with specific parameters to fetch data from the Ethereum blockchain. * `datatype`: Set to `"blocks"`, indicating that the function should fetch data about blockchain blocks. * `blocks`: Specifies the range of blocks to fetch data for (in this case, from block `18734050` to `18735050`). * `rpc`: The Ethereum RPC endpoint URL, passed as `eth_rpc`. * `output_format`: Set to `"pandas"`, indicating that the data should be returned as a Pandas DataFrame. * `hex`: A boolean parameter; when set to `True`, binary fields such as hashes and addresses are returned as hexadecimal strings. 4. **Output**: * The fetched data is stored in the variable `data`, a Pandas DataFrame. * The script then prints the column names of the DataFrame to provide an overview of the data structure. * Finally, it prints the DataFrame `data`, showing the fetched blockchain data. The result of this script is a detailed listing of data for the specified range of Ethereum blocks. The DataFrame columns represent each block's attributes, such as `block_hash`, `author`, `block_number`, `gas_used`, `extra_data`, `timestamp`, `base_fee_per_gas`, and `chain_id`. Here is an example of the output in the console: ```shell Shell Columns in the DataFrame: block_hash author block_number gas_used extra_data timestamp base_fee_per_gas chain_id block_hash author ...
base_fee_per_gas chain_id 0 0xdf6d5d7526eb50e68278998b2cc7a519a4c3daddb14a... 0xcdbf58a9a9b54a2c43800c50c7192946de858321 ... 37583327088 1 1 0x2818389bb471ebe60a74fb1865574c0ac50f40daf575... 0xdafea492d9c6733ae3d56b7ed1adb60692c98bc5 ... 38781853745 1 2 0xd65229e3d67c28f71b978ae789df6ee58c27420f8a35... 0x1f9090aae28b8a3dceadf281b0f12828e676c326 ... 38448400269 1 3 0x218f26e524d889d20604ad01b91feec5c2285dc3e747... 0x4675c7e5baafbffbca748158becba61ef3b0a263 ... 38316617096 1 4 0xace296b1c263fee4f35d831086c93aa820577dcc1bea... 0x388c818ca8b9251b393131c08a736a67ccb19297 ... 37638298982 1 .. ... ... ... ... ... 995 0xc59d9ad3444b0352c36f3ec7e3a3561bbc90d9118232... 0x690b9a9e9aa1c9db991c7721a92d351db4fac990 ... 71896254942 1 996 0xd611033a7769913ba0e8abdc8ae0ab0fee224435c512... 0x4838b106fce9647bdf1e7877bf73ce8b0bad5f97 ... 70283122265 1 997 0x2a4448fe72e37c169868e37ea4fce71789c31ecc9108... 0x95222290dd7278aa3ddd389cc1e1d165cc4bafe5 ... 73132855124 1 998 0x6ce066a304e334c6a98154b51f2a1b24edf749467424... 0x4838b106fce9647bdf1e7877bf73ce8b0bad5f97 ... 74400095143 1 999 0xbc293dc0e8d0a61f24256433f7faaf2a8e754a5557d9... 0x4838b106fce9647bdf1e7877bf73ce8b0bad5f97 ... 73090669589 1 [1000 rows x 8 columns] ``` Running this Python script is the equivalent of running this command from the `cryo` CLI directly: ```bash Bash cryo blocks --blocks 18734050:18735050 --rpc YOUR_CHAINSTACK_NODE ``` Please note that Chainstack endpoints on the **Developer** plan are limited to 30 RPS, so you might need to add rate limiting to your code; starting from the [**Growth** plan](https://chainstack.com/pricing/), there is no rate limit. To manage rate limits, **`cryo.collect`** can be adjusted using the `requests_per_second` parameter: ```python Python data = cryo.collect( "blocks", blocks=["18734050:18735050"], rpc=eth_rpc, output_format="pandas", hex=True, requests_per_second=25 ) ``` ### `cryo.freeze` basic example The principle of **`cryo.freeze`** is quite similar to **`cryo.collect`**.
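Before moving on, a quick back-of-the-envelope check on the rate limit mentioned above: if fetching the `blocks` dataset costs roughly one RPC request per block (our assumption, not a documented guarantee), a capped fetch takes at least `block_count / requests_per_second` seconds. A stdlib sketch:

```python
# Rough lower bound on fetch time under a rate limit, assuming
# (our assumption) about one RPC request per block for the "blocks"
# dataset. cryo block ranges are start-inclusive, end-exclusive.
def min_fetch_seconds(start_block: int, end_block: int, rps: int) -> float:
    return (end_block - start_block) / rps

# 1,000 blocks at the 30 RPS Developer-plan cap:
print(f"{min_fetch_seconds(18734050, 18735050, 30):.1f} seconds")  # 33.3 seconds
```

This lines up with the collection summary shown later in this guide, where fetching 1,000 blocks at 30 RPS took about 34 seconds.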
In a new file, paste this code: ```python Python import os from dotenv import load_dotenv import cryo # Load environment variables from the .env file load_dotenv() # Retrieve the Ethereum RPC URL from environment variables eth_rpc = os.getenv("ETH_RPC") # Fetch and save blocks data in JSON data = cryo.freeze( "blocks", blocks=["18734050:18735050"], rpc=eth_rpc, output_dir="blocks_data/", file_format="json", hex=True, requests_per_second=500 ) ``` This script uses **`cryo.freeze`** to fetch and save the same block data as a JSON file in the specified directory. The logic and syntax closely follow the **`cryo`** CLI. The result is a JSON file containing data for the blocks in the `root/blocks_data/` directory. Since both **`cryo.freeze`** and **`cryo.collect`** are just wrappers around the CLI, you can use the same commands and options. Let’s explore a few more examples. ### Fetching ERC-20 balances with `cryo` This section will guide you in using `cryo_python` to retrieve ERC-20 token balances from specified addresses and contracts. We’ll get the balance of the [APECoin](https://etherscan.io/address/0x4d224452801ACEd8B2F0aebE155379bb5D594381) token held by the [Binance](https://etherscan.io/token/0x4d224452801ACEd8B2F0aebE155379bb5D594381?a=0xf977814e90da44bfa03b6295a0616a897441acec) address over a range of 10,000 blocks.
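The script below specifies the range as `18.68M:18.69M` — cryo accepts numeric shorthand with `K` and `M` suffixes in block ranges. The following is an illustrative parser showing what the notation expands to; it is our sketch, not cryo's actual implementation (run `cryo help syntax` for the authoritative rules):

```python
from decimal import Decimal

# Expand cryo-style block shorthand such as "18.68M" or "500K" into
# integer block numbers. Illustrative only -- cryo parses these
# internally; this just shows what the notation means.
def parse_block(token: str) -> int:
    multipliers = {"K": 1_000, "M": 1_000_000}
    suffix = token[-1].upper()
    if suffix in multipliers:
        # Decimal avoids float rounding, e.g. 18.68 * 1e6 staying exact
        return int(Decimal(token[:-1]) * multipliers[suffix])
    return int(token)

def parse_range(spec: str) -> tuple[int, int]:
    start, end = spec.split(":")
    return parse_block(start), parse_block(end)

print(parse_range("18.68M:18.69M"))  # (18680000, 18690000)
```

So `18.68M:18.69M` covers the 10,000-block window used in this example.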
Start by creating a new Python file and paste the following code: ```python Python import os from dotenv import load_dotenv import cryo # Load environment variables load_dotenv() # Access Ethereum RPC URL from environment variables eth_rpc = os.getenv("ETH_RPC") # Fetch ERC-20 token balances for a specified address within a block range data = cryo.freeze( "erc20_balances", blocks=["18.68M:18.69M"], contract=['0x4d224452801ACEd8B2F0aebE155379bb5D594381'], address=['0xF977814e90dA44bFA03b6295A0616a897441aceC'], rpc=eth_rpc, output_dir="blocks_data/", file_format="json", hex=True, requests_per_second=900 ) ``` Executing this script will generate a JSON file containing the ERC-20 balance data structured as follows: ```bash Bash Schema for ERC-20 Balances ───────────────────────── - block_number: uint32 - erc20: hex - address: hex - balance_binary: binary - balance_string: string - balance_f64: float64 - chain_id: uint64 ``` The `erc20_balances` schema efficiently organizes ERC-20 balances by block, offering a clear and accessible format for data analysis. Check the [`cryo` documentation](https://github.com/paradigmxyz/cryo?tab=readme-ov-file#cryo-datasets) to find what other datasets you can fetch. ## Fetch and manipulate blockchain data Having explored the basic functionality of `cryo_python`, let's now get into a more advanced application by integrating it with essential Python libraries for data manipulation and visualization. ### Find the top 10 block authors In this example, we'll fetch Ethereum blockchain data and visualize the top block authors using `cryo_python`, `pandas`, and `matplotlib`.
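Stripped of the fetching and plotting, the heart of this example is a frequency count over the block `author` column — the same operation `value_counts()` performs in the script below. A stdlib-only illustration with made-up toy addresses:

```python
from collections import Counter

# Toy stand-in for the DataFrame's "author" column; in the real script
# this comes from cryo.collect(). The addresses here are made up.
authors = ["0xaaa", "0xbbb", "0xaaa", "0xccc", "0xaaa", "0xbbb"]

# Equivalent of data['author'].value_counts().head(2)
top_authors = Counter(authors).most_common(2)
print(top_authors)  # [('0xaaa', 3), ('0xbbb', 2)]
```

pandas does the same tally (and sorts descending) in one method call, which is why the full script leans on it.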
In a Python file, paste the following: ```python Python import os import time import pandas as pd import matplotlib.pyplot as plt from dotenv import load_dotenv from web3 import Web3 import cryo # Constants ETH_RPC_VAR = "ETH_RPC" LOOKBACK_BLOCKS = 5000 TOP_AUTHORS_COUNT = 10 # Load environment variables load_dotenv() # Function to get the block range def get_block_range(web3, lookback_blocks): latest_block = web3.eth.block_number start_block = max(0, latest_block - lookback_blocks) # Avoid negative block numbers return f"{start_block}:{latest_block}" # Function to fetch data from cryo def fetch_block_data(block_range, eth_rpc): try: # Start timer for fetching blocks fetch_start_time = time.time() # Fetch the block data data = cryo.collect( "blocks", blocks=[block_range], rpc=eth_rpc, output_format="pandas", hex=True ) # Calculate and print the time taken to fetch the blocks fetch_time = time.time() - fetch_start_time print(f"Time taken to fetch blocks: {fetch_time:.2f} seconds") return data except Exception as e: print(f"An error occurred while fetching block data: {e}") return pd.DataFrame() # Return an empty DataFrame on error # Function to plot the top authors def plot_top_authors(data, num_entries): top_authors = data['author'].value_counts().head(TOP_AUTHORS_COUNT) plt.figure(figsize=(12, 6)) top_authors.plot(kind='bar') plt.xticks(rotation=45, ha='right') plt.title(f'Top {TOP_AUTHORS_COUNT} Authors by number of blocks mined from past {num_entries} blocks', fontsize=14) plt.xlabel('Author', fontsize=14) plt.ylabel('Number of Blocks', fontsize=14) plt.tight_layout() plt.show() # Main execution def main(): eth_rpc = os.getenv(ETH_RPC_VAR) if not eth_rpc: raise ValueError(f"Environment variable {ETH_RPC_VAR} not found") w3 = Web3(Web3.HTTPProvider(eth_rpc)) if not w3.is_connected(): print("Failed to connect to Ethereum node.") return block_range = get_block_range(w3, LOOKBACK_BLOCKS) print(f"Fetching blocks from {block_range} range.") data = 
fetch_block_data(block_range, eth_rpc) if not data.empty: num_entries = len(data) print(f"Number of blocks fetched: {num_entries}") plot_top_authors(data, num_entries) if __name__ == "__main__": main() ``` Here's a step-by-step breakdown of what this script does: 1. **Setting Up the Environment**: We start by importing necessary libraries like `os`, `time`, `pandas`, `matplotlib.pyplot`, and `Web3`, along with `cryo`. Then, we define constants for the RPC URL, the number of blocks to look back on, and the number of top authors to display. 2. **Fetching Blockchain Data**: We define a function to determine the range of blocks to fetch based on the current block number. Another function uses `cryo.collect` to get data on these blocks and returns it as a pandas DataFrame. We track the time taken for this operation, offering insights into the performance of our data retrieval process. 3. **Data Visualization**: With the blockchain data in hand, we analyze the top block authors using a function that counts the occurrences of each author in the data. We then use `matplotlib` to create a bar chart, showcasing the top authors based on the number of blocks mined. 4. **Executing the Script**: In the `main` function, we initialize a Web3 instance, connect to the Ethereum node, fetch the block data, and, if successful, visualize the top authors. We handle potential errors, such as missing environment variables or connection issues, to ensure robustness. 5. **Running the Code**: This script is designed as a standalone program. When executed, it will display a bar chart illustrating the most active Ethereum block authors over a specified block range. This example demonstrates how to effectively combine `cryo` with other Python tools to fetch, process, and visualize Ethereum blockchain data, providing valuable insights into blockchain activity. Here is an example of the console output and chart. 
The console will output something like the following: ```bash Bash Fetching blocks from 18874064:18879064 range. Time taken to fetch blocks: 30.83 seconds Number of blocks fetched: 5000 Top Authors by Number of Blocks Mined: 0x95222290dd7278aa3ddd389cc1e1d165cc4bafe5: 1417 blocks 0x1f9090aae28b8a3dceadf281b0f12828e676c326: 1385 blocks 0x4838b106fce9647bdf1e7877bf73ce8b0bad5f97: 443 blocks 0x388c818ca8b9251b393131c08a736a67ccb19297: 295 blocks 0xb9342d6a9789cc6479e48cfef67590c1bd05744e: 213 blocks 0x88c6c46ebf353a52bdbab708c23d0c81daa8134a: 183 blocks 0xdafea492d9c6733ae3d56b7ed1adb60692c98bc5: 175 blocks 0x0aa8ebb6ad5a8e499e550ae2c461197624c6e667: 89 blocks 0x4675c7e5baafbffbca748158becba61ef3b0a263: 55 blocks 0x690b9a9e9aa1c9db991c7721a92d351db4fac990: 52 blocks ``` And the chart will look like this: ### Visualize ERC-20 balance changes over time The next example will use the same `erc20_balances` dataset from one of the previous examples. This time, we’ll fetch and visualize how much [WETH](https://etherscan.io/token/0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2) is in the [`WETH-USDT` pool](https://etherscan.io/address/0x0d4a11d5EEaaC28EC3F61d100daF4d40471f1852) from Uniswap V2.
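The balances in this dataset are denominated in Wei; the script below converts them with web3's `Web3.from_wei`, which boils down to dividing by 10^18. A stdlib equivalent, for reference:

```python
from decimal import Decimal

WEI_PER_ETHER = 10**18

def wei_to_ether(balance_wei: int) -> Decimal:
    # Same idea as Web3.from_wei(value, 'ether'): divide by 10^18,
    # using Decimal to preserve full precision.
    return Decimal(balance_wei) / WEI_PER_ETHER

print(wei_to_ether(27034167858289615425314))  # 27034.167858289615425314
```

Using `Decimal` (as web3 does) rather than `float` matters here: 18-decimal token balances overflow the 15-16 significant digits a `float` can hold.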
In a new file, paste the following code: ```python Python import os import time from dotenv import load_dotenv import cryo import pandas as pd import matplotlib.pyplot as plt import matplotlib.ticker as ticker from web3 import Web3 # Constants ETH_RPC_VAR = "ETH_RPC" LOOKBACK_BLOCKS = 7200 # Approx a day in the past CONTRACT_ADDRESS = '0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2' # WETH WALLET_ADDRESS = '0x0d4a11d5EEaaC28EC3F61d100daF4d40471f1852' # WETH-USDT pool Uniswap V2 # Initialize environment variables and Web3 load_dotenv() eth_rpc = os.getenv(ETH_RPC_VAR) w3 = Web3(Web3.HTTPProvider(eth_rpc)) def check_eth_rpc_connection(eth_rpc): """Check connection to Ethereum RPC.""" if not eth_rpc: raise ValueError(f"Environment variable {ETH_RPC_VAR} not found") if not w3.is_connected(): raise ConnectionError("Failed to connect to Ethereum node.") def get_block_range(lookback_blocks): """Determine the range of blocks to fetch.""" latest_block = w3.eth.block_number start_block = max(0, latest_block - lookback_blocks) return f"{start_block}:{latest_block}" def fetch_erc20_balances(block_range): """Fetch ERC-20 token balances within a given block range.""" return cryo.collect( "erc20_balances", blocks=[block_range], contract=[CONTRACT_ADDRESS], address=[WALLET_ADDRESS], rpc=eth_rpc, output_format="pandas", hex=True, requests_per_second=900 # Adapt the RPS to your endpoint ) def convert_balance_to_ether(balance_str): """Convert balance from Wei to Ether, handling None values.""" return None if balance_str is None else Web3.from_wei(int(balance_str), 'ether') def plot_balance_change_over_time(data): """Plot the balance change over time on a chart.""" plt.figure(figsize=(12, 6)) plt.plot(data['block_number'], data['balance_ether'], marker='o') # Set axis labels and chart title with contract and address plt.xlabel("Block Number") plt.ylabel("Balance (Ether)") plt.title(f"ERC-20 Token Balance Change for {CONTRACT_ADDRESS}\nWallet {WALLET_ADDRESS}") # Manually set the x-axis 
ticks based on the block range block_numbers = data['block_number'] tick_spacing = (block_numbers.max() - block_numbers.min()) // 10 # for example, 10 evenly spaced ticks ticks = range(int(block_numbers.min()), int(block_numbers.max()), int(tick_spacing)) plt.xticks(ticks, [f"{tick:,.0f}" for tick in ticks]) # Format the y-axis to show balances rounded to 4 decimal places plt.gca().yaxis.set_major_formatter(ticker.StrMethodFormatter('{x:,.4f}')) # Add grid, tighten layout, and display the plot plt.grid(True) plt.tight_layout() plt.show() def main(): """Main function to fetch and plot ERC-20 token balance changes.""" check_eth_rpc_connection(eth_rpc) block_range = get_block_range(LOOKBACK_BLOCKS) # Start timing the data fetch start_time = time.time() # Fetch the data data = fetch_erc20_balances(block_range) # End timing the data fetch end_time = time.time() elapsed_time = end_time - start_time print(f"Data fetched in {elapsed_time:.2f} seconds.") if data.empty: print("No data available for plotting.") return # Prepare data for plotting data = data[['block_number', 'erc20', 'address', 'balance_string']].copy() data['balance_ether'] = data['balance_string'].apply(convert_balance_to_ether) data = data[data['balance_ether'].notnull()] # Filter out rows with None values # Print data summary to the console print("\nData summary:") print(f"Block range: {block_range}") print(f"Start balance in Ether: {data.iloc[0]['balance_ether']}") print(f"End balance in Ether: {data.iloc[-1]['balance_ether']}") # Plot the balance changes over time plot_balance_change_over_time(data) if __name__ == "__main__": main() ``` Here's a step-by-step explanation of what’s going on: 1. **Fetch Block Range**: * It calculates the range of blocks to query by finding the latest block number and subtracting the lookback period to determine the start block; in the example, we analyze about a day's worth of blocks. 2.
**Fetch ERC-20 Balances**: * The script fetches ERC-20 token balance data from the specified contract, wallet address, and block range. The `cryo.collect` function is called, and the data is returned in a pandas DataFrame format. 3. **Data Conversion and Cleaning**: * A conversion function transforms balance values from Wei (the smallest unit of Ether) to Ether for readability. It handles any `None` values to avoid errors during conversion. 4. **Summarizing Data**: * The script prints out a summary of the data to the console, including the block range and the start and end balances in Ether, providing a quick overview of the dataset. 5. **Data Visualization**: * It then plots the balance changes over time using `matplotlib`. The x-axis represents block numbers, and the y-axis represents the balance in Ether. * The axis tick labels are formatted for better readability, and the chart is titled with the contract and wallet address for reference. Remember to adapt the requests per second to your endpoint's rate limit. Here is an example of the result: ```shell Shell Data fetched in 16.51 seconds. Data summary: Block range: 18872292:18879492 Start balance in Ether: 27034.167858289615425314 End balance in Ether: 27039.186876795597977365 ``` Graph for the balance change over a day: As you can see, we can use `cryo` to fetch data and manipulate it with Python, a very powerful combo. ## Conclusion The integration of `cryo` with Python is a significant advancement for blockchain data analysis. It combines `cryo`'s efficient data extraction capabilities with Python's powerful data processing and visualization tools. This synergy, coupled with high-performance Chainstack Global Nodes, enables users to easily extract, analyze, and visualize blockchain data, making it an invaluable resource for developers, researchers, and enthusiasts in the blockchain community.
The practical examples demonstrate this integration's real-world utility, highlighting its potential to yield insightful and actionable information from complex blockchain datasets. In essence, `cryo` and Python offer an effective and accessible platform for in-depth blockchain data exploration. ### About the author Developer Advocate @ Chainstack BUIDLs on EVM, The Graph protocol, and Starknet Helping people understand Web3 and blockchain development [](https://github.com/soos3d) [](https://twitter.com/web3Dav3) [](https://www.linkedin.com/in/davide-zambiasi/) # cryo: Your gateway to blockchain data Source: https://docs.chainstack.com/docs/cryo-your-gateway-to-blockchain-data **TLDR** * **cryo** is a Rust-based command-line tool for extracting blockchain data (blocks, transactions, logs) into formats like JSON or Parquet. * You can flexibly specify block ranges, columns, or events to filter exactly the data you want. * Pairing cryo with a high-performance Chainstack global node unlocks near-unlimited RPS for lightning-fast data retrieval and analyses. * Prebuilt event scrapers for ERC-20/721 Transfers or custom logs provide a seamless research and development experience. ## Main article Cryo, by [Paradigm](https://www.paradigm.xyz/), is a tool that is as cool as its name suggests. If you're venturing into blockchain data, whether you're a researcher, developer, or just a curious explorer, `cryo` is about to become your new best friend. This guide is designed to walk you through what `cryo` is, how it works, and how to harness its power to fetch blockchain data quickly. At its core, `cryo` is a command-line interface (CLI) tool, but don't let the simplicity of its interface fool you. This tool packs a powerful punch, offering an easy and flexible way to extract blockchain data into various user-friendly formats. Whether you need your data in Parquet, CSV, JSON, or piped directly into a Python data frame, `cryo` has got you covered.
So, whether you're planning to build complex applications, conduct in-depth research, or satisfy your curiosity about blockchain operations, `cryo` will make the process simple and fast. Today, let's learn how to use `cryo` coupled with a high-performance Chainstack [Global Node](/docs/global-elastic-node). Here you can find the [cryo repository](https://github.com/paradigmxyz/cryo). ## Understanding how `cryo` works Understanding the magic under the hood is always important to utilize a tool's full potential. This section will give you a glimpse into the inner workings of `cryo`, explaining its data extraction process, how it handles data schemas and formatting, and the range of blockchain networks it supports. ### Data extraction process `cryo`'s primary interface is the JSON-RPC protocol, a widely used standard that allows for communication with a blockchain node. When you run a `cryo` command, it sends out JSON-RPC requests to a blockchain node. These requests ask for specific pieces of data like blocks, transactions, or logs. The node responds with raw data, which `cryo` then meticulously processes. It's not just about fetching data; `cryo` transforms this data into structured and readable formats like CSV, JSON, or Parquet files. This transformation makes it incredibly straightforward to use this data in various applications or analyses. ### Efficiency powered by Rust The speed and efficiency of `cryo` are standout features, largely attributed to its development in Rust. Renowned for its performance and memory safety, Rust enables `cryo` to handle blockchain data with exceptional speed and efficiency. This results in rapid data processing, even when dealing with the large datasets typical in blockchain networks. Rust's prowess in concurrency further amplifies `cryo`'s ability to manage multiple data extraction tasks simultaneously, ensuring swift and smooth operation.
In short, `cryo` leverages Rust's strengths to offer a fast, reliable, and efficient data extraction experience. ### Supported chains The blockchain world is vast and diverse, and `cryo` is built to navigate this diversity. It's compatible with various blockchain networks, making it a versatile tool for users interested in different ecosystems. Primarily, `cryo` is compatible with Ethereum and supports EVM-based networks. This wide range of compatibility is possible because `cryo` utilizes [ethers.rs](http://ethers.rs/) for JSON-RPC requests, allowing it to interact with any chain compatible with ethers-rs. This versatility makes `cryo` a valuable asset, whether you're getting into the bustling world of Ethereum or exploring the unique landscapes of its various Layer 2 solutions and sidechains. ## Installation and setup Getting `cryo` up and running involves a few straightforward steps. This section will guide you through the prerequisites, the installation process, and setting up essential environment variables. We'll also set up a Chainstack global endpoint. ### Prerequisites Before installing `cryo`, ensure that your system meets the following requirements: 1. **Rust**: `cryo` is built in Rust, so you must install Rust on your machine. If you haven't installed Rust, you can do so via [rustup](https://www.rust-lang.org/tools/install), the recommended way to install the Rust programming language. ### Installation steps `cryo` can be installed either directly from the source or via `crates.io`. Here's how you can do it: **Method 1: Install from source** 1. Clone the `cryo` repository: ``` git clone https://github.com/paradigmxyz/cryo.git ``` 2. Navigate to the `cryo` directory: ``` cd cryo ``` 3. Install `cryo` using Cargo: ``` cargo install --path ./crates/cli ``` **Method 2: Install from [crates.io](http://crates.io/)** 1. Run the install command: ``` cargo install cryo_cli ``` Installing from source has been the most reliable method so far.
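After either method, the `cryo` binary should end up on your `PATH` (Cargo installs to `~/.cargo/bin` by default). A quick way to verify from Python — prints the resolved path, or a hint if the binary isn't found:

```python
import shutil

# shutil.which searches PATH the way a shell would and returns the
# executable's full path, or None if it isn't found.
cryo_path = shutil.which("cryo")
print(cryo_path or "cryo not found -- make sure ~/.cargo/bin is in your PATH")
```

Equivalently, running `cryo help` in a fresh terminal confirms the install worked.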
## Basic usage of `cryo` Diving into `cryo` begins with understanding its basic commands, its help system, and the variety of data types it can extract. This section will cover the foundational aspects of using `cryo`, including the essential commands and options that make it a versatile tool for blockchain data extraction. ### Basic commands `cryo` offers several commands to help you navigate its functionalities: 1. **`cryo help`**: This is your go-to command for any assistance. It provides an overview of all available commands and options in `cryo`. Whenever in doubt, just type `cryo help` in your terminal. 2. **`cryo help syntax`**: Blockchain data queries can sometimes get complex. The `cryo help syntax` command is designed to help you understand how to effectively specify block ranges, transaction hashes, and other query parameters. 3. **`cryo help datasets`**: This command displays all the available datasets that `cryo` can extract. Datasets include `blocks`, `transactions`, `logs`, and many others, each serving a specific type of data extraction. 4. **`cryo help DATASET(S)`**: Use this command for detailed information about a specific dataset. It helps you understand the nuances of each dataset, what data it includes, and how it can be used. For instance, if you want to know more about the `logs` dataset, you should use `cryo help logs`. ### Data types and options `cryo` can extract various types of blockchain data, each with its own set of applicable options: * **Logs**: Extracts event logs from the blockchain. Useful for tracking events emitted by smart contracts. * **Blocks**: Retrieves block data. This is essential for analyses that require details like block data, block time, and transactions within a block. * **Transactions**: Fetches transaction data, crucial for examining transaction flows, gas prices, and contract interactions.
To further refine your data extraction, `cryo` provides a range of options: * **`--include-columns`**: Specify which columns to include in your output. For instance, if you're only interested in certain aspects of a transaction, like gas prices and transaction hashes, this option allows you to focus on just those columns. * **`--exclude-columns`**: Conversely, if there are columns you want to omit from your output, this option lets you exclude them, streamlining your dataset. * **`--blocks`**: A crucial option for specifying the range of blocks you are interested in. You can define a single block, a range, or multiple ranges. * **`--contract`**: This option lets you specify a particular contract address when dealing with log-related data. `cryo` also includes various other options for output format (`--csv`, `--json`), sorting (`--sort`), and filtering based on transaction parameters. Combining these data types and options gives you a powerful toolkit to customize your data extraction precisely to your needs. ## Using `cryo` with a custom RPC To use `cryo`, you'll need a custom RPC (Remote Procedure Call) endpoint. This section will explain what a custom RPC endpoint is and how to use it with `cryo`. ### Chainstack RPC nodes An RPC endpoint in the context of blockchain is a server interface that allows you to interact with the blockchain network. It's like a gateway through which you send requests (like fetching data) and receive responses. So, how do you choose an RPC? Cryo is a high-performing tool that can send many requests per second, and the Chainstack global node is ideal for it. ### Chainstack global nodes Chainstack global nodes are geo-load-balanced nodes enabling intelligent routing of requests to the nearest server, reducing latency and delivering maximum performance. By proactively monitoring node status, global nodes adapt to network conditions in real time, providing instant failover to another node during network interruptions on a global scale.
### Get a global node Follow these steps to deploy an Ethereum node: [Sign up with Chainstack](https://console.chainstack.com/user/account/create). [Deploy a node](/docs/manage-your-networks#join-a-public-network). [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials). To follow this guide, deploy a global Ethereum node. Once you deploy the node, you'll have access to an RPC endpoint, which will look like this: ``` https://ethereum-mainnet.core.chainstack.com/YOUR_CREDENTIALS ``` Now, we are ready to start fetching some blockchain data. ### Use a custom RPC with `cryo` To use a custom RPC endpoint with `cryo`, you can use the `--rpc` flag followed by the URL of your RPC endpoint or add it as an environment variable. Here's how to do it: * In your `cryo` command, add the `--rpc` flag followed by your custom RPC URL. For example: ``` cryo DATASET --rpc YOUR_RPC_URL ``` * If you want to use it as an environment variable, export it as a variable named `ETH_RPC_URL` by running this in the console: ``` export ETH_RPC_URL=https://ethereum-mainnet.core.chainstack.com/YOUR_CREDENTIALS ``` If you add the endpoint as an environment variable, you do not need to add the `--rpc` flag when running a command. ## Fetch some data At this point, you are ready to get your hands dirty with blockchain data; let's explore a few kinds of datasets you can get from `cryo` and how you can fine-tune the requests. ### Extract basic block information Block information is a fundamental kind of data needed to analyze the blockchain's state at specific times, which is essential for historical analysis, auditing, and verifying transaction integrity. It is also somewhat resource-intensive, especially with a wide block range. The basic block data command syntax is: ``` cryo blocks --blocks START_BLOCK:END_BLOCK ``` Let's explore some `cryo` commands to work with blocks.
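One detail worth knowing about the `START_BLOCK:END_BLOCK` syntax: cryo treats the range as start-inclusive and end-exclusive — the dry run below requests `18734075:18735075` and reports exactly 1,000 blocks, with `max=18,735,074`. A one-liner to size a range before fetching it:

```python
# cryo ranges are start-inclusive, end-exclusive, so the number of
# blocks a fetch will touch is simply end - start.
def blocks_in_range(spec: str) -> int:
    start, end = (int(part) for part in spec.split(":"))
    return end - start

print(blocks_in_range("18734075:18735075"))  # 1000
```

Knowing the block count up front helps when estimating runtime against a rate-limited endpoint.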
### Dry command The `--dry` flag in `cryo` is useful for previewing the structure and content of the data you plan to extract without actually executing the data extraction. Using the `--dry` flag with `cryo` provides a snapshot of the parameters, source details, output configuration, and the data schema for the requested dataset. This feature is highly beneficial for confirming the data fields and format before running a full data extraction process. It helps in understanding the range of data (like block numbers and types of data points), source information (like network and RPC URL), and how the data will be output (such as the file format and chunk size). This preemptive insight allows users to adjust their query parameters and output settings as needed, ensuring they get the data they need in the desired format. This is especially valuable for large-scale data operations where efficiency and precision are critical. Run it with: ``` cryo blocks --blocks 18734075:18735075 --dry ``` The output will be similar to the following: ``` cryo parameters ─────────────── - version: 0.2.0-183-gebfc97b - data: - datatypes: blocks - blocks: n=1,000 min=18,734,075 max=18,735,074 align=no reorg_buffer=0 - source: - network: ethereum - rpc url: - max requests per second: unlimited - max concurrent requests: unlimited - max concurrent chunks: 4 - output: - chunk size: 1,000 - chunks to collect: 1 / 1 - output format: parquet - output dir: /PATH - report file: $OUTPUT_DIR/.cryo/reports/2023-12-07_12-12-44.189964.json schema for blocks ───────────────── - block_number: uint32 - block_hash: binary - timestamp: uint32 - author: binary - gas_used: uint64 - extra_data: binary - base_fee_per_gas: uint64 - chain_id: uint64 sorting blocks by: block_number other available columns: parent_hash, state_root, transactions_root, receipts_root, logs_bloom, total_difficulty, size [dry run, exiting] ``` Let's briefly analyze the response and see what customizations we can make. 1.
**Version and Data Types**: It lists the version of cryo being used and the data types selected for extraction (e.g., blocks). This is useful for ensuring you're working with the correct version and data set.
2. **Data Details**: It specifies the range of data to be extracted (e.g., block numbers and their range), which helps verify that you're targeting the correct segment of the blockchain.
3. **Source Information**: Shows the network (e.g., Ethereum), RPC URL, and rate limits. This confirms the blockchain source and helps manage resource usage, which is particularly important when dealing with rate limits and ensuring network compatibility. For example, free Chainstack endpoints are rate-limited at 30 requests per second.
4. **Output Configuration**: Details about how the data will be output, including format (e.g., Parquet), directory path, chunk size, and the number of chunks. This is crucial for understanding how the data will be organized and stored, allowing for proper data storage and management planning.
5. **Data Schema**: Lists the columns included in the output (e.g., block\_number, block\_hash, timestamp). This is essential for understanding the structure of the extracted data, enabling users to anticipate the kind of information they will receive and how it can be utilized in their analysis or application.
6. **Sorting and Additional Columns**: Information on how the data will be sorted and other available columns that were not included in the extraction but are available for use. This can be useful for refining future data extraction queries.

### Customize a fetch command

From the dry run above, we can see a few details:

* The command will fetch 1,000 blocks, which is the range we want.
* There is no rate limit, and concurrent requests are unlimited.
* The output will be a [Parquet file](https://www.databricks.com/glossary/what-is-parquet) with 1,000 entries.
* `schema for blocks` displays what kind of data will be extracted.
We can use a few extra flags to customize this:

```
cryo blocks \
  --blocks 18734075:18735075 \
  --json \
  --requests-per-second 30 \
  --columns block_number block_hash timestamp chain_id \
  --dry
```

This command will limit the RPS to 30, return a JSON file, and only keep the block number, block hash, timestamp, and chain ID. Running it will return the following:

```
cryo parameters
───────────────
- version: 0.2.0-183-gebfc97b
- data:
    - datatypes: blocks
    - blocks: n=1,000 min=18,734,075 max=18,735,074 align=no reorg_buffer=0
- source:
    - network: ethereum
    - rpc url:
    - max requests per second: 30
    - max concurrent requests: 30
    - max concurrent chunks: 4
- output:
    - chunk size: 1,000
    - chunks to collect: 1 / 1
    - output format: json
    - output dir: /PATH
    - report file: $OUTPUT_DIR/.cryo/reports/2023-12-07_13-01-04.300002.json

schema for blocks
─────────────────
- block_number: uint32
- block_hash: hex
- timestamp: uint32
- chain_id: uint64

sorting blocks by: block_number

other available columns: parent_hash, author, state_root, transactions_root, receipts_root, gas_used, extra_data, logs_bloom, total_difficulty, size, base_fee_per_gas

[dry run, exiting]
```

Once you verify that's the data you need, run the same command without the `--dry` flag to actually send the requests. You'll get an output similar to this:

```
collecting data
───────────────
started at 2023-12-07 13:03:52.669
done at 2023-12-07 13:04:26.789

collection summary
──────────────────
- total duration: 34.120 seconds
- total chunks: 1
    - chunks errored: 0 / 1 (0.0%)
    - chunks skipped: 0 / 1 (0.0%)
    - chunks collected: 1 / 1 (100.0%)
- blocks collected: 1,000
- blocks per second: 29.3
- blocks per minute: 1,758.5
- blocks per hour: 105,509.5
- blocks per day: 2,532,227.9
```

As you can see, it gives us the details of the operation, and you'll find a JSON file in your output path with the following structure:

```Json JSON
[
  {
    "block_hash": "0x0294b4c63bcc7d721ecafe323a38e391efc13698117851f2eef969bbc2267874",
"block_number": 18734075, "timestamp": 1701948011, "chain_id": 1 }, { "block_hash": "0xcf9d0f402eaaa4e6082145e3ee932a1749f6d6d75a3885e7c59098d259b1b544", "block_number": 18734076, "timestamp": 1701948023, "chain_id": 1 }, { "block_hash": "0xae2b4c8e6c5e0e8c0e962d935f77d28350fed20d8fb63ec811d6d70aab6e2224", "block_number": 18734077, "timestamp": 1701948035, "chain_id": 1 }, ] ``` Because of the rate limit, it took 34 seconds to complete the request; the good news is that Chainstack does not have any rate limit starting from a [paid plan](https://chainstack.com/pricing/), which is recommended if you need really high-performance nodes. The following is the result of the same request using a premium Chainstack [Global Node](/docs/global-elastic-node) with unlimited RPS: ``` - source: - network: ethereum - rpc url: https://ethereum-mainnet.core.chainstack.com/ - max requests per second: unlimited - max concurrent requests: unlimited - max concurrent chunks: 4 collecting data ─────────────── started at 2023-12-07 13:09:24.810 done at 2023-12-07 13:09:28.437 collection summary ────────────────── - total duration: 3.627 seconds - total chunks: 1 - chunks errored: 0 / 1 (0.0%) - chunks skipped: 0 / 1 (0.0%) - chunks collected: 1 / 1 (100.0%) - blocks collected: 1,000 - blocks per second: 275.7 - blocks per minute: 16,540.6 - blocks per hour: 992,437.6 - blocks per day: 23,818,503.0 ``` As you can see, this is an enormous improvement, almost 90% faster, and this is the power of this Rust-based tool. This is the Gist of this powerful tool, and you can follow the same principle for the other datasets, but we'll explore a couple more. ### Extracting event logs with `cryo` Event logs on the blockchain offer invaluable insights into contract interactions and transactions and are one of the most sought-after data types. `cryo` simplifies the extraction of these logs, offering two primary approaches: 1. Utilizing pre-configured event scrapers for standard events. 2. 
Fetching custom event logs tailored to specific requirements.

### Using pre-configured event scrapers

`cryo` comes with built-in capabilities to extract standard event logs like ERC-20 and ERC-721 `Transfer` events. This functionality is accessible without needing intricate parameters, significantly streamlining the process. For example, extracting ERC-20 `Transfer` events can be achieved with a straightforward command:

```
cryo erc20_transfers --blocks latest --json
```

This command will retrieve the `Transfer` events from the latest block in JSON format. You can specify additional parameters for more targeted data retrieval, such as a specific token address or a range of blocks. For instance, to extract `Transfer` events of the APE token over a range of 500 blocks, the command would be:

```
cryo erc20_transfers \
  --address 0x4d224452801ACEd8B2F0aebE155379bb5D594381 \
  --blocks 18735627:18736127 \
  --json
```

The output is structured as a JSON file like the following:

```Json JSON
[
  {
    "block_number": 18735629,
    "transaction_index": 127,
    "log_index": 271,
    "transaction_hash": "0x652dd336bdcac90f521ebce7f788ac4179db5d736d246d9b4fa6c29ecd911731",
    "erc20": "0x4d224452801aced8b2f0aebe155379bb5d594381",
    "from_address": "0x5f65f7b609678448494de4c87521cdf6cef1e932",
    "to_address": "0xc469b4efd8566f8774437795ece23851a325f661",
    "value_binary": "0x000000000000000000000000000000000000000000000012574733bb4f6eb2dd",
    "value_string": "338330445611002671837",
    "value_f64": 3.383304456110027e20,
    "chain_id": 1
  },
  {
    "block_number": 18735631,
    "transaction_index": 157,
    "log_index": 258,
    "transaction_hash": "0xa554db46136e23b11cd859ac4d6879e800f57979e847a3d9753e41b8400c5954",
    "erc20": "0x4d224452801aced8b2f0aebe155379bb5d594381",
    "from_address": "0x21a31ee1afc51d94c2efccaa2092ad1028285549",
    "to_address": "0x648e9390d7dbf9ee00e606b0a62e77c58f767a40",
    "value_binary": "0x0000000000000000000000000000000000000000000003c1fbde01786f130000",
    "value_string": "17745470000000000000000",
    "value_f64":
1.774547e22, "chain_id": 1 }, ] ``` You can do the same for ERC-721 `Transfers`. ### Extracting custom event logs `cryo` also allows users to extract custom event logs. This feature is useful for analyzing non-standard events or those specific to a particular smart contract. For example, to fetch custom events, you can use the `logs` dataset command with specific topics or event signatures. To extract the same `Transfer` events as above using a custom approach, the command would be: ``` cryo logs \ --blocks 18735627:18736127 \ --contract 0x4d224452801ACEd8B2F0aebE155379bb5D594381 \ --topic0 0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef \ --json ``` Alternatively, you can directly use the event signature: ``` cryo logs \ --blocks 18735627:18736127 \ --contract 0x4d224452801ACEd8B2F0aebE155379bb5D594381 \ --event-signature "Transfer(address indexed from, address indexed to, uint256 value)" \ --json ``` Consider extracting data about new token pairs created on [SushiSwap V2](https://etherscan.io/address/0xc0aee478e3658e2610c5f7a4a2e1777ce9e4f2ac#code) for a more complex example. The following command accomplishes this for a specific block: ``` cryo logs \ --blocks 18687074 \ --contract 0xC0AEe478e3658e2610c5F7A4A2E1777cE9e4f2Ac \ --event-signature "PairCreated(address indexed token0, address indexed token1, address pair, uint)" \ --json ``` The extracted transactions are sorted by block number and log index by default. For more examples and custom event extraction scenarios, refer to the [`cryo` documentation](https://github.com/paradigmxyz/cryo/blob/main/examples/uniswap.sh). ### Beyond data collection The versatility of `cryo` extends beyond event logs, encompassing various data types inherent to blockchain technology. Each dataset can be customized and extracted based on user-specific requirements, enabling various analysis and research possibilities. 
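The extracted logs are also easy to sanity-check offline. A minimal sketch in plain Python (the record is copied from the ERC-20 transfer output shown earlier) confirming that the three value encodings `cryo` emits (`value_binary`, `value_string`, and `value_f64`) agree with each other:

```python
# Cross-check the three value encodings in one cryo erc20_transfers record.
# The record is copied from the sample APE transfer output above.
record = {
    "value_binary": "0x000000000000000000000000000000000000000000000012574733bb4f6eb2dd",
    "value_string": "338330445611002671837",
    "value_f64": 3.383304456110027e20,
}

exact = int(record["value_binary"], 16)      # decode the big-endian hex value to an integer
assert exact == int(record["value_string"])  # the exact decimal string matches
assert abs(record["value_f64"] - exact) / exact < 1e-9  # the float is a close approximation

# APE uses 18 decimals, so the human-readable amount is about 338.33 APE
print(exact / 10**18)
```

The same pattern extends to any post-extraction validation you want to run before loading the data into a warehouse.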
Post extraction, the data can be seamlessly integrated into different frameworks or tools for further processing and analysis, offering a comprehensive solution for blockchain data retrieval.

## Conclusion

In this guide, we've journeyed through the remarkable capabilities of `cryo`, a tool that stands out in blockchain data extraction. From its intuitive command-line interface to its powerful Rust-based engine, `cryo` offers a seamless and efficient way to access blockchain data. Whether you're getting into complex application development, embarking on in-depth research, or simply exploring the blockchain landscape, `cryo` proves to be an indispensable ally. Its ability to interact with various blockchain networks and its flexibility in data formatting and extraction ensure that your blockchain data needs are met with precision and ease.

Integrating `cryo` with high-performance RPC endpoints like Chainstack global nodes further elevates its efficiency, providing lightning-fast data retrieval and enhanced reliability. This synergy enables you to harness the full potential of blockchain data, unlocking insights and opportunities that were previously challenging to access.

### About the author

Developer Advocate @ Chainstack
BUIDLs on EVM, The Graph protocol, and Starknet
Helping people understand Web3 and blockchain development
[](https://github.com/soos3d) [](https://twitter.com/web3Dav3) [](https://www.linkedin.com/in/davide-zambiasi/)

# Debug & Trace APIs

Source: https://docs.chainstack.com/docs/debug-and-trace-apis

Debug and trace APIs allow Web3 developers to monitor, trace, and debug the activities that occur within blockchain networks.

## Overview

They provide detailed information about transactions, blocks, and smart contract execution, making it easier to identify and resolve issues within decentralized applications.
By using debug and trace APIs, developers can build more reliable and secure DApps with improved user experience and thus contribute to the overall health of a network.

### Historical states limitations

To debug and trace transactions, you need to have historical states on the node. Full nodes of different protocols can [store different amounts of historical states](/docs/protocols-modes-and-types#modes). At the same time, archive nodes keep historical states for the *entire chain*.

## Protocols with debug and trace APIs

### Ethereum

To enable debug and trace APIs on your Ethereum node, you must have a [paid plan](https://chainstack.com/pricing/) and deploy the node as a [global node](/docs/global-elastic-node).

#### Dedicated nodes

To enable debug and trace APIs on your dedicated Ethereum node, you must have a [paid plan](https://chainstack.com/pricing/). Dedicated nodes can be deployed with either the Geth or Erigon client implementation for the full and archive modes respectively; however, they differ slightly from each other:

* With a node running on Geth, only the `debug_*` namespace is exposed.
* With a node running on Erigon, both the `debug_*` and `trace_*` namespaces are exposed.

Learn how to [deploy your dedicated Ethereum node on different clients](/reference/enable-debug-trace-apis-for-your-ethereum-node#dedicated-ethereum-node) with debug and trace APIs enabled.

For the full list of the available debug and trace API methods, see:

### Polygon

To enable debug and trace APIs on your Polygon node, you must have a [paid plan](https://chainstack.com/pricing/) and deploy the node as a [global node](/docs/global-elastic-node).

#### Dedicated nodes

To enable debug and trace APIs on your dedicated Polygon node, you must have a [paid plan](https://chainstack.com/pricing/).
Dedicated nodes can be deployed with either the Bor or Erigon client implementation for the full and archive modes respectively; however, they differ slightly from each other:

* With a node running on Bor, only the `debug_*` namespace is exposed.
* With a node running on Erigon, both the `debug_*` and `trace_*` namespaces are exposed.

For the full list of the available debug and trace API methods, see: [Erigon: RPC implementation status](https://github.com/ledgerwatch/erigon/blob/stable/cmd/rpcdaemon/README.md#rpc-implementation-status)

### BNB Smart Chain

To enable debug and trace APIs on your BNB Smart Chain node, you must have a [paid plan](https://chainstack.com/pricing/) and deploy the node as a [global node](/docs/global-elastic-node).

#### Dedicated nodes

To enable debug and trace APIs on your dedicated BNB Smart Chain node, you must have a [paid plan](https://chainstack.com/pricing/). Dedicated nodes can be deployed with either the Geth or Erigon client implementation for the full and archive modes respectively; however, they differ slightly from each other:

* With a node running on Geth, only the `debug_*` namespace is exposed.
* With a node running on Erigon, both the `debug_*` and `trace_*` namespaces are exposed.

For the full list of the available debug and trace API methods, see:

### Base

To enable debug and trace APIs on your Base node, you must have a [paid plan](https://chainstack.com/pricing/) and deploy the node as a [global node](/docs/global-elastic-node).

### Avalanche

To enable debug and trace APIs on your Avalanche node, you must have a [paid plan](https://chainstack.com/pricing/) and deploy the node in the [archive mode](/docs/protocols-modes-and-types#modes). Your node will run the AvalancheGo client, which is the Go language implementation of an Avalanche node. Avalanche offers an API interface identical to [Geth's API](https://geth.ethereum.org/docs/rpc/server), but with a limited set of services that include `debug_trace*`.
For a full list of the Geth debug API methods, see the [Debug namespace](https://geth.ethereum.org/docs/interacting-with-geth/rpc/ns-debug) section of the Geth documentation. You can deploy a dedicated Avalanche node starting from a [paid plan](https://chainstack.com/pricing/).

### Arbitrum

To enable debug and trace APIs on your Arbitrum node, you must have a [paid plan](https://chainstack.com/pricing/) and deploy the node as a [global node](/docs/global-elastic-node).

#### Dedicated nodes

To enable debug and trace APIs on your dedicated Arbitrum node, you must have a [paid plan](https://chainstack.com/pricing/).

#### Available methods

Both global and dedicated Arbitrum nodes with debug and trace APIs enabled will expose the `debug_*` and `arbtrace_*` methods. Blocks older than block 22,207,815 were added to the chain before the [Nitro migration](https://developer.arbitrum.io/migration/dapp_migration) and cannot be queried with Geth methods. Starting from block 22,207,815, Arbitrum migrated to Nitro, which made the Geth `debug_*` methods available for newer blocks.

Use the following methods for calling on blocks prior to 22,207,815:

* `arbtrace_call`
* `arbtrace_callMany`
* `arbtrace_replayBlockTransactions`
* `arbtrace_replayTransaction`
* `arbtrace_transaction`
* `arbtrace_get`
* `arbtrace_block`
* `arbtrace_filter`

For the full list of the available `debug_*` namespace methods, see the [Geth documentation](https://geth.ethereum.org/docs/interacting-with-geth/rpc/ns-debug).

### zkSync Era

To enable debug and trace APIs on your zkSync Era node, you must deploy a node in the archive mode. The node will expose the `debug_*` API methods. A debug and trace node for zkSync Era is available starting from a [paid plan](https://chainstack.com/pricing/). For the full list of the available debug and trace API methods, see [Debug namespace](https://github.com/matter-labs/zksync-era/tree/main/docs/guides/external-node).
### Optimism

To enable debug and trace APIs on your Optimism node, deploy a [global node](/docs/global-elastic-node) with the debug & trace APIs enabled. For the full list of the available debug and trace API methods, see [Debug namespace](https://geth.ethereum.org/docs/interacting-with-geth/rpc/ns-debug).

### Scroll

To enable debug and trace APIs on your Scroll node, deploy a [global node](/docs/global-elastic-node) with the debug & trace APIs enabled.

### Ronin

To enable debug and trace APIs on your Ronin node, deploy a [global node](/docs/global-elastic-node) with the debug & trace APIs enabled.

### Gnosis Chain

To enable debug and trace APIs on your Gnosis Chain node, you must deploy a dedicated node in the full or archive mode. The node will expose the `debug_*` and `trace_*` API methods. A dedicated debug and trace node for Gnosis Chain is available starting from a [paid plan](https://chainstack.com/pricing/). For the full list of the available debug and trace API methods, see the [Nethermind documentation](https://docs.nethermind.io/nethermind/ethereum-client/json-rpc).

### Fantom

To enable debug and trace APIs on your Fantom node, you must deploy a dedicated node in the full or archive mode. The node will expose the `debug_*` API methods. A dedicated debug and trace node for Fantom is available starting from a [paid plan](https://chainstack.com/pricing/). For the full list of the available debug and trace API methods, see [Debug namespace](https://geth.ethereum.org/docs/interacting-with-geth/rpc/ns-debug).

### Harmony

To enable debug and trace APIs on your Harmony node, you must deploy a dedicated node in the full or archive mode. The node will expose the `debug_*` API methods. A dedicated debug and trace node for Harmony is available starting from a [paid plan](https://chainstack.com/pricing/). For the full list of the available debug and trace API methods, see [Debug namespace](https://geth.ethereum.org/docs/interacting-with-geth/rpc/ns-debug).
## Usage examples You can debug and trace transactions by replaying them in the Ethereum Virtual Machine to get the execution details in the exact same way as they happened on the chain. ### debug\_traceBlockByNumber Trace all transactions included in a block with `debug_traceBlockByNumber` in [block 14,976,695](https://etherscan.io/txsInternal?block=14976695): ```bash cURL curl 'YOUR_CHAINSTACK_ENDPOINT' \ -H 'Content-Type: application/json' \ -d '{"id": 1, "method": "debug_traceBlockByNumber", "params": ["0xE486B7", {"tracer": "callTracer"}]}' ``` where * `0xE486B7` — the number of the block to get the traces of included transactions. For Geth, you need to provide the block number in hex. * YOUR\_CHAINSTACK\_ENDPOINT — your Chainstack node HTTPS endpoint. See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials). See a reverted transaction in the output: ```Json JSON "calls": [ { "type": "DELEGATECALL", "from": "0x68b3465833fb72a70ecdf485e0e4c7bd8665fc45", "to": "0x68b3465833fb72a70ecdf485e0e4c7bd8665fc45", "gas": "0x38ed0", "gasUsed": "0x31147", "input": "0x472b43f30000000000000000000000000000000000000000000000000429d069189e00000000000000000000000000000000000000000000000000160d962fcdfd0bb02400000000000000000000000000000000000000000000000000000000000000800000000000000000000000000b5ec97d9a8a9941a28a88084a1f670c62bd8bf40000000000000000000000000000000000000000000000000000000000000002000000000000000000000000c02aaa39b223fe8d0a0e5c4f27ead9083c756cc2000000000000000000000000b00b2e950d7ef8bdc49377c49676d1550deab982", "error": "execution reverted" ``` ### trace\_block Trace all transactions included in a block with `trace_block` in [block 14,976,695](https://etherscan.io/txsInternal?block=14976695): ```bash cURL curl 'YOUR_CHAINSTACK_ENDPOINT' \ -H 'Content-Type: application/json' \ -d '{"id": 1, "method": "trace_block", "params": ["14976695"]}' ``` where * `14976695` — the number of the block to get the traces of included transactions 
* YOUR\_CHAINSTACK\_ENDPOINT — your Chainstack node HTTPS endpoint. See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials).

See a reverted transaction in the output:

```Json JSON
"action": {
  "from": "0x68b3465833fb72a70ecdf485e0e4c7bd8665fc45",
  "callType": "delegatecall",
  "gas": "0x38ed0",
  "input": "0x472b43f30000000000000000000000000000000000000000000000000429d069189e00000000000000000000000000000000000000000000000000160d962fcdfd0bb02400000000000000000000000000000000000000000000000000000000000000800000000000000000000000000b5ec97d9a8a9941a28a88084a1f670c62bd8bf40000000000000000000000000000000000000000000000000000000000000002000000000000000000000000c02aaa39b223fe8d0a0e5c4f27ead9083c756cc2000000000000000000000000b00b2e950d7ef8bdc49377c49676d1550deab982",
  "to": "0x68b3465833fb72a70ecdf485e0e4c7bd8665fc45",
  "value": "0x429d069189e0000"
},
"blockHash": "0x03a83bf066e81498804f26caf5d49e47820f6a0e92fd1c9cb7dc3b87bf46cf0f",
"blockNumber": 14976695,
"error": "Reverted"
```

### Advanced options on paid plans

Dedicated nodes, archive nodes, and debug & trace APIs are available starting from a [paid plan](https://chainstack.com/pricing/).

# Dedicated Node

Source: https://docs.chainstack.com/docs/dedicated-node

## What is a dedicated node?

A dedicated node is a node deployed exclusively for, and belonging to, a particular customer, with compute resources that are not shared with anyone else. Such nodes are highly customizable and can offer additional services beyond other node types. This means that each dedicated node can be configured to perfectly match the use case and needs of a particular customer.

### Dedicated node plans

**Subscription level** — a Business or Enterprise subscription is required. See the [Pricing Page](https://chainstack.com/pricing/).
## Node clients

For most of the protocols, the popular node clients are available for dedicated nodes:

* Reth
* Geth
* Erigon

For example, for [Web3 data backfills](https://chainstack.com/raw-web3-data-for-your-warehouse/), Reth might be the fastest option.

### Debug and trace API

Regardless of mode, all dedicated nodes will have the debug and trace API enabled for protocols that support this feature. See the [full list of protocols with debug and trace API](/docs/debug-and-trace-apis).

## Hosting regions

Dedicated nodes offer unparalleled flexibility, allowing them to be deployed in various combinations of networks and modes across any hosting regions currently available. With this flexibility, you can seamlessly leverage dedicated nodes in a desired hosting region, ensuring optimal performance and adaptability for your Web3 projects.

# Deep dive into Merkle proofs and eth_getProof

Source: https://docs.chainstack.com/docs/deep-dive-into-merkle-proofs-and-eth-getproof-ethereum-rpc-method

**TLDR**

* Introduces the eth\_getProof method, which returns a Merkle proof of an account’s state and storage at a specific block.
* Explains Ethereum’s Merkle trie structure and how Merkle proofs allow verifying account/storage data without downloading the entire state.
* Demonstrates retrieving a proof for an ERC-20 contract (e.g. USDT) and shows a sample Python script to verify that proof off-chain.
* Reminds that archive nodes are required and that eth\_getProof is supported on both Geth (no block limit) and Erigon (100,000-block limit).

## Main article

`eth_getProof` is an Ethereum JSON-RPC method. It returns the Merkle proof of a specific account and its storage at a given block. If you're not familiar with these terms, don't worry. Most people don't understand them. By the end of this article, you should have a better understanding of what `eth_getProof` is and how to use it.

`eth_getProof` requires an archive node.
## The Merkle trie

`eth_getProof` was proposed in [EIP-1186](https://eips.ethereum.org/EIPS/eip-1186). The proposal summary states the following:

> In order to allow verification of accounts outside the client, we need an additional function delivering us the required proof. These proofs are important to secure Layer2-Technologies.

But how does the proof work? In the Chainstack blog **[Deep dive into eth\_call](https://chainstack.com/deep-dive-into-eth_call/)**, we discuss how Ethereum uses Merkle tries for data storage. There are four types of tries used by Ethereum:

* State trie
* Storage trie
* Receipt trie
* Transaction trie

The state trie of Ethereum is like a real tree in some ways. All the account information is stored at the end of the branches, which are represented by the leaves. Each block references a unique state root hash. A full node has access to the most recent 128 blocks, so it keeps 128 isolated state tries (in reality, it is slightly more complicated than this). When the EVM needs to access the account state, it uses the block number to find the proper root hash, then uses the root hash to determine which trie to query, and finally traverses down the branches to identify the correct leaf containing the account information.

## Merkle proof

Now, here's the question: we have account information (`storageRoot`, `codeHash`, balance, and `nonce`); how do we verify this information’s authenticity and correctness? The answer is by using the state root hash. For Ethereum, the final root hash is derived by hashing every layer of the trie.
[Merkle proofs for offline data integrity](https://ethereum.org/en/developers/tutorials/merkle-proofs-for-offline-data-integrity/) ```javascript stateRootHash //In a very simplified fashion stateRootHash = hash( hash( hash(hash(A)+hash(B))+hash(hash(C)+hash(D)) )+ hash( hash(hash(E)+hash(F))+hash(hash(G)+hash(H)) ) ) ``` The final root hash is computed by hashing all the sub-nodes in the state trie. If any of the account information differs from the original, the resulting root hash will also be different. Therefore, any account information can be verified using an authenticated root hash. For example, consider the following Ethereum account: `0xdac17f958d2ee523a2206206994597c13d831ec7`. At block `16947990`, its account information is as follows: ```javascript Account states 0xdac17f958d2ee523a2206206994597c13d831ec7{ balance : '1', codeHash : '0xb44fb4e949d0f78f87f79ee46428f23a2a5713ce6fc6e0beb3dda78c2ac1ea55', nonce : '1', storageHash : '0x171b13e236ba0fd523b341866fdd3db37aeb8eb9bb578e819cbd983971309e3c' } ``` The state root hash at block `16947990` is: `0x98612efbedabf19646d53f903810e156143d3174a4e985b00cbf01dc257431d1` If any of the account information does not match the original value, the final root hash will not match `0x98612efbedabf19646d53f903810e156143d3174a4e985b00cbf01dc257431d1`. Therefore, we can prove that this information is incorrect. Here comes another question: do we need every account's information to produce a proof? The answer is obviously no. With millions of accounts on the blockchain, it is not feasible to retrieve all these data just to verify one single account, especially for off-chain verification. In fact, to verify account C, we don’t really need the information from leaf nodes A, B, D, E, F, G, H; we just need the hash of the intermediate branch nodes. 
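To make the mechanics concrete, here is a toy binary Merkle tree in Python. It is an illustration only: it hashes with SHA-256 over concatenated digests, while Ethereum actually uses keccak-256 over a hexary Merkle-Patricia trie. The proof idea is the same, though: to verify leaf C, you only need the sibling hashes along its path to the root.

```python
import hashlib

def h(data: bytes) -> bytes:
    # Stand-in hash function; Ethereum uses keccak-256, SHA-256 is used here for illustration.
    return hashlib.sha256(data).digest()

# Leaf hashes for the eight accounts A..H from the diagram.
leaves = [h(name.encode()) for name in "ABCDEFGH"]

def merkle_root(nodes):
    # Hash sibling pairs level by level until a single root remains.
    while len(nodes) > 1:
        nodes = [h(nodes[i] + nodes[i + 1]) for i in range(0, len(nodes), 2)]
    return nodes[0]

root = merkle_root(leaves)

# A proof for leaf C (index 2) is just its sibling hashes, bottom to top:
hashD = leaves[3]
hashAB = h(leaves[0] + leaves[1])
hashEFGH = h(h(leaves[4] + leaves[5]) + h(leaves[6] + leaves[7]))

# Recompute the root from C and the proof alone; no other leaves are needed.
recomputed = h(h(hashAB + h(leaves[2] + hashD)) + hashEFGH)
assert recomputed == root
```

If any leaf or intermediate hash is tampered with, the recomputed root no longer matches, which is exactly how a verifier rejects a bad proof.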
[Merkle proofs for offline data integrity](https://ethereum.org/en/developers/tutorials/merkle-proofs-for-offline-data-integrity/)

If the values of `hashEFGH`, `hashAB`, and `hashD` are known:

```javascript Nodes hashes
const hashEFGH = hash(hash(hash(E)+hash(F))+hash(hash(G)+hash(H)))
const hashAB = hash(hash(A)+hash(B))
const hashD = hash(D)
```

The original equation becomes:

```javascript stateRootHash
//In a very simplified fashion
stateRootHash = hash(hash(hashAB + hash(hash(C) + hashD)) + hashEFGH)
```

This is much easier, isn't it? Essentially, `eth_getProof` returns the values of `hashEFGH`, `hashAB`, and `hashD`. This is what is known as a Merkle proof. Merkle proofs work not only for account information but also for storage information. Storage refers to the data that lives within an account, such as the owner and balance mapping from an ERC-20 token smart contract.

## eth\_getProof

`eth_getProof` returns the Merkle proof for a specific account and its associated storage values. Merkle proofs are used to verify the inclusion of a certain piece of data within a Merkle trie.

### Parameters

* `DATA` — Ethereum address of the account for which the proof is requested
* `ARRAY` — the array of 32-byte storage keys that need to be proven and included. See [eth\_getStorageAt](/reference/ethereum_node_api/accounts_info/eth_getStorageAt) in the API reference.
* `QUANTITY|TAG` — the integer block number identifying the block for which the proof is requested or a string tag such as `"latest"` or `"earliest"`.

### Response

`Object` — an account object with:

* `balance`: `QUANTITY` — the balance of the account. See [eth\_getBalance](/reference/ethereum_node_api/accounts_info/eth_getBalance) in the API reference.
* `codeHash`: `DATA` — the code hash of the account. For a simple account without code, it will return `"0xc5d2460186f7233c927e7db2dcc703c0e500b653ca82273b7bfad8045d85a470"`.
* `nonce`: `QUANTITY` — the nonce of the account.
See [eth\_getTransactionCount](/reference/ethereum_node_api/accounts_info/eth_getTransactionCount). * `storageHash`: `DATA` — SHA3 of the `StorageRoot`. All storage will deliver a Merkle proof starting with this root hash. This hash is used as the starting point to retrieve a Merkle proof for all storage entries associated with the account. * `accountProof`: `ARRAY` — the array of RLP-serialized Merkle trie nodes, starting with the stateRoot-Node, following the path of the SHA3 (address) as key, and can be used to prove the existence and validity of a piece of data in the trie. * `storageProof`: `ARRAY` — the requested array of storage entries, where each entry is represented as an object that includes the following properties: * `key`: `QUANTITY` — the requested storage key * `value`: `QUANTITY` — the storage value * `proof`: `ARRAY` — the array of RLP-serialized Merkle trie nodes, starting with the `storageHash-Node`, following the path of the SHA3 (key) as the path. Source: [EIP-1186 specification](https://eips.ethereum.org/EIPS/eip-1186) See the following code samples: ```bash cURL curl "YOUR_CHAINSTACK_ENDPOINT" \ -X POST \ -H "Content-Type: application/json" \ -d '{"id": 1,"jsonrpc": "2.0","method": "eth_getProof","params": ["0xdac17f958d2ee523a2206206994597c13d831ec7",["0x9c7fca54b386399991ce2d6f6fbfc3879e4204c469d179ec0bba12523ed3d44c"],"latest"]}' ``` ```js web3.js const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT')); async function main() { // Gets the Merkle proof of USDT's smart contract var proof = await web3.eth.getProof( "0xdac17f958d2ee523a2206206994597c13d831ec7", [ //Storage slot hash, value = keccak256(storage address + slot index) "0x9c7fca54b386399991ce2d6f6fbfc3879e4204c469d179ec0bba12523ed3d44c" ], "latest" ) var block = await web3.eth.getBlock("latest") //state root is very important for Merkle verification var stateRoot = block.stateRoot console.log("state root:" + stateRoot) 
console.log(proof) } main(); ``` ```py web3.py from web3 import Web3 httpUrl = "YOUR_CHAINSTACK_ENDPOINT" w3 = Web3(Web3.HTTPProvider(httpUrl)) proof = w3.eth.get_proof('0xdAC17F958D2ee523a2206206994597C13D831ec7', ["0x9c7fca54b386399991ce2d6f6fbfc3879e4204c469d179ec0bba12523ed3d44c"], "latest") print(proof) ``` Sample result: ```shell Shell { address: '0xdAC17F958D2ee523a2206206994597C13D831ec7', accountProof: [ '0xf90211a06a718c2c9da77c253b12d7b2569657901e37bb691718f5dda1b86157ab1dd5eda0e7f19ed5e21bccc8d3260236b24f80ad88b3634f5d005f37b838881f0e12f1bda0abb301291704e4d92686c0f5f8ebb1734185321559b8d717ffdca95c99591976a0d0c2026bfab65c3b95276bfa82af9dec860b485f8857f293c148d63a2182128fa0c98044ec9a1273a218bed58b478277dd39173ad7b8edb95c200423a6bc8fc25fa056e5a55d9ddccdbf49362857200bbb1f042d61187c9f5f9ddcff5d2f1fc984a2a02a5b7200af424114f99a4b5f0a21c19aac82209e431ed80bfde177adb1004bdfa0026e4374f0518ff44a80fa374838ecb86cc64ac93bb710fea6dff4198f947b27a03fea341d87984673ad523177ed52f278bf4d8f97e6531c8ece932aeede4802f4a0bfe2f4a7fcb78f7e9f080dea7b6977fb1d88c441696e4456dad92b9d34ff0f43a02a3eb5c0edb14626c9c629601027bd60178bb2b688a67cea4d179fc432436615a0747355b8e02f3b884b4ffe5cea1619e32515fea064cca98208591af8c744e894a0874253737bae37f020ad3bb7e3292c7c4a63cdc158af6b33aaa4deaef016dccba03d8192bc1fc6aa1548912e763a0b5013a94399cefad7b47cf388873b2b794068a09b67f9737c6028d796bfd1c5da57a6f45824dc891f848ea0e1f8019d1fb5fba8a0aa871f9de8da85960fcd8a22cdf21c27f11e3966c14a6737ffd414b98dda00b280', 
'0xf90211a0d360be1e1da1a0c32bc4c105833bd531e59d110684007b7c50fb2709002973eca0cf6dd1e350a7031b4e2ab49c899fd8bd47551c8565d8fd8d1d7796c83820c3b1a0eb0a88c29bb33989a589156f7bf07d9efc74034dd9d3f5b73385c3b45c3249bea02783c25f97a6ddb8dc07adf4b176991836d39184b1f678adeda832fff15e3664a00a4e288060045e587774d8a64993a7add73068b16863145e1e8eeb4602e18e19a0340851f4046ad1298962d6e47d05c66329549c839c158748aaad7ae00b943aefa085b127bc2a3bd17604283de21b2b3c9aa8f1d4b7b85c94d8105a46fe32c77688a00f531d62b3c5435324c01009c284fe31277e8d38302b75ea01be89f09e205969a00011c8351c0e3d639ac54b9d3a59de630b16a67de8270d7d6064d0a67e93f9cca048780d32b7f2db88650b51c46f46fd0a68795edee1fd5ecee6eb3595741d9669a0c91afd74eaf8e08a997061a62b354e2516fdc494e8e26cc50ceeb8f4a175608ba0e2c07f1b48fab80eecb340f5882e8c7b32ee416e4045c61f1df646a133487303a01a1eff78435a7a29a29463bdc3486ae81364b00bea82ba0fdf67a110770f2261a04f2eb440ba71c72da5fd7f0e439018d6671dc809f747213a1ea755848124e994a074ff9f37fce99daa3ed01dd763076450022996fc729be2cc43c61ec5182c2366a0b80b36b7b621112592f52390b89748d422e9b1517c4b0203b8176a53f89d4a6680', 
'0xf90211a0b25f283bd01a8c8b2418049f9585bc37ff2c1e2e12eab4b7f64ae1f26647389aa02ad96c150d7c3c9c194d30315456852cf6a0a940e0191ae5d04007454823d4e9a0b220cf7a855e2dbcc0b973134e2e119b982d7d40dbb1b27d99816c41f40e829aa049224431da84cbf1b7ae813abcc9ef4c1dfc1760f6ddc5d57f7354bf3cbf6cc4a015191f879ac115b362f0257fd3eedb789537e836574a5b1abf1c9982ebe3bdfea07913c1b6e7282569d2d421e9fa2257f5d1698e93303bc49b941704287d7aaefea0a526576981ce6fd9f2bd48dd2ca6d5272f2fbdc85f0ee35a295f6ccd97ae8765a0313fad407f0c737c29024c02a890c4ecc12d7771c05ab7b435e5087a7cdef4d9a0d2044603cba9d4afdaf6fd2470e729ef3a65242de71276f20d59accfa6b53a7ca0457caacb9370c09b15f7d904adefd2308be94e23669ba5f43241ffff5f438a0aa09fb2dd45a383a0cc088a72b14117e1e9b7d6889218f3ac7631e8de644c5cb76da0c675dcd4d3fb692b514851c6106e2b09e6f5661d56a0a32ae02e2efc1515c235a074949a59ff1bdba87548510d6e404ec4532f4456dfdec8e753d92fda11a3088ba0a328c6ab1ab8f70db4d23e95bb163c13ba0c508f063a5b1393a4efd7ff375f05a0c722fe3ce796998269373cbb2fc229b2bdf2c43c6c2df003309422e043ce6c03a024e69343286eec44fa4744f6907209116e5383cff3fa98fe81ba06e7e8d4366680', 
'0xf90211a00e99ba2198124b8241ea304551fe973215829e2fbc0438d67922707a2a847432a0bb9ce24fd527879c5fe6dbbec1ef5a05ed9d1ca88e921d140bafbec1112f6a6aa099787fd6c7a1989229c4291ef5267335e66152ce417daea46e66d19cb6f81d1ca0e430ff4b8d5621baa5978673344e78b4d8b4df51431b6e63785267c98a24ce18a0bb3e91a825fe3d42ed270a93e9ad1aabd566c40cb28e622f7f1d7ee967c8afd6a0aa364b0056870c6507bc3262a5f851ecb13684088bdb13996d3cb2db401ce3ffa0a3732eba4c7a6e062665ab5be08acb986c3db87556fb138548cc900ff1e56995a026b088e90c9738b8ce16e853107a937a50d52726a24f9f6ce60f587762eb45a2a006c9d5bc3c064b5c1fb565bff91cace9161c64ae653a329610c1dcf34d434429a06c16df2edc70656d322d0c2403bad7d45bc790ffc3e7adeef856d98ea6afc91ba0ae05ed5d6c34b5da29c2e94d7880aeba0906f95f4ec10b132a1d4766a0701c98a01470a86aa350d1ada0c082eac75de828a851f9c8c7c4aa49b1556fe3a5574966a0334eef025100a6da1033710dd98e0475f29d3d7e397caf618ca71c336c5f4f49a0ef0b3abbebcff34d6a8a8f5cdbfbd154ab3452b58dcb09de58ec983644963675a041857e865ec38e200a13bc1a3cb71c7d69aeef7ffdee8be515c9a5b691ce091fa059edd0eb3bbec36bbf38a19802d4646c00ba821ab55fdeea12e15bab62c4e1e580', 
'0xf90211a0af0c7fa65ffcb84c31e68c1cf00e1a20bf8bb497c39883e19b66a99975b03431a0c492cab3623eb7926069794c3c718733e16c5fd0d4a13fb7c752ee9809aac7ada05003cea7132aa70d6f36731d60640a90bcd8f4fd493e4540d5ab1b4943679c0ca0fd700683405b1d2306b586dd3b5b2f92f1692fae20d17cd8b8e59d09b9c6670da01db8683910e46e56e8afeb9fe2b7c35382e5a0914d7b0dd8f0e8cb9981ba7435a0fa7f75d73aa73c35824387bec81388315caa4aee3f4f5562f971beb256c62d49a0ee478e420d83f413e8568dacfd5d83f83a5dd7c45f494b504828e5dc962f0e3ea094b95444a917ac94a675681f6bf851172ad0969801a783a63a71edafed45e7a7a0a0c46586e109abe80fe50361dd582e3f143cb416828239faa43bb2b890869501a0ae051d5d43634c68bf9c97823256cc68580f194dfdbd0c301140c7ca5853430ca0660b9365bb77ec9cdc6eb95516c162dca20727c6f828dbbeb1ae110dde4d3134a09feb1b75e84ff6722e4d837bfb6d207b6ee3b21b86844a01140ce293813b49a1a0ed58a70b04efa3bdc0babe2abfa20824a75d61d52291bfdb5cf08597800764d6a020a2d5d3a83f9e35ad9fd1c448626d90af0eb3efefaa4f2f93207b4096ef5507a0fc8efc4484dcf0a54f0574de9aaade0dcff6ec3599edb9f82efb26b6566dcaeaa032f7e79856db3fd984f72bb2c93d4dab328198d355a61c975fab1f08bdb2046580', 
'0xf90211a0c87222cccea2bf32759fcee9dbaacbe3ea4165dd6184af6773651c5e00e34a8ba0be90e6e5d1a67ab5587779c60ac136d6a96db62b84c04998a5f03a367346abd6a05344aa1c9ca2e3e56bf98fd718ec43728578d148e1967fbaf8bf17a2a073a0bda011a2f9312c3308640a0d6ceeae218747290f23806067456da1d444c65abae437a0b3097a108bfce79af6699da4ae3003cd4929f0b4576aad655c31cb725bde84c7a0c133d3c637e174f36a73c22b1039eb003da6374bc0929321241badb3efa3c4a9a0f13059f2301ad9862ce02e3f7f3f2c9ab78eb30583764d73654f7f1f8b1e86fda06544e3915748b18204e09df75ff20d2fa6bd8121e2e669699012d54590383d6fa070e3a8e093691581d58fadb560b510262a758037632cd8670d3a36df828976b7a062a88a2900544dc76a32255a6b2b2a2eef8fa68279700c00adc7508286702552a0a474aeebd5603dfce46a6ecd1ecd519068dc034a544fde03ac42d4018e60a334a0b7d528fc41c8fdc8ea18c6e7d0099270c777ec1403cf879d1f5134bdc12a6c6ca04440f1242e42c5bfa7c536591ab89c8e84bea417435871c32eef1e25295b20daa06a5dcfe3cc84cff9d3e3c3ae868cfba8f0dd111a90c3f85869dab5b893f96643a026b2fb9dd7d08b0ed2f1c44fbf875011412a384f86f751c92e1013248d4aa371a0c75597b2b789fc4e939b71937390ce9d7d53159431328ac52180eef08ef200f280', '0xf90191a0f0c5b800b542001597f2b7a8e106ac0e2849d2cc1df1727ac35c4ea3965f1c9180a08537f2e248702a6ae2a57e9110a5740f5772c876389739ac90debd6a0692713ea00b3a26a05b5494fb3ff6f0b3897688a5581066b20b07ebab9252d169d928717fa0a9a54d84976d134d6dba06a65064c7f3a964a75947d452db6f6bb4b6c47b43aaa01e2a1ed3d1572b872bbf09ee44d2ed737da31f01de3c0f4b4e1f046740066461a076f251d160b9a02eb0b5c1d83b61c9cdd4f37361705e79a45529bf49801fb824a0774a01a624cb14a50d17f2fe4b7ae6af8a67bbb029177ccc3dd729a734484d3ea05921b8a19aebe4fff5a36071e311778f9b93459183fdf7f6d870b401fa25dcbba0c8d71dd13d2806e2865a5c2cfa447f626471bf0b66182a8fd07230434e1cad2680a0e9864fdfaf3693b2602f56cd938ccd494b8634b1f91800ef02203a3609ca4c21a0c69d174ad6b6e58b0bd05914352839ec60915cd066dd2bee2a48016139687f21a0513dd5514fd6bad56871711441d38de2821cc6913cb192416b0385f025650731808080', 
    '0xf8669d3802a763f7db875346d03fbf86f137de55814b191c069e721f47474733b846f8440101a0146f8675fabbf90b214375a6839a8ddfb33f4c556a26ade8a48c4a82d7055100a0b44fb4e949d0f78f87f79ee46428f23a2a5713ce6fc6e0beb3dda78c2ac1ea55'
  ],
  balance: '1',
  codeHash: '0xb44fb4e949d0f78f87f79ee46428f23a2a5713ce6fc6e0beb3dda78c2ac1ea55',
  nonce: '1',
  storageHash: '0x146f8675fabbf90b214375a6839a8ddfb33f4c556a26ade8a48c4a82d7055100',
  storageProof: [
    {
      key: '0x9c7fca54b386399991ce2d6f6fbfc3879e4204c469d179ec0bba12523ed3d44c',
      value: '0x3499e1d1',
      proof: [Array]
    }
  ]
}
```

## Verifying the proof

[This script](https://colab.research.google.com/drive/1ksKdHyUquq1Ea4RoDhypoLNg5ipxdE7H?usp=sharing) on Google Colaboratory is a working Python sample from the [web3.py official documentation](https://web3py.readthedocs.io/en/stable/web3.eth.html#web3.eth.Eth.get_proof). To run the script, follow the steps below.

First, uncomment and run the following `pip install` commands to install py-trie and [web3.py](https://github.com/ethereum/web3.py):

```bash Shell
# pip install trie # https://github.com/ethereum/py-trie
# pip install web3 # https://web3py.readthedocs.io/
```

Then go to **Runtime > Run all** or press the button beside each cell. Most importantly, don't forget to fill in your Chainstack endpoint in the following line:

```bash Shell
httpUrl = "YOUR_CHAINSTACK_ENDPOINT"
```

You can sign up and get an Ethereum RPC endpoint for free.

Next, run the code to see it in action. Scroll all the way down to the last section of the script, where the code is executed. To begin, we retrieve the latest block information by calling `w3.eth.get_block("latest")`. Next, we obtain the Merkle proof of the account `0xdAC17F958D2ee523a2206206994597C13D831ec7` and one of its storage values by calling `w3.eth.get_proof('0xdAC17F958D2ee523a2206206994597C13D831ec7', ["0x9c7fca54b386399991ce2d6f6fbfc3879e4204c469d179ec0bba12523ed3d44c"], "latest")`.
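Conceptually, the verification the script performs boils down to recombining the proof hashes and comparing the result against the block's state root. Here is a toy Python illustration using the eight-leaf example from earlier; it uses `sha3_256` as a stand-in for Ethereum's Keccak-256 (which is not in the Python standard library) and single letters as dummy leaf data:

```python
import hashlib

def hash(data: bytes) -> bytes:
    # sha3_256 stands in for Keccak-256; this shadows the builtin hash() on purpose
    return hashlib.sha3_256(data).digest()

A, B, C, D, E, F, G, H = (letter.encode() for letter in "ABCDEFGH")

# Full recomputation of the root from all eight leaves
full_root = hash(
    hash(hash(hash(A) + hash(B)) + hash(hash(C) + hash(D)))
    + hash(hash(hash(E) + hash(F)) + hash(hash(G) + hash(H)))
)

# Verification with only the proof nodes plus the leaf we care about (C)
hashEFGH = hash(hash(hash(E) + hash(F)) + hash(hash(G) + hash(H)))
hashAB = hash(hash(A) + hash(B))
hashD = hash(D)
proof_root = hash(hash(hashAB + hash(hash(C) + hashD)) + hashEFGH)

# The recombined proof matches the independently computed root
assert proof_root == full_root
```

If the recombined hash matches the known root, the leaf `C` is proven to be part of the tree without ever downloading the other leaves.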
Then, we verify the proof by calling `verify_eth_get_proof(proof, block.stateRoot)`. If the proof is valid, `verify_eth_get_proof` returns `true` to the `isValidProof` variable. Otherwise, it returns a "Failed to verify account proof" error.

## Q\&A

This method allows for efficient verification of the network's state without having to download the entire state trie for a block. This can be particularly useful for building lightweight clients or auditing smart contract state changes.

There are many reasons why Ethereum uses a Merkle trie for data storage, but one of the most important is its verifiability. As a famous blockchain quote goes, "Don't trust, verify." You can learn more about the Merkle trie in the [Ethereum documentation article](https://ethereum.org/en/developers/docs/data-structures-and-encoding/patricia-merkle-trie/).

The implementation determines this. For Ethereum, the trie is hexary, which means that there are 16 entries for every node.

Yes, Ethereum keeps a separate storage trie for each state. The storage trie is the part of the state trie that stores contract state. Each contract on the blockchain has its own storage trie, which is a key-value store that maps 256-bit keys to 256-bit values. This allows smart contracts to persist data across transactions and to maintain their own state. When a contract is executed and its state is updated, the new state is stored in a new storage trie, which is then added to the state trie. This creates a new state root, which represents the updated state of the blockchain.

RLP (recursive length prefix) serialization is a technique widely used by Ethereum clients for serialization purposes. In a Merkle trie, RLP is used to convert key-value pairs (the node entries) into a long string, which is then used for hashing. You can read more in [Recursive-length prefix (RLP) serialization](https://ethereum.org/en/developers/docs/data-structures-and-encoding/rlp/).
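To make the RLP rules concrete, here is a minimal Python encoder for byte strings and (nested) lists, following the scheme described in that article: short payloads get a one-byte length prefix, longer ones a multi-byte prefix. This is an illustration only, not a production codec:

```python
def rlp_encode(item):
    # Minimal RLP: encodes bytes and (nested) lists of bytes
    if isinstance(item, bytes):
        if len(item) == 1 and item[0] < 0x80:
            return item  # a single byte below 0x80 encodes as itself
        return _length_prefix(len(item), 0x80) + item
    if isinstance(item, list):
        payload = b"".join(rlp_encode(x) for x in item)
        return _length_prefix(len(payload), 0xC0) + payload
    raise TypeError("this sketch encodes only bytes and lists")

def _length_prefix(length, offset):
    if length <= 55:
        return bytes([offset + length])  # short form: offset + payload length
    # long form: offset + 55 + number of length bytes, then the length itself
    length_bytes = length.to_bytes((length.bit_length() + 7) // 8, "big")
    return bytes([offset + 55 + len(length_bytes)]) + length_bytes

# Worked examples from the Ethereum RLP documentation
assert rlp_encode(b"dog") == b"\x83dog"
assert rlp_encode([b"cat", b"dog"]) == b"\xc8\x83cat\x83dog"
assert rlp_encode(b"") == b"\x80"
assert rlp_encode([]) == b"\xc0"
```

The trie nodes returned in `accountProof` and `storageProof` are exactly such RLP-encoded strings, which is why the proof entries in the sample result above are opaque hex blobs.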
Yes, `eth_getProof` is available on the Erigon client, but only for the most recent 100,000 blocks. On Geth, there is no 100,000-block limit. Note that you need an archive node to use `eth_getProof`.

## Conclusion

This concludes the article. Congratulations! You now know how to use the `eth_getProof` method to retrieve a Merkle proof and verify a user's ERC-20 token balance. When using `eth_getProof`, remember to provide the correct block number, account address, and key path to retrieve the desired proof.

### See also

### About the author

Developer Advocate @ Chainstack

BUIDLs on Ethereum, zkEVMs, The Graph protocol, and IPFS

[](https://twitter.com/wuzhongzhu) [](https://www.linkedin.com/in/wuzhong-zhu-44563589/) [](https://github.com/wuzhong-zhu)

# Deploy a subgraph

Source: https://docs.chainstack.com/docs/deploy-a-subgraph

**TLDR:**

* Set up your subgraph by adding it to the Chainstack console, configuring it via Graph CLI (install, init, build), and deploying it with the provided deployment command.
* Configure `schema.graphql` and `subgraph.yaml` carefully to define the desired indexing entities and starting block.
* Track deployment status in the console (sync in progress, up to date, or failed) and use logs for troubleshooting.
* Query your subgraph using either the CLI's **Query URL** or the **GraphQL UI URL** to seamlessly fetch on-chain data.

## Set up a subgraph

To set up a subgraph, complete the following steps:

## Add a subgraph

In [Subgraphs](https://console.chainstack.com/subgraphs), you can view a list of all existing subgraphs and can filter the view using the **Protocol** drop-down list. You can create a new subgraph using the **Add subgraph** button and view the details of a subgraph by clicking on the subgraph name.

Subgraphs must be associated with a project; if you don't already have a project to add the subgraph to, see [create a project](/docs/manage-your-project#create-a-project).

To add a subgraph:

Click **Add subgraph**.
The Add subgraph page is displayed.

In the **Choose network** section:

* Choose a **Blockchain protocol**.
* Choose the **Network**. Currently, **Mainnet** is supported.
* Choose the **Type**.
* Click **Next**. The Create subgraph section is displayed.

In the **Create subgraph** section:

* Enter a **Name** for the subgraph.
* Select the **Project** that you want to assign your subgraph to.
* Click **Add subgraph**. The details page of the new subgraph is displayed.

The subgraph details page includes information such as the **Owner**, **Creation date**, **Region**, and **Protocol**. You can view the **Metrics** about the requests made in the subgraph, view the **Subgraph Query URLs**, and the **Subgraph Deployment command**, which you require to deploy the subgraph.

## Create and configure a subgraph

Before you begin: you must have the Graph CLI installed globally to build and deploy subgraphs. You can install it with npm as follows:

```bash Shell
npm install -g @graphprotocol/graph-cli
```

To check that the Graph CLI was installed correctly, run:

```bash Shell
graph --version
```

To create and configure a subgraph:

Open a new directory in your terminal and run:

```bash Shell
graph init
```

Configure all the required parameters that are displayed in the CLI. The Graph will download your smart contract's ABI and install all the dependencies by running `npm install` automatically.

Set up your `schema.graphql` file to define all the entities and key-value pairs that you want to query.

Go to your manifest file (`subgraph.yaml`) and make sure all the deployment parameters are correctly defined. We recommend that you only start indexing data from the block number of the first transaction that you want to track, as this can save a lot of indexing time.
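For illustration, here is what a minimal `schema.graphql` entity and the matching `subgraph.yaml` data source fields might look like for the transaction entity queried later in this guide (the contract address, ABI name, and start block below are hypothetical placeholders):

```graphql schema.graphql
type MyTransaction @entity {
  id: ID!
  _from: Bytes!
  _to: Bytes!
  _value: BigInt!
}
```

```yaml subgraph.yaml
dataSources:
  - kind: ethereum
    name: MyContract
    network: mainnet
    source:
      address: "0x0000000000000000000000000000000000000000" # hypothetical contract
      abi: MyContract
      startBlock: 15000000 # block of the first transaction you want to track
```

Setting `startBlock` to the first relevant block is what saves the indexing time mentioned above; without it, the indexer scans the chain from genesis.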
To generate AssemblyScript types for the entities defined in your schema file, in your root directory, run:

```bash Shell
graph codegen
```

When your mappings file is configured, run:

```bash Shell
graph build
```

When your subgraph compiles successfully, you are now ready to deploy your subgraph.

## Deploy a subgraph

To deploy your new subgraph:

In **Subgraphs**, open the details page of your new subgraph and copy the **Deployment command**, which will have the following format:

```bash Shell
graph deploy --node https://api.graph-eu.p2pify.com/3c6e0b8a9c15224a8228b9a98ca1531d/deploy --ipfs https://api.graph-eu.p2pify.com/3c6e0b8a9c15224a8228b9a98ca1531d/ipfs my_subgraph_v1_0
```

Paste and run the command in your CLI. Enter a version label for your subgraph (you can use any number/letter character combination).

If the subgraph is successfully deployed, you will receive a response like the following example:

```bash Shell
Deployed to https://chainstack.com/subgraphs/SG-123-456-789

Subgraph endpoints:
Queries (HTTP): https://ethereum-mainnet.graph-eu.p2pify.com/3c6e0b8a9c15224a8228b9a98ca1531d/my_subgraph_v1_0
```

In the subgraph details page, the status of the subgraph will change to **Deployed. Initial sync in progress** and will continuously provide feedback on the progress of the synchronization process. When it completes, the status will change to **Up to date**. If the subgraph is in the **Failed** state, click **Logs** to view four levels of messages generated by the indexer and troubleshoot the issue.

## Query a subgraph

To query a subgraph, you can choose from either of the following **Subgraph query** options in the subgraph details page:

* **Query URL** — use this URL to query in the CLI.
* **GraphQL UI URL** — use this URL to query in the GraphQL UI.

### Query URL in CLI

To query your subgraph in the CLI, copy the **Query URL** from the subgraph details page and add it to the query.
For example, to query transactions:

```bash cURL
curl -g \
  -X POST \
  -H "Content-Type: application/json" \
  -d '{"query":"{myTransactions(first: 5) { id _value _from _to}}"}' \
  https://ethereum-mainnet.graph-eu.p2pify.com/3c6e0b8a9c15224a8228b9a98ca1531d/my_subgraph_v1_0
```

### GraphQL UI URL query in browser

To query the subgraph using the GraphQL UI, copy the **GraphQL UI URL** from the subgraph details page and paste it in your browser. The GraphQL UI is displayed, where you can enter the details of your query.

# Develop a Battleship Game Using Zero-Knowledge Concepts on Ethereum

Source: https://docs.chainstack.com/docs/develop-a-battleship-game-using-zero-knowledge-concepts-on-ethereum

**TLDR**

* Showcases how to build a Battleship game on Ethereum using a lightweight cryptographic technique for hidden coordinates.
* Stores hashed ship positions as ECDSA signatures so only the owner can prove hits or misses.
* Demonstrates verifying these signatures on-chain to ensure game fairness and incomplete information.
* Provides a complete Solidity contract and tests with Hardhat for a two-player Battleship match.

## Main article

Zero-knowledge proof is a method by which one party (the prover) can prove to another party (the verifier) that they know a value, without conveying any information apart from the fact that they know the value. This concept is commonly used in the zero-knowledge chains that have become more common in the blockchain ecosystem lately. This article will guide you through how you can develop a blockchain game using this principle.

Learn more about zero-knowledge based blockchains by reading [zkEVM and zkRollups Explained](https://chainstack.com/zkevm-and-zkrollups-explained/).

## What is Battleship

Battleship is a guessing game played between players on separate grids representing their fleets of ships. Each player has their own grid, and they place their ships on the grid in secret.
The objective of the game is to sink the opponent's fleet of ships by correctly guessing the locations of their ships on the grid. Players take turns calling out coordinates, any `(x,y)`, on the opponent's grid, in an attempt to find the location of the opponent's ships. If a player guesses a coordinate where a ship is located, the opponent must respond with "hit". If a player guesses a coordinate where there is no ship, the opponent must respond with "miss". The player who sinks all of their opponent's ships first wins the game.

## What we'll be doing

In this tutorial, we'll be building our very own game of Battleship on Ethereum. There is a slight hitch, however. Battleship is a [game of incomplete information](http://gametheory101.com/courses/game-theory-101/introduction-to-incomplete-information/), and that doesn't sit nicely with the public, permissionless nature of Ethereum (and other public blockchains like Polygon). While we can declare state variables as private, [anyone could still access their values](https://ethereum.stackexchange.com/questions/44893/how-do-i-see-the-value-of-a-string-stored-in-a-private-variable). How do we store private data on a public blockchain? It's a catch-22, or is it?

## Our approach

There are multiple approaches to privacy on the blockchain thus far. One particularly promising one has been [ZKPs](https://medium.com/@ashwin.yar/zkps-the-ultimate-shield-for-your-privacy-in-web3-a-beginners-introduction-to-the-future-of-e60918da01b4) (zkSNARKs, zkSTARKs, and so on). However, in this tutorial, we'll be going lightweight. We need to create a unique identifier for each player's ship coordinates that would be impossible for anybody else to guess or reverse, but which can be verified in the future to reveal the initial data (the ship's coordinate). We need a one-way function, and public-private signatures are a perfect match for this use case. Public-private signatures are also known as digital or cryptographic signatures.
It is the same technology behind authorizing transactions on the blockchain. We sign an arbitrary piece of data (a ship coordinate in this case) with our private key, and this generates a signature. We can then retrieve the corresponding public key/address of a signature and confirm if it matches the expected value. Signing a message with a private key is deterministic, meaning signing the same data with the same private key will always produce the same signature.

Here's a brief overview of how it all fits together:

* *Player1* signs their ship coordinates, and we store those signatures in our smart contract.
* *Player2* declares which coordinates they've shot at.
* *Player1* signs all the "shot" coordinates, and we check if such a signature exists in our smart contract. No? That was a miss. Yes? A ship has been hit!

A couple of things to consider:

* **Could *Player1* sign the wrong coordinates and provide us with inaccurate data?** Possibly; that's why we verify the signature in our smart contract to make sure they signed the right data.
* **Since we treat all shots as the same regardless of who shot it, could *Player1* sink their own ships?** In this implementation, yes. We could make it otherwise, but I think a bomb is a bomb, regardless of where it blows. You decide if that's a bug or a feature.

## Diving in

This is the source code of a smart contract in the Solidity programming language for our Battleship game. For now, I restricted it to a two-player game, but with a little bit of work, it could support many more. Each player gets 10 coordinates (blocks) to form their ships.

I think it would be best to envision the game as multiple states.

### State 1: Game start

* The smart contract gets deployed with a whitelist of player addresses allowed to play in that game. This makes it easier to track when all players have joined and the game has started.
* Players pick their ship coordinates, sign them, and send the signatures to the smart contract.
Once they have done that, they have joined the game.

* When all players have joined the game, we're off to the races. Players cannot change their signatures (ship positions), nor can they take shots at this stage.

### State 2: Game in progress. Turn in progress

* Turn 1 or turn 10, it doesn't matter much. Each player can pick a single coordinate to shoot at. We store this coordinate in our smart contract.
* We make sure everyone has taken a shot before marking the turn as over.

### State 3: Game in progress. Turn over

* Players retrieve all shots for that turn, sign them, and submit the signatures back to the smart contract.
* For each signature, we verify that the player signed the right data. If the signature exists in our list of known ships, it's a hit!

### State 4: Game over

* For a player to win, every other player must have lost all their ships. It's a draw when nobody has any ships left.

Time to get into the code.

### If you'd like to follow along:

Find the repository on the Chainstacklabs GitHub and find the instructions: [Game of Battleship Solidity and Hardhat](https://github.com/chainstacklabs/zk-battleship-game-hardhat)

Heads up! It's a TypeScript project, but if you're not familiar with TypeScript, a basic JavaScript understanding should be more than enough to follow along.

```shell Shell
git clone https://github.com/chainstacklabs/developer-hub-content
```

Additionally, install [Hardhat](https://hardhat.org/tutorial/creating-a-new-hardhat-project) and the dependencies.

```shell Shell
npm install --save-dev hardhat
```

```shell Shell
npm install
```

And you will also need a Chainstack node to deploy the contract on Sepolia, which is the recommended chain to test DApps.
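Before diving into the Solidity, the commit-and-verify flow across these states can be sketched without any blockchain machinery. The snippet below substitutes a keyed hash commitment (HMAC) for the ECDSA signatures the game actually uses, since Python's standard library has no ECDSA support; the secret key and ship coordinates are made-up values, but the idea is the same: only the key holder can produce the identifier for a coordinate, and a shot is a hit when its identifier is already stored.

```python
import hashlib
import hmac

def commit(secret_key: bytes, x: int, y: int) -> bytes:
    # Deterministic keyed commitment: same key + same coordinate -> same output,
    # and the coordinate cannot be recovered from the output alone
    return hmac.new(secret_key, bytes([x, y]), hashlib.sha3_256).digest()

player1_key = b"player1-private-key"  # hypothetical secret

# State 1: Player1 commits to ship coordinates; the contract stores the commitments
ships = {commit(player1_key, x, y) for (x, y) in [(1, 1), (1, 2), (1, 3)]}

# State 2: Player2 declares the coordinates they shot at
shots = [(1, 2), (5, 5)]

# State 3: Player1 "signs" each shot; a commitment already in storage is a hit
results = {shot: commit(player1_key, *shot) in ships for shot in shots}
assert results[(1, 2)] is True   # hit: (1, 2) was committed in state 1
assert results[(5, 5)] is False  # miss: no commitment matches (5, 5)
```

The on-chain version adds one crucial step this sketch omits: the contract recovers the signer's address from each signature to make sure the player really signed the declared shot, rather than some other coordinate.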
Learn more about the differences between Sepolia and Goerli: [Goerli Sepolia transition](/docs/goerli-to-sepolia-transition)

Deploy a node with Chainstack:

## Verifying digital signatures

Since all signatures created on Ethereum make use of the ECDSA curve, there have been suggestions to have a [native precompiled function for verifying signatures](https://eips.ethereum.org/EIPS/eip-665). However, that's not yet available, and we have to roll our own solution. We need a smart contract to derive the corresponding address of the public key used to create the digital signature. We'll be using the SigVerifier contract from the official Solidity documentation with some modifications.

```sol sol
// SPDX-License-Identifier: GPL-3.0-or-later
pragma solidity ^0.8.0;

contract SigVerifier {
    function RecoverSigner(
        bytes32 _hashedMessage,
        uint8 _v,
        bytes32 _r,
        bytes32 _s
    ) public pure returns (address) {
        bytes memory prefix = "\x19Ethereum Signed Message:\n32";
        bytes32 prefixedHashMessage = keccak256(
            abi.encodePacked(prefix, _hashedMessage)
        );
        address signer = ecrecover(prefixedHashMessage, _v, _r, _s);
        return signer;
    }

    function SplitSignature(
        bytes memory sig
    ) public pure returns (uint8 v, bytes32 r, bytes32 s) {
        require(sig.length == 65, "Invalid Signature");
        assembly {
            // first 32 bytes, after the length prefix.
            r := mload(add(sig, 32))
            // second 32 bytes.
            s := mload(add(sig, 64))
            // final byte (first byte of the next 32 bytes).
            v := byte(0, mload(add(sig, 96)))
        }
        return (v, r, s);
    }
}
```

Digital signatures in Ethereum are based on the ECDSA curve. Each signature contains three parameters, `r`, `s`, and `v`. We can derive these parameters from a signature by splitting it into the requisite number of bytes. Retrieving the public address from these parameters is as simple as invoking Solidity's `ecrecover` function. This is a relatively costly process and consumes quite a bit of gas compared to normal transactions.
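The same byte slicing that `SplitSignature` performs onchain can be sketched offchain in a few lines of Python; a 65-byte signature is simply `r` (32 bytes), then `s` (32 bytes), then `v` (1 byte). The sample signature below is made of dummy bytes, not a real signature:

```python
def split_signature(sig_hex: str):
    # Split a 65-byte Ethereum signature hex string into r, s, v
    sig = bytes.fromhex(sig_hex.removeprefix("0x"))
    assert len(sig) == 65, "Invalid Signature"
    r, s, v = sig[:32], sig[32:64], sig[64]
    return r, s, v

# 65 dummy bytes: 32 x 0x11 for r, 32 x 0x22 for s, 0x1b (27) for v
sig = "0x" + "11" * 32 + "22" * 32 + "1b"
r, s, v = split_signature(sig)
assert r == b"\x11" * 32
assert s == b"\x22" * 32
assert v == 27
```

Splitting offchain and passing `r`, `s`, `v` as separate arguments would save a little gas, at the cost of a slightly clunkier function signature; the contract below keeps the onchain split for convenience.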
To optimize the gas cost, we made use of assembly in our smart contract to split the signature. We could also have split the signature offchain with JavaScript or another programming language.

## The smart contract

```sol sol
// SPDX-License-Identifier: GPL-3.0-or-later
pragma solidity ^0.8.10;

import "contracts/Verify.sol";

contract BattleShipGame is SigVerifier {
    // Admin
    address payable public owner;

    // Max Number of players participating in the game
    uint public constant NO_PLAYERS = 2;
    // Number of ship pieces a player can have, to build their ships
    uint public constant NO_SHIP_PIECES = 10;

    // All the players participating in the game
    mapping(address => bool) public players;
    address[] public playersAddress; // We use an array because it's easier to iterate than a mapping

    // Player ships
    mapping(address => mapping(bytes => bool)) ships;
    // Player ships that have been destroyed
    mapping(address => Coordinate[]) destroyedShips;

    // Players who have lost all their ships
    mapping(address => bool) public destroyedPlayers;
    uint public numberOfDestroyedPlayers;

    mapping(address => Coordinate) public playerShots;
    mapping(address => bool) public playerHasPlayed;
    mapping(address => bool) public playerHasPlacedShips;
    mapping(address => bool) public playerHasReportedHits;

    bool public isGameOver;

    struct Coordinate {
        uint8 x;
        uint8 y;
    }

    struct ShipShotProof {
        bytes signature;
        // The address of the player that shot
        // at this coordinate
        address shotBy;
    }

    event ShotReport(
        Coordinate coord,
        address target,
        address shotBy,
        bool isHit
    );
    event PlayerJoinedGame(address player, uint playerIndex);
    // Emitted when a player loses all his ships
    event PlayerLost(address player);

    constructor(address[] memory _playersAddress) payable {
        require(
            _playersAddress.length == NO_PLAYERS,
            "_playersAddress does not match the number of expected players"
        );
        for (uint i = 0; i < _playersAddress.length; i++) {
            address playerAddress = _playersAddress[i];
            players[playerAddress] = true;
            playersAddress.push(playerAddress);
            emit PlayerJoinedGame(playerAddress, i);
        }
        owner = payable(msg.sender);
    }

    function joinGame(bytes[] memory _playerShips) public {
        require(!isGameOver, "Game is over");
        require(players[msg.sender], "Address is not a part of this game");
        require(
            _playerShips.length == NO_SHIP_PIECES,
            "Number of ship pieces does not match the expected value"
        );
        require(
            !playerHasPlacedShips[msg.sender],
            "Player has already placed ships"
        );
        for (uint i = 0; i < _playerShips.length; i++) {
            bytes memory shipHash = _playerShips[i];
            require(
                !ships[msg.sender][shipHash],
                "User has already placed a ship on this tile."
            );
            ships[msg.sender][shipHash] = true;
        }
        playerHasPlacedShips[msg.sender] = true;
    }

    function takeAShot(Coordinate memory _coord) public {
        require(isGameStarted(), "Game hasn't started");
        require(!isGameOver, "Game is over");
        require(players[msg.sender], "msg.sender is not a player in this game");
        require(
            !playerHasPlayed[msg.sender],
            "Player has made a move for this turn"
        );
        playerShots[msg.sender] = _coord;
        playerHasPlayed[msg.sender] = true;
    }

    function reportHits(ShipShotProof[] memory _shotSignatures) public {
        require(isGameStarted(), "Game hasn't started");
        require(!isGameOver, "Game is over");
        require(isTurnOver(), "All players have not played for this turn.");
        require(
            _shotSignatures.length <= NO_PLAYERS,
            "No way you can validate more shots than there are players!"
        );
        for (uint i = 0; i < _shotSignatures.length; i++) {
            ShipShotProof memory shotProof = _shotSignatures[i];
            (bool _isHit, Coordinate memory coord) = isHit(shotProof);
            if (_isHit) {
                destroyPlayerShip(msg.sender, coord);
            }
            emit ShotReport({
                coord: coord,
                target: msg.sender,
                shotBy: shotProof.shotBy,
                isHit: _isHit
            });
        }
        playerHasReportedHits[msg.sender] = true;
    }

    function isHit(
        ShipShotProof memory _hitProof
    ) internal view returns (bool, Coordinate memory) {
        Coordinate memory _playerShot = playerShots[_hitProof.shotBy];
        bytes32 _calculatedHash = keccak256(
            abi.encodePacked(_playerShot.x, _playerShot.y)
        );
        (uint8 v, bytes32 r, bytes32 s) = SplitSignature(_hitProof.signature);
        address signer = RecoverSigner(_calculatedHash, v, r, s);
        require(
            signer == msg.sender,
            "msg.sender and derived message signer do not match"
        );
        // A ship piece at this coordinate exists
        return (ships[msg.sender][_hitProof.signature] == true, _playerShot);
    }

    function destroyPlayerShip(
        address _player,
        Coordinate memory _coord
    ) internal {
        destroyedShips[_player].push(_coord);
        // All of a player's ships have been destroyed
        if (destroyedShips[_player].length == NO_SHIP_PIECES) {
            destroyedPlayers[_player] = true;
            numberOfDestroyedPlayers++;
            emit PlayerLost(_player);
        }
    }

    // Check if all players have played for this turn
    function isTurnOver() public view returns (bool) {
        for (uint i = 0; i < playersAddress.length; i++) {
            address _playerAddress = playersAddress[i];
            if (!playerHasPlayed[_playerAddress]) {
                return false;
            }
        }
        return true;
    }

    function hasReportedShots() public view returns (bool) {
        for (uint i = 0; i < playersAddress.length; i++) {
            address _playerAddress = playersAddress[i];
            if (!playerHasReportedHits[_playerAddress]) {
                return false;
            }
        }
        return true;
    }

    // End the current turn and reset all variables
    function endTurn() public returns (bool) {
        require(
            isTurnOver(),
            "The turn is not yet over, some players are yet to shoot"
        );
        require(hasReportedShots(), "Some players are yet to report hits");
        // Do we have a winner?
        // Only one player is left standing
        // It's also possible that everybody destroyed everybody. (Edge case)
        if (numberOfDestroyedPlayers >= (NO_PLAYERS - 1)) {
            isGameOver = true;
        }
        for (uint i = 0; i < playersAddress.length; i++) {
            address _playerAddress = playersAddress[i];
            playerHasPlayed[_playerAddress] = false;
            playerHasReportedHits[_playerAddress] = false;
            playerShots[_playerAddress] = Coordinate({x: 0, y: 0});
        }
        return true;
    }

    function isGameStarted() public view returns (bool) {
        for (uint i = 0; i < playersAddress.length; i++) {
            address _playerAddress = playersAddress[i];
            if (!playerHasPlacedShips[_playerAddress]) {
                return false;
            }
        }
        return true;
    }

    function getWinner() public view returns (address winner) {
        require(isGameOver, "The game isn't over yet");
        for (uint i = 0; i < playersAddress.length; i++) {
            address _playerAddress = playersAddress[i];
            if (!destroyedPlayers[_playerAddress]) {
                return _playerAddress;
            }
        }
        // No winner
        return address(0);
    }
}
```

Here is a summary of what the code does:

* The game has a fixed number of players (two), each of whom can place 10 ship pieces on any coordinate.
* Players take turns shooting at their opponent's grid by specifying the coordinates of a square. If a ship piece is on that square, it is destroyed.
* If a player loses all of their ships, the game is over and the other player wins.
* The contract uses the `SigVerifier` contract from the `Verify.sol` file, which provides functions for verifying digital signatures.
* The contract emits events whenever a player takes a shot or loses all of their ships.

The contract has several mappings to keep track of the state of the game. If you're not familiar with EVM programming, mappings might seem an odd choice to represent our data model. However, to keep gas costs down and optimize UX, we want to avoid looping through and modifying large arrays in our smart contract.
Here are some of the key mappings:

* `players` — a mapping of addresses to booleans indicating whether each address is a player in the game.
* `ships` — a mapping of each player's address to a mapping of ship hashes to booleans. The `ships` mapping keeps track of which ship pieces each player has placed on the grid.
* `destroyedShips` — a mapping of each player's address to an array of `Coordinate` structs. The `destroyedShips` mapping keeps track of which of a player's ships have been destroyed.
* `playerShots` — a mapping of each player's address to a `Coordinate` struct. The `playerShots` mapping keeps track of the coordinates of the square each player has shot at during their turn.
* `playerHasPlayed` — a mapping of each player's address to a boolean indicating whether that player has taken a shot during the current turn.
* `playerHasPlacedShips` — a mapping of each player's address to a boolean indicating whether that player has placed all their ships on the grid.
* `playerHasReportedHits` — a mapping of each player's address to a boolean indicating whether that player has reported all the hits from their opponent's shots during the current turn.

The contract has several functions:

* `joinGame` — allows a player to join the game and place their ships on the grid.
* `takeAShot` — allows a player to take a shot at their opponent's grid.
* `reportHits` — allows a player to report the hits from their opponent's shots.
* `isHit` — determines whether a shot hits a ship, returning a boolean and the coordinate of the shot.
* `destroyPlayerShip` — an internal function that adds a destroyed ship coordinate to the `destroyedShips` mapping and checks whether the player has lost all their ships. If so, the game is over.

The contract also has several state variables:

* `owner` — the address of the contract owner.
* `NO_PLAYERS` — a constant that specifies the number of players in the game.
* `NO_SHIP_PIECES` — a constant that specifies the number of ship pieces each player can place on the grid.
* `playersAddress` — an array of addresses representing the players in the game.
* `numberOfDestroyedPlayers` — a counter of the number of players who have lost all their ships.
* `isGameOver` — a boolean indicating whether the game is over.

Going through the [unit tests for the battleship contract](https://github.com/chainstacklabs/zk-battleship-game-hardhat/blob/main/test/Battleship.ts) will provide a lot of insight into its expected behavior and how end users would interact with it.

## Interacting with the smart contract

To be able to play a game with the smart contract, we need to create valid signatures that can be verified with Ethereum's [`ecrecover`](https://soliditydeveloper.com/ecrecover) method. It can be a little tricky to get signatures right with ethers.js, but here's one way:

```javascript Javascript
import { ethers, Signer } from "hardhat";

export type ShipShotProof = {
  signature: string;
  // The address of the player that shot
  // at this coordinate
  shotBy: string;
};

export type Coordinate = {
  x: number;
  y: number;
};

export type Ship = Array<Coordinate>;

export async function signShipCoordinates(ships: Array<Ship>, signer) {
  let signedShips = [];
  for (const ship of ships) {
    let signedShip = [];
    for (const coord of ship) {
      let { flatSig } = await signCoordinate(coord, signer);
      signedShip.push(flatSig);
    }
    signedShips.push(signedShip);
  }
  return signedShips;
}

export async function signCoordinate(coord: Coordinate, signer: Signer) {
  let hashedCoord = ethers.utils.solidityKeccak256(
    ["uint8", "uint8"],
    [coord.x, coord.y]
  );
  hashedCoord = ethers.utils.arrayify(hashedCoord);
  let flatSig = await signer.signMessage(hashedCoord);
  return { flatSig, hashedCoord };
}

export async function generateShipShotProof(
  player: Signer,
  allPlayers: Array<string>,
  battleshipGame: any
) {
  let shotReports = [];
  for (const playerAddress of allPlayers) {
    let playerShot = await battleshipGame.playerShots(playerAddress);
    let { flatSig } = await signCoordinate(playerShot, player);
    let report: ShipShotProof = {
      signature: flatSig,
      shotBy: playerAddress,
    };
    shotReports.push(report);
  }
  return shotReports;
}
```

In brief:

* `signShipCoordinates(ships: Array<Ship>, signer)` takes an array of ships and a `Signer` object (wallet) as arguments, and returns an array of signed ships.
* `generateShipShotProof(player: Signer, allPlayers: Array<string>, battleshipGame: any)` generates a list of proofs for each reported shot coordinate.

To prevent arbitrary signed messages from being usable as signed transactions, messages are prefixed with `"\x19Ethereum Signed Message:\n"` + the length of the message. Taking a look at our `SigVerifier` contract, we hardcoded the length of the message to be 32. We do that because we always hash our messages and data, and the length of a hash is always 32 bytes.

### Stats for nerds

Here are some screenshots of the test cases running through, and the gas reports. Apparently, `joinGame` is our most expensive function.

## Conclusion

We're finally done! We've built a complete game of incomplete information on a public blockchain. While we've built a game for recreation, these concepts could easily be applied to other ideas and projects. For example, we could create an anonymous NFT marketplace, where the owners of NFTs remain private, but they can verify their identity and sign off on bids.

### Further reading

## Improvements

* We don't actually restrict players' ships to a board size. That doesn't seem quite practical. To do this, we'd have to somehow prove the coordinates are valid, without showing anyone. While beyond the scope of this article, it is a valid use case for [ZKPs](/docs/develop-a-battleship-game-using-zero-knowledge-concepts-on-ethereum#our-approach). I created a [circom circuit](https://gist.github.com/TobeTek/788aa89e5a483b5eeb1e7272ee1369f7) that does just that.
* We could add support for more players.
We'd need to add a check to prevent destroyed players from being able to play.

* We could use modifiers in the smart contract to track game state (`isTurnOver`, `isGameOver`, etc.). I chose plain reverts for simplicity.
* Our game doesn't yet have a UI. An interactive web UI would be a great addition!

### See also

### About the author

Experienced with Django, micro-services, and RESTful APIs
Working on AI and ML projects
Ask me about data warehousing and analytics, system design & optimization, and automation programs

[](https://github.com/TobeTek) [](https://www.linkedin.com/in/emmanuel-katchy)

# Ethers.js: Enhancing Blockchain Data Reliability with FallbackProvider

Source: https://docs.chainstack.com/docs/enhancing-blockchain-data-reliability-with-ethers-fallbackprovider

**TLDR**

* The FallbackProvider in ethers.js wraps multiple JSON-RPC providers to enhance data reliability and reduce single-node dependency.
* It uses priorities, weights, and timeouts to determine consensus from multiple sources.
* Configurable quorums ensure you only trust data that a majority or required number of providers agree on.
* This method helps avoid inconsistent results from forks, latency issues, or out-of-sync nodes.

## Main article

In the world of blockchain technology, where decentralization and transparency are paramount, ensuring the reliability and consistency of data is crucial. Sometimes, you might use different endpoints from different providers. However, they may be subject to network latency, temporary forks, or being out of sync with the rest of the network.

Enter the `FallbackProvider`, a powerful tool provided by the `ethers.js` library. This utility is designed to enhance the reliability and accuracy of blockchain data by aggregating responses from multiple providers and forming a consensus.
By leveraging redundancy and a consensus mechanism, the `FallbackProvider` mitigates the risks associated with relying on a single node, ensuring that the data you interact with is consistent and up-to-date. In this tutorial, we will dive into the inner workings of the `FallbackProvider`, exploring its configuration options, consensus mechanisms, and error-handling capabilities.

## The need for redundancy

Blockchain networks are designed to be decentralized and distributed, with multiple nodes contributing to the maintenance and validation of the ledger. Relying solely on a single node to retrieve blockchain data can be risky, as it introduces potential points of failure and inconsistencies.

One of the primary risks of depending on a single node is the possibility of temporary forks or network partitions. In such scenarios, different nodes may have divergent views of the blockchain's state, leading to inconsistent data being returned. Additionally, individual nodes may experience network latency, causing delays in propagating the latest blockchain data to other nodes. Nodes can occasionally become out of sync with the rest of the network, potentially providing outdated or incorrect information.

To mitigate these risks and ensure the reliability and consistency of blockchain data, it is crucial to embrace redundancy by using multiple nodes. By querying multiple nodes and aggregating their responses, the chances of encountering inconsistent or inaccurate data are significantly reduced.

Employing redundancy in blockchain data retrieval offers several benefits:

1. **Increased reliability**: With multiple nodes serving as data sources, the system becomes more resilient to individual node failures or temporary outages. If one node becomes unresponsive or returns erroneous data, the system can seamlessly fall back to other nodes, ensuring uninterrupted access to reliable blockchain data.
2. **Improved data accuracy**: By aggregating responses from multiple nodes, inconsistencies or temporary forks can be detected and resolved through a consensus mechanism. This mechanism ensures that the data retrieved is consistent with most nodes, reducing the likelihood of interacting with outdated or incorrect information.
3. **Load balancing**: Distributing queries across multiple nodes helps to balance the load and avoid overwhelming any single node with excessive requests. This load balancing can improve overall system performance and responsiveness.
4. **Fault tolerance**: Redundancy introduces fault tolerance into the system, as the failure or misconfiguration of a single node does not necessarily lead to complete system failure. The system can gracefully degrade and continue operating by leveraging the remaining functional nodes.

Learn how to build a simple load balancer in JavaScript by reading [Make your DApp more reliable with Chainstack](/docs/make-your-dapp-more-reliable-with-chainstack).

## The ethers `FallbackProvider`

The `ethers.js` library, a popular JavaScript library for interacting with Ethereum-based blockchains, provides the `FallbackProvider` tool. This utility is designed to enhance the reliability and consistency of blockchain data retrieval by leveraging redundancy and a consensus mechanism across multiple providers.

At its core, the `FallbackProvider` acts as a wrapper around a set of individual providers, such as Ethereum JSON-RPC providers. When querying for blockchain data, the `FallbackProvider` sends requests to multiple providers simultaneously and aggregates their responses. It then applies a configurable consensus mechanism to determine the most reliable and consistent result.

The `FallbackProvider` operates by distributing requests across multiple providers, each with its own priority, weight, and stall timeout settings.
These settings allow the `FallbackProvider` to prioritize and weight the responses from different providers based on their expected reliability and responsiveness.

When a request is made to the `FallbackProvider`, it sends the request to all configured providers concurrently. As responses start arriving, the `FallbackProvider` evaluates them against a pre-defined quorum value, which specifies the minimum number of providers that must agree on the same result for it to be considered a consensus.

If the quorum is met, meaning that the required number of providers return the same result, the `FallbackProvider` considers this the consensus result and returns it to the caller. However, if the quorum is unmet, the `FallbackProvider` employs a fallback mechanism to handle potential inconsistencies or failures. The fallback mechanism prioritizes providers based on their assigned weights and stall timeouts. If a provider fails to respond within its configured stall timeout, the `FallbackProvider` disregards its response and moves on to the next highest-priority provider. This process continues until the quorum is met or all providers have been exhausted.

By aggregating responses from multiple providers and applying a consensus mechanism, the `FallbackProvider` helps mitigate the risks of relying on a single node for blockchain data. It ensures that the data returned is consistent with most providers, reducing the likelihood of interacting with outdated, incorrect, or divergent information.

### Deploy a Chainstack node

Before diving into the implementation, ensure you have a Chainstack account. Deploying a node on Chainstack is essential for accessing and interacting with blockchain networks. Deploy three Chainstack nodes, or mix Chainstack and public nodes, for a good demonstration. Choose different geographical regions for each to maximize network uptime and reduce latency, which is crucial for a robust and efficient blockchain application.
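To make the quorum idea concrete, here is a toy model of the consensus check in plain JavaScript. It is an illustrative sketch only, not the actual `ethers.js` implementation, and the `findConsensus` helper name is hypothetical:

```javascript
// Toy model of quorum-based consensus: given the values returned by
// several providers, return the first value that at least `quorum`
// providers agree on, or null if no consensus is reached.
function findConsensus(responses, quorum) {
  const counts = new Map();
  for (const value of responses) {
    const key = JSON.stringify(value); // Stable key so objects compare by content
    const count = (counts.get(key) || 0) + 1;
    counts.set(key, count);
    if (count >= quorum) {
      return value; // Quorum met: treat this value as the consensus result
    }
  }
  return null; // Quorum not met across all responses
}
```

For example, with a quorum of 2, three providers reporting block numbers `[19876543, 19876543, 19876542]` agree on `19876543`, while fully divergent responses yield no consensus.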
## Project setup

To set up the JavaScript project and integrate the `FallbackProvider` from the `ethers.js` library, we'll need to follow these steps:

First, create a new directory for your project and initialize a new Node.js project by running `npm init` in your terminal. This will create a `package.json` file, which will manage your project's dependencies.

Check out [Web3 node.js: From zero to a full-fledged project](/docs/web3-nodejs-from-zero-to-a-full-fledged-project) to learn how to manage Node projects.

Next, install the required dependencies by running the following command:

```
npm install ethers dotenv
```

This will install the `ethers.js` library, which provides the `FallbackProvider` functionality, and the `dotenv` package, which allows us to load environment variables from a `.env` file.

After the installation is complete, create a new file named `.env` in the root directory of your project. This file will store the URLs of the JSON-RPC providers you want to use with the `FallbackProvider`. Add the following lines to the `.env` file, replacing the placeholders with the actual provider URLs:

```
RPC_1="YOUR_NODE_URL"
RPC_2="YOUR_NODE_URL"
RPC_3="YOUR_NODE_URL"
```

You can add as many providers as you need, but we'll use three endpoints for this example.
## The full code

Now that the project is set up, create a new file named `index.js` and paste the following code:

```javascript Javascript
const { ethers } = require('ethers');
require("dotenv").config();

const url1 = process.env.RPC_1;
const url2 = process.env.RPC_2;
const url3 = process.env.RPC_3;

const stallTimeout = 2000; // Example timeout
const quorum = 2; // Quorum needed for consensus

// Define JSON RPC Providers without the network object
const provider1 = new ethers.JsonRpcProvider(url1);
const provider2 = new ethers.JsonRpcProvider(url2);
const provider3 = new ethers.JsonRpcProvider(url3);

// Create a FallbackProvider instance with a specified quorum
const fallbackProvider = new ethers.FallbackProvider([
  {
    provider: provider1,
    priority: 2, // Will prioritize this provider
    weight: 3, // Assuming provider1 is the most reliable
    stallTimeout
  },
  {
    provider: provider2,
    priority: 1,
    weight: 2,
    stallTimeout: 1500 // Adjusted based on expected responsiveness
  },
  {
    provider: provider3,
    priority: 1,
    weight: 1,
    stallTimeout: 2500 // Adjusted for a provider that might be slower
  }
], quorum);

async function getBlockNumber() {
  try {
    const blockNumber = await fallbackProvider.getBlockNumber();
    console.log(`Latest block: ${blockNumber}`);
  } catch (error) {
    console.error("Error fetching block number. Error:", error.message);
    console.log("Attempting to restart the program...");
    // Optionally, implement a retry mechanism or other logic here
    // For example, wait for a few seconds before retrying
    setTimeout(getBlockNumber, 3000); // Retry after 3 seconds
  }
}

// Call getBlockNumber every 3 seconds
console.log('Fetching latest block from various providers...')
setInterval(getBlockNumber, 3000);
```

## Code breakdown

The code is designed to interact with blockchain networks through Ethereum's JSON RPC API using multiple providers for enhanced reliability and performance.
It uses the `ethers.js` library, a popular choice for interacting with the Ethereum blockchain and its ecosystems. Let's break down how this code works, focusing on its key components and functionalities:

### Setup and configuration

```jsx jsx
const { ethers } = require('ethers');
require("dotenv").config();
```

This part imports the required dependencies. The `ethers` object is imported from the `ethers.js` library, which provides the functionality for interacting with Ethereum-based blockchains, including the `FallbackProvider`. The `dotenv` package is loaded, which allows us to load environment variables from the `.env` file.

### RPC URLs

```jsx jsx
const url1 = process.env.RPC_1;
const url2 = process.env.RPC_2;
const url3 = process.env.RPC_3;
```

Here, we retrieve the URLs of the JSON-RPC providers from the environment variables stored in the `.env` file. These URLs will be used to create instances of the `JsonRpcProvider`.

### Configuration constants

```jsx jsx
const stallTimeout = 2000; // Example timeout
const quorum = 2; // Quorum needed for consensus
```

These lines define two constants: `stallTimeout` and `quorum`. `stallTimeout` is set to 2000 milliseconds (2 seconds), which determines the maximum time the `FallbackProvider` will wait for a response from a provider before considering it unresponsive. `quorum` is set to 2, specifying that at least two providers must return the same result to be considered a consensus.

### JSON RPC providers

```jsx jsx
// Define JSON RPC Providers without the network object
const provider1 = new ethers.JsonRpcProvider(url1);
const provider2 = new ethers.JsonRpcProvider(url2);
const provider3 = new ethers.JsonRpcProvider(url3);
```

In this section, we create instances of the `ethers.JsonRpcProvider` using the URLs retrieved from the environment variables. These providers will be used as the underlying data sources for the `FallbackProvider`.
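Each of these providers carries a `priority` and a `weight` in the configuration that follows. As a rough mental model of how higher-priority, higher-weight providers are preferred (an illustration of the behavior this article describes, not `ethers.js` internals; the `orderProviders` helper is hypothetical):

```javascript
// Toy model: order provider configs by priority (higher first),
// breaking ties by weight (higher first).
function orderProviders(configs) {
  return [...configs].sort(
    (a, b) => b.priority - a.priority || b.weight - a.weight
  );
}
```

With the configuration used in this article, `provider1` (priority 2, weight 3) would come first, followed by `provider2` (weight 2) and then `provider3` (weight 1).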
### Fallback provider

```jsx jsx
// Create a FallbackProvider instance with a specified quorum
const fallbackProvider = new ethers.FallbackProvider([
  {
    provider: provider1,
    priority: 2,
    weight: 3, // Assuming provider1 is the most reliable
    stallTimeout
  },
  {
    provider: provider2,
    priority: 1,
    weight: 2,
    stallTimeout: 1500 // Adjusted based on expected responsiveness
  },
  {
    provider: provider3,
    priority: 1,
    weight: 1,
    stallTimeout: 2500 // Adjusted for a provider that might be slower
  }
], quorum);
```

Here, we create an instance of the `ethers.FallbackProvider` by passing an array of provider configurations and the desired `quorum` value. Each provider configuration includes the following properties:

* `provider`: The instance of the `JsonRpcProvider` to be used.
* `priority`: A numeric value representing the provider's priority. Higher values indicate higher priority.
* `weight`: A numeric value representing the weight or reliability of the provider. Higher values indicate higher reliability.
* `stallTimeout`: The maximum time (in milliseconds) to wait for a response from the provider before considering it unresponsive.

In this example, `provider1` is given the highest priority (2) and weight (3), assuming it is the most reliable provider. `provider2` and `provider3` have lower priorities (1) and weights (2 and 1, respectively), with adjusted `stallTimeout` values based on their expected responsiveness.

### `getBlockNumber` function

```jsx jsx
async function getBlockNumber() {
  try {
    const blockNumber = await fallbackProvider.getBlockNumber();
    console.log(`Latest block: ${blockNumber}`);
  } catch (error) {
    console.error("Error fetching block number. Error:", error.message);
    console.log("Attempting to restart the program...");
    // Optionally, implement a retry mechanism or other logic here
    // For example, wait for a few seconds before retrying
    setTimeout(getBlockNumber, 3000); // Retry after 3 seconds
  }
}
```

The `getBlockNumber` function is an asynchronous function that fetches the latest block number from the `fallbackProvider`. Inside the `try` block, it calls `fallbackProvider.getBlockNumber()` and awaits the result. The console logs the latest block number if the block number is fetched successfully.

If an error occurs, it catches the error and logs the error message. It also logs a message indicating that it's attempting to restart the program and includes a comment suggesting that a retry mechanism or other logic could be implemented here. This example uses `setTimeout` to call `getBlockNumber` again after a 3-second delay.

### Periodic execution

```jsx jsx
// Call getBlockNumber every 3 seconds
console.log('Fetching latest block from various providers...')
setInterval(getBlockNumber, 3000);
```

Finally, this part logs a message to the console indicating it fetches the latest block from various providers. It then sets an interval using `setInterval` to call the `getBlockNumber` function every 3 seconds, continuously fetching and logging the latest block number.

By combining the `FallbackProvider` with multiple JSON-RPC providers and configuring their priorities, weights, and stall timeouts, this code demonstrates how to enhance the reliability and consistency of blockchain data retrieval. The `FallbackProvider` will aggregate responses from the configured providers, apply the consensus mechanism based on the specified quorum, and handle failures or timeouts by falling back to other providers.

### Error handling and retries

The error handling within `getBlockNumber` uses a `try-catch` block to catch any exceptions.
Suppose an error occurs within the `FallbackProvider`, meaning there is a disagreement in the consensus or the providers with higher priority and weights fail. In that case, it logs the message and attempts to restart the function after a 3-second delay, demonstrating a simple retry mechanism.

## Understanding the `FallbackProvider` configuration

The `FallbackProvider` instance is created by passing an array of provider configurations and the desired quorum value to the `ethers.FallbackProvider` constructor. This array allows you to specify multiple providers and configure their behavior within the `FallbackProvider`.

Each provider configuration in the array is an object with the following properties:

1. `provider`: This is an instance of the `JsonRpcProvider` you want to include in the `FallbackProvider`. In this example, `provider1`, `provider2`, and `provider3` are instances created earlier using the provider URLs from the environment variables.
2. `priority`: This numeric value represents the provider's priority. Higher values indicate a higher priority. When the `FallbackProvider` needs to select a provider for a request, it will prioritize providers with higher priority values. In the example, `provider1` has the highest priority of 2, while `provider2` and `provider3` have a lower priority of 1.
3. `weight`: This numeric value represents the provider's weight or reliability. Higher values indicate a higher level of reliability. The `FallbackProvider` uses these weights when determining the consensus result. In the example, `provider1` has the highest weight of 3, indicating that it is considered the most reliable provider, while `provider2` has a weight of 2 and `provider3` a weight of 1.
4. `stallTimeout`: This value specifies the maximum time (in milliseconds) that the `FallbackProvider` will wait for a response from the provider before considering it unresponsive or "stalled."
If the provider doesn't respond within this time, the `FallbackProvider` will disregard its response and move on to the next provider. In the example, `provider1` uses the default `stallTimeout` value of 2000 milliseconds (2 seconds), `provider2` has a shorter `stallTimeout` of 1500 milliseconds (1.5 seconds), and `provider3` has a longer `stallTimeout` of 2500 milliseconds (2.5 seconds).

By configuring these properties for each provider, you can fine-tune the behavior of the `FallbackProvider` based on your specific requirements and your providers' expected reliability and responsiveness.

The `quorum` parameter passed to the `FallbackProvider` constructor specifies the minimum number of providers that must agree on the same result for it to be considered a consensus. In this example, the `quorum` is set to 2, meaning that at least two providers must return the same result for the `FallbackProvider` to consider it a valid consensus.

Users can customize the configuration of the `FallbackProvider` by adjusting the properties of the provider objects in the array and the `quorum` value. For instance, if you have a provider that is known to be highly reliable, you can assign it a higher priority and weight. If you expect a provider to respond slower, you can increase its `stallTimeout` value accordingly. Additionally, you can adjust the `quorum` value based on the level of consensus you require for your application.

## Conclusion

In this tutorial, we explored the `FallbackProvider` from the `ethers.js` library, a powerful tool designed to enhance the reliability and consistency of blockchain data retrieval. We learned the importance of redundancy when interacting with blockchain networks and how relying solely on a single node can introduce risks of inconsistent or inaccurate data due to factors like network latency, temporary forks, or out-of-sync nodes.
The `FallbackProvider` addresses these challenges by leveraging multiple JSON-RPC providers and employing a consensus mechanism. By aggregating responses from multiple providers, prioritizing them based on their expected reliability, and applying a configurable quorum, the `FallbackProvider` ensures that the data retrieved is consistent with most providers, mitigating the risks associated with relying on a single source.

We walked through the setup process, including installing dependencies, configuring environment variables, and creating instances of the `JsonRpcProvider` and `FallbackProvider`. We also explored the code implementation, breaking down each component and explaining the configuration options such as provider priority, weight, and stall timeout.

By embracing redundancy and consensus mechanisms like the `FallbackProvider`, developers can build more robust and fault-tolerant applications that interact with blockchain networks, ensuring reliable and accurate data retrieval, even in the face of potential inconsistencies or failures.

# Solana: Enhancing SPL Token Transfers with Retry Logic

Source: https://docs.chainstack.com/docs/enhancing-solana-spl-token-transfers-with-retry-logic

* This guide adds simple retry logic to the SPL token transfer process on Solana, building on the [previous code](/docs/transferring-spl-tokens-on-solana-typescript).
* We wrap transaction submissions in a loop, retrying up to a set limit (`MAX_RETRY_FUNCTION`) with a short delay to handle transient failures (e.g., `TransactionExpiredBlockheightExceededError`).
* We still include priority fees to bump transaction priority, and we can also configure more robust backoff or advanced retry patterns (exponential backoff, circuit breakers, etc.) if needed.
* This approach significantly improves reliability, making the code more tolerant of temporary network congestion or node issues.
## Main article

The [previous article](/docs/transferring-spl-tokens-on-solana-typescript) explored transferring SPL tokens on the Solana blockchain using TypeScript. While the provided code successfully demonstrated the token transfer process, it lacked retry logic, a crucial aspect for handling potential failures and retrying failed transactions in blockchain applications.

Retry logic helps mitigate the impact of transient network issues or node overload by automatically retrying failed transactions multiple times. This approach increases the likelihood of successful execution, ensuring better reliability and improving the overall user experience.

This guide will extend the existing codebase by adding a simple retry logic to the token transfer process. We will discuss why retries are needed, explore different error scenarios, and implement a straightforward retry mechanism. Incorporating retry logic will make our application more resilient to temporary network problems, laying the foundation for further improvements in robustness and performance.

Read [Transferring SPL tokens on Solana: A step-by-step TypeScript tutorial](/docs/transferring-spl-tokens-on-solana-typescript) to find the full code base and learn how it works.

## Understanding the need for retry logic

In blockchain applications, transactions can sometimes fail for various reasons, including network congestion, node overload, or other transient issues. One specific error that can occur when working with the Solana blockchain is the `TransactionExpiredBlockheightExceededError`. This error occurs when a transaction is repeatedly forwarded to subsequent block leaders without being included in any block until the associated blockhash, or recent blockhash, expires.
The blockhash is a critical component of a transaction on Solana, acting as a reference to a recent block to ensure the transaction is processed promptly and to prevent double-spending. While it is currently impossible to completely eliminate this issue due to the non-deterministic nature of the current Solana mainnet scheduler, implementing retry logic can significantly mitigate the impact of such failures and improve the overall reliability of your blockchain application.

Retrying failed transactions is crucial in blockchain applications for several reasons:

1. **Network resilience**: Blockchain networks can experience temporary disruptions, congestion, or node failures. Retrying transactions after a failure increases the chances of successful execution, ensuring that your application remains functional despite transient network issues.
2. **User experience**: In user-facing applications, failed transactions can lead to frustration and a poor user experience. By automatically retrying failed transactions, you can provide a seamless experience for your users, minimizing the need for manual intervention or retries.
3. **Data consistency**: In applications that involve critical data or financial transactions, failed transactions can result in data inconsistencies or financial losses. Retry logic helps ensure that transactions are eventually executed, maintaining data integrity and preventing potential losses.
4. **Fault tolerance**: Implementing retry logic is fundamental to building robust, fault-tolerant applications. By anticipating and handling failures gracefully, your application becomes more resilient and can recover from unexpected situations.

By understanding the importance of retry logic and the potential issues that can arise in blockchain applications, you can take proactive steps to enhance the reliability and robustness of your Solana-based applications, providing a better user experience and ensuring the integrity of your data and transactions.
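Before wiring retries into the transfer code, it helps to see the pattern in isolation. The following is a minimal, generic retry wrapper with exponential backoff in plain JavaScript; the `withRetries` name and its parameters are illustrative and not part of the tutorial's code base:

```javascript
// Generic retry wrapper: run `operation`, retrying on failure with an
// exponentially growing delay (baseDelayMs, 2x baseDelayMs, 4x, ...).
async function withRetries(operation, maxRetries = 5, baseDelayMs = 500) {
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      return await operation(); // Success: return the result immediately
    } catch (error) {
      if (attempt === maxRetries) {
        throw error; // Out of attempts: surface the last error
      }
      const delayMs = baseDelayMs * 2 ** (attempt - 1);
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

The tutorial's own implementation uses a simpler fixed 2-second delay; a wrapper like this is one way to add the exponential backoff it mentions as a more advanced strategy.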
## Implementing the retry logic

This section will focus on implementing simple retry logic for the token transfer process. The approach involves wrapping the transaction send logic in a loop with a maximum number of retry attempts. We'll handle different error types, introduce a delay between retries, and log the retry attempts for better visibility. Find the original code base here: [Transferring SPL tokens on Solana: A step-by-step TypeScript tutorial](/docs/transferring-spl-tokens-on-solana-typescript#transferring-spl-tokens-code-walkthrough).

### Overview of the approach

* Use a `for` loop to control the number of retry attempts
* Set a maximum retry count using an environment variable (`MAX_RETRY_FUNCTION`)
* Catch and handle errors within the loop
* Implement a delay or backoff strategy between retries
* Log or report retry attempts and errors

Before starting, add two environment variables named `MAX_RETRY_FUNCTION` and `MAX_RETRY_WEB3JS` to your `.env` file and set the maximum number of retries.

```
MAX_RETRY_WEB3JS=10 # Max retries for the Web3.js instance
MAX_RETRY_FUNCTION=5 # Max retries of the Retry function logic
```

The `MAX_RETRY_WEB3JS` variable controls the maximum number of retries performed by the Web3.js library when sending a transaction, while `MAX_RETRY_FUNCTION` controls the maximum number of retries for the custom retry logic implemented in our code.

## Code snippets and explanation

Let's walk through the code, snippet by snippet, and explain what each part does. This SPL transfer implementation includes priority fees; edit the micro lamports you want to add in this line:

```typescript TypeScript
const PRIORITY_RATE = 12345; // MICRO_LAMPORTS
```

### Wrapping the transaction send logic in a retry loop

```typescript TypeScript
const retryCount = Number(process.env.MAX_RETRY_FUNCTION); // Default retry count set to 5

for (let attempt = 1; attempt <= retryCount; attempt++) {
  try {
    // Transaction send logic goes here
    ...
return;// Exit the function on a successful transaction } catch (error) { // Handle errors and retry logic ... } } ``` In this snippet, we first retrieve the maximum retry count from the `MAX_RETRY_FUNCTION` environment variable. Then, we use a `for` loop to control the number of retry attempts. If the transaction is successful, we exit the function using the `return` statement. If an error occurs, we handle it in the `catch` block. ### Handling different error types ```typescript TypeScript catch (error) { console.error(`Attempt ${attempt} failed with error: ${error}`); if (attempt === retryCount) { // Last attempt failed, throw the error throw new Error(`Transaction failed after ${retryCount} attempts.`); } // Additional error handling or logging can be added here ... } ``` We log the current retry attempt and the error message in the catch block. If it's the last attempt (`attempt === retryCount`), we throw the error, effectively terminating the retry loop. Depending on your specific requirements, you can add error handling or logging logic here. ### Implementing a delay or backoff strategy between retries ```typescript TypeScript // Wait for 2 seconds before retrying await new Promise((resolve) => setTimeout(resolve, 2000)); ``` Introduce a delay or backoff strategy between retry attempts to avoid overwhelming the network or the Solana node with rapid retries. In this example, we use a simple 2-second delay (`setTimeout`) wrapped in a `Promise` to pause execution before the next retry. Based on your application's needs, you can adjust the delay duration or implement more advanced backoff strategies, such as exponential backoff. ### Logging or reporting retry attempts ```typescript TypeScript console.log(`Attempt ${attempt}: Starting Token Transfer Process`); ... 
console.error(`Attempt ${attempt} failed with error: ${error}`);
```

To provide better visibility and debugging capabilities, we log the current retry attempt at the beginning of each iteration and log the error message with the attempt number in case of failure. By incorporating these code snippets and explanations, you can implement simple retry logic for your token transfer process on the Solana blockchain. This retry logic will help improve your application's reliability and resilience by automatically retrying failed transactions up to a specified maximum number of attempts, with a delay between each retry to avoid overwhelming the network.

## Customizing the retry logic

While the implemented retry logic provides a simple and effective mechanism for handling transient failures, several potential enhancements can further augment the reliability and performance of your Solana blockchain application. These enhancements warrant consideration and evaluation based on your application's requirements and constraints.

### Adaptive delay or backoff strategy

The current implementation employs a fixed 2-second delay between retry attempts. While this approach is suitable for various scenarios, it may be advantageous to consider dynamically adjusting the delay based on network conditions, the number of retry attempts, or the specific error encountered. One widely adopted strategy is exponential backoff, in which the delay between retries increases exponentially with each failed attempt. This approach can reduce the load on the network during periods of high congestion and provide the network with time to recuperate. Alternatively, adaptive delays could be incorporated based on real-time network metrics, such as the current transaction confirmation time or the number of pending transactions in the mempool (for EVM chains). By monitoring these metrics, you can adjust the delay accordingly, balancing retry frequency and network load.
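To make the exponential backoff idea concrete, here is a minimal sketch. It is written in Python for brevity (the tutorial's code is TypeScript), and the `base_delay`, `max_delay`, and 10% jitter values are illustrative choices rather than anything from the original code:

```python
import random

def backoff_delay(attempt: int, base_delay: float = 2.0, max_delay: float = 30.0) -> float:
    """Exponential backoff: the delay doubles with each attempt, capped at max_delay.
    A random jitter of up to 10% is added to avoid synchronized retries."""
    delay = min(base_delay * (2 ** (attempt - 1)), max_delay)
    return delay + random.uniform(0, delay * 0.1)

# Base schedule for attempts 1..5 with a 2 s base: 2, 4, 8, 16, 30 (capped), plus jitter
for attempt in range(1, 6):
    print(f"attempt {attempt}: wait ~{backoff_delay(attempt):.1f}s")
```

The same schedule translates directly to the retry loop from this guide: instead of a fixed `setTimeout(resolve, 2000)`, the delay would be computed from the attempt number.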
### Advanced retry patterns The current implementation uses a simple retry pattern, where all failed transactions are retried up to a maximum number of attempts. However, more advanced retry patterns may be considered, contingent upon the application's requirements and the nature of the errors encountered. One such pattern is the Circuit Breaker pattern, which introduces a temporary pause in retries if a certain threshold of consecutive failures is reached. This can be advantageous in scenarios where immediately retrying after multiple failures is unlikely to succeed, allowing the network or the application to recover before attempting further retries. Another pattern is the Bulkhead pattern, which limits the number of concurrent retries to prevent overwhelming the system or network resources. This can be particularly beneficial in applications with a high volume of transactions or when dealing with resource-intensive operations. ## The full code Here, you can find the full implementation of SPL Token transfer plus retry logic. To learn how to set up and run the project, check out [Transferring SPL tokens on Solana: A step-by-step TypeScript tutorial](/docs/transferring-spl-tokens-on-solana-typescript). ```typescript TypeScript import { getOrCreateAssociatedTokenAccount, createTransferInstruction, } from "@solana/spl-token"; import { Connection, PublicKey, TransactionMessage, VersionedTransaction, Keypair, ParsedAccountData, ComputeBudgetProgram } from "@solana/web3.js"; import bs58 from "bs58"; import "dotenv/config"; // Fetches the number of decimals for a given token to accurately handle token amounts. 
async function getNumberDecimals(
  mintAddress: PublicKey,
  connection: Connection
): Promise<number> {
  const info = await connection.getParsedAccountInfo(mintAddress);
  const decimals = (info.value?.data as ParsedAccountData).parsed.info
    .decimals as number;
  console.log(`Token Decimals: ${decimals}`);
  return decimals;
}

// Initializes a Keypair from the secret key stored in environment variables. Essential for signing transactions.
function initializeKeypair(): Keypair {
  const privateKey = new Uint8Array(bs58.decode(process.env.PRIVATE_KEY!));
  const keypair = Keypair.fromSecretKey(privateKey);
  console.log(
    `Initialized Keypair: Public Key - ${keypair.publicKey.toString()}`
  );
  return keypair;
}

// Sets up the connection to the Solana cluster, utilizing environment variables for configuration.
function initializeConnection(): Connection {
  const rpcUrl = process.env.SOLANA_RPC!;
  const connection = new Connection(rpcUrl, {
    commitment: "confirmed",
    wsEndpoint: process.env.SOLANA_WSS,
  });
  // Redacting part of the RPC URL for security/log clarity
  console.log(`Initialized Connection to Solana RPC: ${rpcUrl.slice(0, -32)}`);
  return connection;
}

async function main() {
  const retryCount = Number(process.env.MAX_RETRY_FUNCTION); // Default retry count set to 5
  for (let attempt = 1; attempt <= retryCount; attempt++) {
    try {
      console.log(`Attempt ${attempt}: Starting Token Transfer Process`);
      const connection = initializeConnection();
      const fromKeypair = initializeKeypair();
      const destinationWallet = new PublicKey(
        "CzNGm14nMopjGYyycMbWqEF2e1aEHcJLKk2CHw9BiZwC"
      );
      const mintAddress = new PublicKey(
        "EPjFWdd5AufqSSqeM2qN1xzybapC8G4wEGGkZwyTDt1v"
      );
      // Config priority fee and amount to transfer
      const PRIORITY_RATE = 12345; // MICRO_LAMPORTS
      const transferAmount = 0.01; // This will need to be adjusted based on the token's decimals
      // Instruction to set the compute unit price for priority fee
      const PRIORITY_FEE_INSTRUCTIONS = ComputeBudgetProgram.setComputeUnitPrice({microLamports:
PRIORITY_RATE});
      console.log("----------------------------------------");
      const decimals = await getNumberDecimals(mintAddress, connection);
      let sourceAccount = await getOrCreateAssociatedTokenAccount(
        connection,
        fromKeypair,
        mintAddress,
        fromKeypair.publicKey
      );
      console.log(`Source Account: ${sourceAccount.address.toString()}`);
      let destinationAccount = await getOrCreateAssociatedTokenAccount(
        connection,
        fromKeypair,
        mintAddress,
        destinationWallet
      );
      console.log(
        `Destination Account: ${destinationAccount.address.toString()}`
      );
      console.log("----------------------------------------");
      const transferAmountInDecimals = transferAmount * Math.pow(10, decimals);
      const transferInstruction = createTransferInstruction(
        sourceAccount.address,
        destinationAccount.address,
        fromKeypair.publicKey,
        transferAmountInDecimals
      );
      let latestBlockhash = await connection.getLatestBlockhash("confirmed");
      const messageV0 = new TransactionMessage({
        payerKey: fromKeypair.publicKey,
        recentBlockhash: latestBlockhash.blockhash,
        instructions: [PRIORITY_FEE_INSTRUCTIONS, transferInstruction],
      }).compileToV0Message();
      const versionedTransaction = new VersionedTransaction(messageV0);
      versionedTransaction.sign([fromKeypair]);
      const txid = await connection.sendTransaction(versionedTransaction, {
        skipPreflight: false,
        maxRetries: Number(process.env.MAX_RETRY_WEB3JS),
        preflightCommitment: "confirmed",
      });
      console.log(`Transaction Submitted: ${txid}`);
      const confirmation = await connection.confirmTransaction(
        {
          signature: txid,
          blockhash: latestBlockhash.blockhash,
          lastValidBlockHeight: latestBlockhash.lastValidBlockHeight,
        },
        "confirmed"
      );
      if (confirmation.value.err) {
        throw new Error("🚨Transaction not confirmed.");
      }
      console.log(
        `Transaction Successfully Confirmed!
🎉 View on SolScan: https://solscan.io/tx/${txid}`
      );
      return; // Success, exit the function
    } catch (error) {
      console.error(`Attempt ${attempt} failed with error: ${error}`);
      if (attempt === retryCount) {
        // Last attempt failed, throw the error
        throw new Error(`Transaction failed after ${retryCount} attempts.`);
      }
      // Wait for 2 seconds before retrying
      await new Promise((resolve) => setTimeout(resolve, 2000));
    }
  }
}

main();
```

## Conclusion

In this article, we extended the functionality of our Solana SPL token transfer application by implementing simple retry logic. By adding this retry mechanism, we significantly improve our application's reliability and resilience. Failed transactions are automatically retried multiple times, mitigating the impact of transient network issues or node overload. We started by understanding the importance of retry logic in blockchain applications, particularly in the context of Solana and its `TransactionExpiredBlockheightExceededError`. We then implemented the retry logic by wrapping the transaction send logic in a loop with a maximum number of retry attempts, handling different error types, introducing a delay or backoff strategy between retries, and logging retry attempts for better visibility.

Director of Developer Experience @ Chainstack

Talk to me all things Web3

20 years in technology | 8+ years in Web3 full time

Trusted advisor helping developers navigate the complexities of blockchain infrastructure

[](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake)

# Ethereum Dencun: Rundown with Examples

Source: https://docs.chainstack.com/docs/ethereum-dencun-rundown-with-examples

**TLDR:**

* Dencun brings several EIPs that streamline both execution and consensus layers, especially reducing costs for rollups (EIP-4844) and introducing transient storage for more efficient contract logic (EIP-1153).
* Block data can now reference the beacon chain (EIP-4788) and handle data blobs separately for \~18 days (EIP-4844), which cuts down on rollup fees.
* Consensus updates (EIPs 7044, 7045, 7514) simplify validator exits, extend attestation windows, and limit validator churn.
* Other highlights include new EVM opcodes for memory copying (EIP-5656), restricting SELFDESTRUCT usage (EIP-6780), and offering a `BLOBBASEFEE` opcode (EIP-7516).

## Main article

The Dencun upgrade activates on the mainnet at [epoch 269568](https://beaconcha.in/epoch/269568), so here's a quick rundown on all the EIPs included in the hardfork. The EIPs here follow the same order as they are in the meta [EIP-7569: Hardfork Meta - Dencun](https://eips.ethereum.org/EIPS/eip-7569) — an EIP that lists all the EIPs of the hardfork & other hardfork details.

## EIP-1153: Transient storage opcodes

Execution layer. [EIP-1153 link](https://eips.ethereum.org/EIPS/eip-1153). This EIP adds a new storage type: transient storage. With the EIP-1153 activation on the network, there are now three types of storage (previously two):

* temporary memory storage — byte-level storage that persists data during a function execution and is then discarded
* transient storage — data persists within one transaction (across all function executions) and is then discarded
* permanent storage — data persists across transactions & blocks

The transient storage opcodes are `TSTORE` and `TLOAD`. Having no transient storage contributed to extra gas consumption quirks in Ethereum; the best-known one being users paying extra for interacting with contracts implementing OpenZeppelin's Reentrancy Guard. In practice, you will pay less on the execution level when interacting with contracts that have functions that add no state change, such as, again, Reentrancy Guard. Check out [this nice deep dive into EIP-1153](https://medium.com/@organmo/demystifying-eip-1153-transient-storage-faeabbadd0d).
## EIP-4788: Beacon block root in the EVM Execution layer. [EIP-4788 link](https://eips.ethereum.org/EIPS/eip-4788). EVM can now see the parent consensus layer (beacon chain) block root. Practically, running an [eth\_getBlockByNumber | Ethereum](/reference/ethereum_node_api/blocks_info/eth_getBlockByNumber) after Dencun adds `parentBeaconBlockRoot` in the block details: ```javascript Javascript "baseFeePerGas": "0x17357a9e30", "blobGasUsed": "0x0", "difficulty": "0x0", "excessBlobGas": "0x4c80000", "extraData": "0x34353131353565", "gasLimit": "0x1c9c380", "gasUsed": "0x9ee9e4", "hash": "0x632c2cf1d7d980760507f679d5bd3f07f1e767f63e28b193cbb2efbc86c95e53", "logsBloom": "0x42014...", "miner": "0x0c06b6d4ec451987e8c0b772ffcf7f080c46447a", "mixHash": "0xab9d05372e79c60cf348746f5772c9b28aad1164a1736611584b97080dca4999", "nonce": "0x0000000000000000", "number": "0x538e0e", "parentBeaconBlockRoot": "0x877cfcbcc3624e070c89287db70841834b9284dc135d674921e11a4c8fc68784", // <-- HERE "parentHash": "0x54ed6b5344a035b71cce3bd96ad84283fd2b56f63c76f8d3aab8e52974e7c6ba", "receiptsRoot": "0x3378df8d38bead81edb3f081f1f63a77a32f6e1f872c275ed8907aef59d0ba02", "sha3Uncles": "0x1dcc4de8dec75d7aab85b567b6ccd41ad312451b948a7413f0a142fd40d49347", "size": "0xb7fd", "stateRoot": "0xfece204505a4cd70af610900fb5d116c291407c55d24d066177738edcf4c6431", "timestamp": "0x65f15304", "totalDifficulty": "0x3c656d23029ab0" ``` ## EIP-4844: Shard Blob Transactions Consensus layer. Execution layer. [EIP-4844 link](https://eips.ethereum.org/EIPS/eip-4844). This is your favorite one that's been covered multiple times by everyone, so here's a TLDR and a practical example. **TLDR** EIP-4844 makes rollup transactions cheaper because the EVM originally wasn't designed to execute the rollup batches and store them on the execution layer as part of a smart contract state. 
This is where the majority of your rollup transaction fees used to go — to execute & store the data on the execution layer as part of the contract state. So EIP-4844 removes this "quirk" and implements a sane version — rollup batches can now be submitted as a special transaction type, as pure data not executable by the EVM, and the data is stored on the consensus layer for roughly 18 days. These are called data blobs, or "sidecars", attached to blocks. This makes submitting & keeping the rollup data on Ethereum much cheaper and, consequently, the rollup transaction fees cheaper.

Let's now have a practical walkthrough and get the data (on Sepolia, where the Dencun upgrade is already live).

**Walkthrough with Arbitrum Sepolia**

Check out the batches of transactions rolled up that Arbitrum is sending to Ethereum: [Rollup batches](https://sepolia.arbiscan.io/batches). Pick a batch. For example, [batch 90169](https://sepolia.arbiscan.io/batch/90169?isnitro=true). See that the batch's L1 block number is [5475939](https://sepolia.etherscan.io/block/5475939). Now let's do an [eth\_getBlockByNumber | Ethereum](/reference/ethereum_node_api/blocks_info/eth_getBlockByNumber) on block `5475939` (hex `0x538e63`, since the JSON-RPC API expects hex quantities) to get the block's `parentBeaconBlockRoot` as discussed previously:

```shell Shell
curl --request POST \
  --url https://ethereum-sepolia.core.chainstack.com/1fc2ff7e068591f6b44db1a454232d3d \
  --header 'accept: application/json' \
  --header 'content-type: application/json' \
  --data '
{
  "id": 1,
  "jsonrpc": "2.0",
  "method": "eth_getBlockByNumber",
  "params": [
    "0x538e63",
    false
  ]
}
' | jq -r '.result.parentBeaconBlockRoot'
```

Use the `parentBeaconBlockRoot` to identify that it's [slot 4548475](https://sepolia.beaconcha.in/slot/0xbb7ef08755c4f394fd2e2cb325ac2228aa933d8a581e8e19e26f31f610ab118b) on the consensus layer. Since it's the parent, we need the next slot: [slot 4548476](https://sepolia.beaconcha.in/slot/4548476).
Use the slot number in a [Retrieve blob sidecar](/reference/ethereum_beacon_chain_api/state/getBlobSidecarByRoot) call to the consensus layer:

```shell Shell
curl -X 'GET' \
  'https://ethereum-sepolia.core.chainstack.com/beacon/1fc2ff7e068591f6b44db1a454232d3d/eth/v1/beacon/blob_sidecars/4548476' \
  -H 'accept: application/json'
```

The response is the rolled-up batch 90169 of Arbitrum Sepolia transactions, referenced in block 5475939 as a blob. Note that the blob lives for about 18 days, so make sure you use your own values. For a detailed walkthrough and explanation, see [Blob transactions the hard way](/docs/blob-transactions-the-hard-way).

## EIP-5656: MCOPY - Memory copying instruction

Execution layer. [EIP-5656 link](https://eips.ethereum.org/EIPS/eip-5656). Basically, a new EVM operation that's also efficient & saves gas costs: where the EVM used to do `MSTORE` & `MLOAD`, the same thing can now be done with `MCOPY`. This depends on the smart contract developer's implementation, however.

## EIP-6780: SELFDESTRUCT only in same transaction

Execution layer. [EIP-6780 link](https://eips.ethereum.org/EIPS/eip-6780). `SELFDESTRUCT` was originally introduced as an understandable house-cleaning opcode for developers — to remove no-longer-needed code from the Ethereum state. In practice, it made metamorphic contracts possible through the use of `CREATE2` & `SELFDESTRUCT`: you can deploy a contract, destroy the code, and then deploy a different contract to the same address. [Example implementation](https://ethereum-blockchain-developer.com/110-upgrade-smart-contracts/12-metamorphosis-create2/). [Vitalik's thoughts on the issue](https://hackmd.io/@vbuterin/selfdestruct).
This EIP makes the use of `SELFDESTRUCT` much more limited:

* it can either be called with the full set of instructions (recovering funds to the target and deleting contract code) within the contract creation transaction
* OR it's limited to only recovering the funds and NOT deleting the contract code

## EIP-7044: Perpetually Valid Signed Voluntary Exits

Consensus layer. [EIP-7044 link](https://eips.ethereum.org/EIPS/eip-7044). The EIP simplifies the process of exiting from staking positions for users doing non-custodial delegated staking. Before EIP-7044, the exit process required a voluntary exit message signed by the validator's signing key, which is typically controlled by the validator operator, leaving users in non-custodial delegated staking arrangements dependent on the operator. Pre-signed voluntary exits were valid only for the current and previous consensus layer fork versions, creating a dependency on the validator operator to process exit requests in a timely manner. After EIP-7044 activation, signed voluntary exit messages become perpetually valid, eliminating the need for them to be re-signed after future consensus layer upgrades. This change removes the uncertainty and reliance on validator operators for executing exit requests. Stakers now have more autonomy and assurance that they can exit their positions and access their funds at any time, without worrying about the validity of their pre-signed exit messages being affected by future network upgrades.

## EIP-7045: Increase Max Attestation Inclusion Slot

Consensus layer. [EIP-7045 link](https://eips.ethereum.org/EIPS/eip-7045). This EIP extends the timeframe within which validator attestations can be included in a block on the consensus layer. Previously, attestations had to be included within the window of 1 epoch (approximately 6.4 minutes, given that an epoch consists of 32 slots and each slot is 12 seconds). This was considered too narrow a window. EIP-7045 expands the window from 1 epoch to 2 epochs.
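The arithmetic behind those windows is easy to verify with the mainnet constants (12-second slots, 32 slots per epoch):

```python
SECONDS_PER_SLOT = 12
SLOTS_PER_EPOCH = 32

# Pre-EIP-7045 inclusion window: 1 epoch
epoch_seconds = SECONDS_PER_SLOT * SLOTS_PER_EPOCH
print(f"1 epoch  = {epoch_seconds} s (~{epoch_seconds / 60:.1f} min)")

# Post-EIP-7045 inclusion window: 2 epochs
print(f"2 epochs = {2 * epoch_seconds} s (~{2 * epoch_seconds / 60:.1f} min)")
```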
## EIP-7514: Add Max Epoch Churn Limit

Consensus layer. [EIP-7514 link](https://eips.ethereum.org/EIPS/eip-7514). EIP-7514 adds a max epoch churn limit to regulate the number of validators that can join or leave the network within a given epoch. It caps the churn limit at 8 validators per epoch to prevent state-size bloat and potential strain on network resources as the number of active validators increases. Before this EIP, the churn limit was variable with a minimum value of four and increased as more validators joined; with EIP-7514, it is now capped, preventing the active validator set from growing too rapidly.

## EIP-7516: BLOBBASEFEE opcode

Execution layer. [EIP-7516 link](https://eips.ethereum.org/EIPS/eip-7516). Tightly related to EIP-4844 (see above), this is a simple opcode: `BLOBBASEFEE` allows contracts to manage data costs dynamically, especially for rollups. `BLOBBASEFEE` only costs 2 gas to execute and provides the value of the blob base-fee directly from the block header. Track the [eth\_blobBaseFee](https://github.com/ethereum/go-ethereum/pull/29140) implementation in Go Ethereum.

### About the author

Director of Developer Experience @ Chainstack

Talk to me all things Web3

20 years in technology | 8+ years in Web3 full time

Trusted advisor helping developers navigate the complexities of blockchain infrastructure

[](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake)

# Ethereum: How to analyze pending blocks

Source: https://docs.chainstack.com/docs/ethereum-how-to-analyze-pending-blocks

TLDR:

* **What is a pending block?** It’s the block a node (whether proposer or not) believes will be next in line for inclusion on the chain. A proposer’s pending block is built from their local mempool and is what they’ll broadcast.
A non-proposer’s pending block, however, is only a “best guess” since the proposer might have a different mempool or prioritization strategy. * **How does Geth create it?** Up to version 1.14.0, Geth continuously managed a pending block in the background, updating it as new transactions arrived. Now, Geth builds a pending block on demand, caches it briefly (2 seconds by default), then discards or updates it when a new block is published or if a reorg happens. * **Why do transaction sets differ?** Nodes prioritize local transactions first (if they meet gas requirements) and then sort remote ones by their effective gas tip. The block proposer might see different transactions altogether and order them differently based on fees, local config, or simply having a newer view of the mempool. * **What does the code show?** By comparing the “pending” block to the finalized block with the same block number, you can see which transactions made it and how their positions changed. In practice, you’ll often find: * Some transactions in the pending block don’t appear in the finalized block. * The finalized block may include additional transactions. * Transaction orders may differ because the proposer’s actual block can diverge from your local pending snapshot. ## Main article Overall, monitoring the pending block gives insights into potential arbitrage or flash loan opportunities, but it’s never a hard guarantee that your transactions will be included exactly as you see them. In the Ethereum blockchain, understanding the concept of a pending block is crucial for seizing arbitrage opportunities, executing flash loans, and managing risks. Monitoring pending blocks allows users to identify large transactions or price discrepancies across decentralized exchanges. When a user queries the pending block through a node client, several processes occur under the hood to assemble and manage this block. 
Since Geth stays [the most popular Ethereum execution client](https://www.ethernodes.org/), let's explore how Geth specifically constructs the pending block. ### Run Ethereum nodes on Chainstack [Start for free](https://console.chainstack.com/) and get your app to production levels immediately. No credit card required. You can sign up with your GitHub, X, Google, or Microsoft account. ## Introduction The pending tag of blocks allows developers to get information on the transactions expected to be included in the next block. According to the [JSON-RPC specification](https://github.com/ethereum/execution-apis/blob/main/src/schemas/block.yaml) for Ethereum clients, there are five block tags that the clients must use: 1. **Earliest**. Refers to the lowest numbered block available to the client. 2. **Pending**. Refers to the next block anticipated by the client, built atop the latest and including transactions typically sourced from the local mempool. 3. **Latest**. Points to the most recent block in the client's observed canonical chain, subject to potential reorganizations even under normal conditions. 4. **Safe**. Represents the latest block secure from reorganizations under assumptions of an honest majority and specific synchronicity conditions. 5. **Finalized**. Denotes the most recent crypto-economically secure block, immune to reorganizations except under manual intervention coordinated by the community. It must be noted that if the node is a block proposer, then the pending block is the one the node intends to broadcast later. This pending block is built using the latest transactions from the local mempool, prioritized, and validated according to the consensus rules. On the other hand, if the node is not a block proposer (whether it is a validator or a non-validator), the pending block still consists of transactions from the local mempool. 
However, these transactions are included based on the node’s expectations of what might be in the next block; there is no guarantee that they will actually be included. **The actual inclusion of transactions depends on the block proposer who might have a different mempool state and transaction selection criteria.**

## Geth implementation

Let’s explore the pending block using the Geth execution client as an example. Recently, [the client has been updated](https://github.com/ethereum/go-ethereum/releases/tag/v1.14.0): a pending block is now created upon request (whenever this block tag is used) and cached for a period set in the pendingTTL (time-to-live) property of the miner ([currently, it’s set to 2 sec](https://github.com/ethereum/go-ethereum/blob/master/miner/pending.go)). Before the update, the pending block was managed continuously in the background. Geth would keep updating the pending block as new transactions arrived in the mempool and the state of the blockchain changed. To check which client version you are connected to, you can use the [web3\_clientVersion RPC method](/reference/ethereum_node_api/client_info/web3_clientVersion).

Let’s examine the lifecycle of the pending block before the update was introduced (Geth \< 1.14.0). The component of the Geth client that is responsible for managing the pending block is [the miner](https://github.com/ethereum/go-ethereum/blob/v1.13.15/miner/worker.go). After receiving a request to interact with a pending block, Geth checks if a pending block is already available and up-to-date. If it is not available, the worker fetches, validates, and sorts pending transactions in the local mempool, and then includes them in the pending block. From then on, it generates and maintains the pending block continuously. If a pending block is available, the worker ensures it is updated with any new transactions or state changes that have occurred.

Here are the scenarios that can occur to an existing pending block:

1.
**Update**. When new transactions arrive in the mempool, they are validated, sorted, and included in the pending block.
2. **Discarding**. When a new block is built and added to the blockchain or a blockchain reorganization occurs, the current pending block may become invalid due to changes in the state. A new pending block is generated based on the new blockchain state.

To better understand the construction of a pending block, we need to examine how transactions are filtered and ordered in the mempool (or txpool, as it's called in Geth).

### Mempool

When transactions are added to the mempool, they undergo several [validation](https://github.com/ethereum/go-ethereum/blob/master/core/txpool/validation.go) steps to ensure they meet network conditions and node-specific configurations. Two relevant gas checks related to the article's topic are listed below:

`GasFeeCap ≥ GasTipCap` and `GasTipCap ≥ MinGasTip`

Once validated, transactions are dynamically filtered and sorted to maintain the mempool's efficiency and relevance.

**Filtering**

Transactions remain in the mempool if their gas fee cap exceeds the current base fee. Additionally, the effective gas tip must be greater than or equal to the minimum gas tip specified in the node configuration:

`GasFeeCap ≥ BaseFee` and `EffectiveGasTip ≥ MinGasTip`, where `EffectiveGasTip = min(GasFeeCap - BaseFee, GasTipCap)`

**Sorting**

Transactions in the mempool are primarily sorted by their gas price, specifically their effective gas tip, which is the difference between the gas fee cap and the base fee or the gas tip cap, whichever is lower.

### EIP-1559

Gas fee cap and gas tip cap were introduced with EIP-1559 transactions. Legacy transactions, also known as Type-0 transactions, do not have these properties and only specify a gas price.
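The checks and formulas above can be sketched in a few lines of Python. This is an illustration of the filtering and sorting rules, not Geth's actual Go implementation, and the fee values below are made up:

```python
def effective_gas_tip(gas_fee_cap: int, gas_tip_cap: int, base_fee: int) -> int:
    """EffectiveGasTip = min(GasFeeCap - BaseFee, GasTipCap)."""
    return min(gas_fee_cap - base_fee, gas_tip_cap)

def keep_in_pending(gas_fee_cap: int, gas_tip_cap: int, base_fee: int, min_gas_tip: int) -> bool:
    """A transaction stays eligible if its fee cap covers the base fee
    and its effective tip meets the node's minimum tip."""
    return (gas_fee_cap >= base_fee
            and effective_gas_tip(gas_fee_cap, gas_tip_cap, base_fee) >= min_gas_tip)

# Example: base fee 30 gwei, node minimum tip 1 gwei (illustrative numbers, in wei)
GWEI = 10**9
base_fee, min_tip = 30 * GWEI, 1 * GWEI

txs = [
    {"id": "a", "fee_cap": 40 * GWEI, "tip_cap": 2 * GWEI},  # tip = min(10, 2) = 2 gwei
    {"id": "b", "fee_cap": 31 * GWEI, "tip_cap": 5 * GWEI},  # tip = min(1, 5) = 1 gwei
    {"id": "c", "fee_cap": 29 * GWEI, "tip_cap": 2 * GWEI},  # fee cap below base fee: dropped
]
eligible = [t for t in txs if keep_in_pending(t["fee_cap"], t["tip_cap"], base_fee, min_tip)]

# Sort by effective gas tip, highest first
eligible.sort(key=lambda t: effective_gas_tip(t["fee_cap"], t["tip_cap"], base_fee), reverse=True)
print([t["id"] for t in eligible])  # ['a', 'b']
```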
These legacy transactions are sorted [by gas price alone](https://github.com/ethereum/go-ethereum/blob/7cfff30ba3a67de767a9b2a7405b91f120873d10/core/types/tx_legacy.go#L101), without the need for calculating an effective gas tip. Transactions from the same account are sorted by nonce to ensure they can be executed sequentially. If necessary, the time when the transaction was first seen can also be considered to help order transactions with the same gas price.

### Pending block

When requesting a pending block, transactions are sourced from the mempool. Geth employs the filtering and sorting logic mentioned in the mempool section to ensure that pending transactions adhere to the current network conditions and node configuration. [Local transactions](https://github.com/ethereum/go-ethereum/blob/master/miner/worker.go#L415) are given the highest priority and are included in the block before considering remote ones, regardless of their effective gas tip, as long as it exceeds the minimum gas tip set in the node configuration.

### Local transactions

Local transactions in Geth are those originating from addresses [explicitly added to the node's configuration](https://geth.ethereum.org/docs/fundamentals/command-line-options) or transactions submitted directly by the node itself.

## Code walkthrough

Now let's have a hands-on walkthrough in Python. Here's what we are going to do:

1. Check the version of the node we are connected to.
2. Fetch a pending block and its number.
3. Fetch the latest block with the same number.
4. Compare transactions in the pending and latest blocks.

### Prerequisites

Log in to your [Chainstack account](https://console.chainstack.com/) and get an Ethereum mainnet node. You can get by with a full node for this exercise. Open your terminal (Command Prompt for Windows or Terminal for macOS/Linux) and run the following command to clone the GitHub repository.
```bash Bash
git clone https://github.com/smypmsa/geth-pending-latest-block.git
cd geth-pending-latest-block
```

Set up your Python virtual environment.

```bash macOS/Linux
python3 -m venv venv
source venv/bin/activate
```

```shell Windows
python -m venv venv
venv\Scripts\activate
```

With the virtual environment activated, run the following command to install the required dependencies.

```bash Bash
pip install -r requirements.txt
```

Create the environment variable file (.env) and paste your Chainstack endpoint URL there.

```Text .env
ETH_NODE_URL=
```

See the [GitHub repository](https://github.com/smypmsa/geth-pending-latest-block) for all the scripts.

### Connect to the node

Load the configuration file (.env), read its values, and connect to the Ethereum node.

```python Python
import os
import time

from dotenv import load_dotenv
from web3 import Web3

# Load environment variables
load_dotenv()
ETH_NODE_URL = os.getenv("ETH_NODE_URL")

# Create connection to Ethereum node
web3 = Web3(Web3.HTTPProvider(ETH_NODE_URL))
```

### Fetch a pending block

Check that the connection to the Ethereum node is established, get the node version, and fetch a pending block.

```python Python
def main():
    if not web3.is_connected():
        print("\nFailed to connect to the Ethereum node.")
        return

    print("\nConnected to the Ethereum node.")
    print("Node version:", web3.client_version)

    # Fetch pending block
    pending_block = web3.eth.get_block('pending')
    pending_block_number = pending_block.number
```

### Fetch a finalized block

To fetch a finalized block with the same number as the pending block we fetched earlier, a few retries may be needed.
```python Python
# Continuation of main() from the previous step
# Retry fetching the finalized block with the same block number as the pending block
finalized_block = None
start_time = time.time()

while True:
    try:
        finalized_block = web3.eth.get_block(pending_block_number, full_transactions=True)
        break
    except Exception as e:
        if time.time() - start_time > 60:
            print(f"Failed to fetch the finalized block within 60 seconds: {e}")
            return
        time.sleep(2)
```

### Compare transactions

Get the transactions that are common to both blocks and those that are unique to each block.

```python Python
# Compare transactions in the pending block and finalized block
print(f"\nComparing txs in pending block and finalized block (block number: {pending_block_number}):")

pending_block_tx_hashes = {tx.hex() for tx in pending_block.transactions}
finalized_block_tx_hashes = {tx['hash'].hex() for tx in finalized_block.transactions}

common_txs = pending_block_tx_hashes.intersection(finalized_block_tx_hashes)
pending_only_txs = pending_block_tx_hashes - finalized_block_tx_hashes
finalized_only_txs = finalized_block_tx_hashes - pending_block_tx_hashes
```

Let's also compare the positions of transactions in the pending block with their positions in the latest block.
```python Python
position_changes = {
    "same_position": 0,
    "higher_position": 0,
    "lower_position": 0
}

for tx_hash in common_txs:
    pending_index = [tx.hex() for tx in pending_block.transactions].index(tx_hash)
    finalized_index = [tx['hash'].hex() for tx in finalized_block.transactions].index(tx_hash)

    if pending_index == finalized_index:
        position_changes["same_position"] += 1
    elif pending_index > finalized_index:
        position_changes["higher_position"] += 1
    else:
        position_changes["lower_position"] += 1
```

### Putting it all together

```python Python
import os
import time

from dotenv import load_dotenv
from web3 import Web3

# Load environment variables
load_dotenv()
ETH_NODE_URL = os.getenv("ETH_NODE_URL")

# Create connection to Ethereum node
web3 = Web3(Web3.HTTPProvider(ETH_NODE_URL))


def main():
    if not web3.is_connected():
        print("\nFailed to connect to the Ethereum node.")
        return

    print("\nConnected to the Ethereum node.")
    print("Node version:", web3.client_version)

    # Fetch pending block
    pending_block = web3.eth.get_block('pending')
    pending_block_number = pending_block.number

    # Retry fetching the finalized block with the same block number as the pending block
    finalized_block = None
    start_time = time.time()

    while True:
        try:
            finalized_block = web3.eth.get_block(pending_block_number, full_transactions=True)
            break
        except Exception as e:
            if time.time() - start_time > 60:
                print(f"Failed to fetch the finalized block within 60 seconds: {e}")
                return
            time.sleep(2)

    # Compare transactions in the pending block and finalized block
    print(f"\nComparing txs in pending block and finalized block (block number: {pending_block_number}):")

    pending_block_tx_hashes = {tx.hex() for tx in pending_block.transactions}
    finalized_block_tx_hashes = {tx['hash'].hex() for tx in finalized_block.transactions}

    common_txs = pending_block_tx_hashes.intersection(finalized_block_tx_hashes)
    pending_only_txs = pending_block_tx_hashes - finalized_block_tx_hashes
    finalized_only_txs = finalized_block_tx_hashes - pending_block_tx_hashes

    print(f"\nTxs in pending block: {len(pending_block.transactions)}")
    print(f"Txs in latest block: {len(finalized_block.transactions)}")

    position_changes = {
        "same_position": 0,
        "higher_position": 0,
        "lower_position": 0
    }

    for tx_hash in common_txs:
        pending_index = [tx.hex() for tx in pending_block.transactions].index(tx_hash)
        finalized_index = [tx['hash'].hex() for tx in finalized_block.transactions].index(tx_hash)

        if pending_index == finalized_index:
            position_changes["same_position"] += 1
        elif pending_index > finalized_index:
            position_changes["higher_position"] += 1
        else:
            position_changes["lower_position"] += 1

    print(f"\nTxs with higher position in the latest: {position_changes['higher_position']}")
    print(f"Txs with the same position in the latest: {position_changes['same_position']}")
    print(f"Txs with lower position in the latest: {position_changes['lower_position']}")
    print(f"Txs not included in the latest: {len(pending_only_txs)}")


if __name__ == "__main__":
    main()
```

## Pending and latest blocks analysis

If you run the code above, you will notice that the transactions in the pending block differ from those in the final block added to the blockchain. The point is that the pending block is constructed by any node (validator or non-validator), while the finalized block is constructed by the block proposer (validator).
**The block proposer may have a different view of the mempool, apply different prioritization criteria, or include different transactions based on network conditions and their own transaction pool.** When examining the transactions in the pending block versus the finalized block for block number 20212119, we can observe some interesting patterns and differences: | Metric | Number of transactions | | ------------------------------------------------------- | ---------------------- | | Pending block | 122 | | Finalized block | 189 | | Transactions with higher positions in the latest block | 0 | | Transactions with the same position in the latest block | 0 | | Transactions with lower positions in the latest block | 120 | | Transactions not included in the latest block | 2 | ### Key observations 1. The finalized block contains significantly more transactions than the pending block. This means that additional transactions were included by the block proposer after the pending block snapshot was taken. 2. Almost all transactions (120 out of 122) from the pending block ended up in a lower position in the finalized block. 3. Two transactions from the pending block were not included in the finalized block at all. This could be due to various reasons such as insufficient fees, reorganization, or replacement by higher-priority transactions. ## Conclusion The goal of this article was to demystify the concept of a pending block in the Ethereum blockchain, specifically focusing on how Geth, the most popular Ethereum execution client, constructs pending blocks. ### Key findings Block proposer and non-block proposer limitations: * Block proposers construct the pending block using the latest transactions from the local mempool, ensuring they are prioritized and validated according to consensus rules. These blocks are proposed to the network for inclusion in the blockchain. 
* Non-block proposers (which can include validators when they are not proposing a block) construct a pending block based on their expectations of the next block's transactions but lack the guarantee of actual inclusion, as the final decision rests with the block proposer.

Transaction ordering in a pending block:

* Transactions in the pending block are sourced from the mempool and primarily sorted by their effective gas tip, calculated as the gas fee cap minus the base fee, or the gas tip cap, whichever is lower.
* Local transactions are prioritized over remote transactions, regardless of their effective gas tip, as long as they meet the minimum gas tip requirement.

Handling of legacy transactions:

* Legacy transactions (Type-0) are sorted by gas price alone, without considering the effective gas tip. This ensures backward compatibility and straightforward inclusion based on gas price.

### About author

Developer Advocate @ Chainstack

Multiple years of software development and Web3 expertise. Creator of the open-source Compare Dashboard for RPC provider performance benchmarking. Core contributor to the DevEx team’s pump.fun trading bot. Author of technical tutorials on EVM blockchains, Solana, TON and Subgraphs.

[](https://github.com/smypmsa) [](https://x.com/sensuniama) [](https://www.linkedin.com/in/anton-sauchyk/)

# Ethereum logs tutorial series: Logs and filters

Source: https://docs.chainstack.com/docs/ethereum-logs-tutorial-series-logs-and-filters

**TLDR**

* Explains how Ethereum logs provide an immutable record of on-chain events, while filters allow targeted retrieval of these logs based on addresses, topics, or block ranges.
* Demonstrates retrieving historical logs with getPastLogs for one-time queries, and real-time subscription to new logs with eth\_subscribe for reactive DApps.
* Highlights best practices such as indexing event parameters, breaking large block ranges, caching, and throttling requests.
* Concludes that logs + filters combine to power event-driven, scalable Ethereum applications.

## Introduction

Alright, so why do programmers never get lost in the woods?! Because they always follow the trail of logs they've left behind (ba-dum-tss).

In the world of software development, logs are an indispensable tool for understanding and troubleshooting the behavior of applications. By acting as a detailed journal of system events, errors, and other relevant information, logs provide developers with crucial insights into the inner workings of their programs. As applications continue to grow more complex and interconnected, the importance of logs becomes increasingly apparent.

In Ethereum, logs take on a distinct role compared to those in traditional software development. They provide a mechanism to store and access data generated by smart contracts during their execution. These logs, which are stored as immutable entries within transaction receipts, provide insights into events, state changes, and data storage related to smart contracts.

Now, making use of this information is only possible through convenient means of access to these logs. This is where filters come in. Filters are mechanisms that allow external applications, developers, and users to efficiently search, access, and monitor logs generated by the Ethereum blockchain. By applying specific criteria to filter logs, they enable targeted retrieval of relevant information from the vast amount of data stored in the form of logs.

One can think of Ethereum logs as newspaper articles. They contain recorded information about specific events, stories, or data, just as logs hold information about events and data generated by smart contracts. Newspapers provide a means of communication and record-keeping, much like logs in the Ethereum ecosystem.
On the other hand, filters can be compared to search engines that help users find relevant articles or information from newspapers based on specific criteria or keywords. Filters in Ethereum allow developers and DApps to efficiently access specific data from logs or other aspects of the blockchain without having to go through the entire blockchain, just as search engines enable users to find the exact information they are looking for without reading all available newspapers.

Now, what follows is an in-depth analysis of Ethereum logs, filters, and how they work. Whether you're a seasoned Ethereum developer or just getting started with the platform, this analysis will provide you with the knowledge you need to make the most of Ethereum logs and filters. Let's get started!

## Analyzing Ethereum logs

Ethereum logs are data structures created as a result of events emitted by smart contracts during their execution. The key difference between logs and events is that events are a programming feature in Solidity that enables smart contracts to communicate and interact with external applications, while logs are the data structures generated by the Ethereum Virtual Machine (EVM) when events are emitted. Smart contracts can emit events to indicate specific occurrences or state changes, which are logged and stored as part of transaction receipts.

Ethereum logs serve two main purposes: event logging and data storage. Developers and DApps can access these logs to trigger further actions or to track important events. Additionally, logs offer a cost-effective way to store data off-chain while maintaining a reference on the blockchain, reducing gas costs and enhancing efficiency.
To get a more in-depth idea about event logs, check out [Tracking some Bored Apes: The Ethereum event logs tutorial](/docs/tracking-some-bored-apes-the-ethereum-event-logs-tutorial)

To understand the basic components of an Ethereum log, let's consider a simple example of a token transfer event log. The `Transfer` event is used in a simple token smart contract to record and communicate token transfers between addresses:

```solidity Solidity
/**
 * @dev Emitted when `value` tokens are moved from one account (`from`)
 * to another (`to`).
 * Note that `value` may be zero.
 */
event Transfer(address indexed from, address indexed to, uint256 value);
```

The event takes three parameters: the sender's address, the recipient's address, and the transferred token value. When the `transfer` function is called within the smart contract, the `Transfer` event is emitted, generating the following Ethereum log entry:

```javascript Javascript
{
  address: '0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48',
  topics: [
    '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef',
    '0x00000000000000000000000037b2b98ea5d620a4064fc954e9f374c8a28e2125',
    '0x000000000000000000000000dc412bbc1875e588166211defa9c84ac195094cf'
  ],
  data: '0x0000000000000000000000000000000000000000000000000000000047868c00',
  blockNumber: 14673517,
  transactionHash: '0x63aff25171e502638926200a480fbd0365ca0c76d036acaa866f8af5d26e9a61',
  transactionIndex: 311,
  blockHash: '0xfa470ae2ccbe4d1a83b2ab039e300b4fc6d2904db80f8f8a0f4ab890f4240a09',
  logIndex: 531,
  removed: false,
  id: 'log_3cd63a01'
}
```

The basic components of the given Ethereum log include: 1. `address` — the address of the contract that generated this log. 2. `topics` — an array of topics associated with the log. Topics are indexed event parameters that can be used to filter logs. 3. `data` — the log data as a hexadecimal string. This contains the non-indexed event parameters. 4. `blockNumber` — the number of the block in which the log was created. 5.
`transactionHash` — the hash of the transaction that generated this log. 6. `transactionIndex` — the index of the transaction within the block. 7. `blockHash` — the hash of the block that contains this log. 8. `logIndex` — the index of the log within the block. In this case, the log index is `531`. 9. `removed` — a boolean value that indicates whether the log was removed due to a chain reorganization. 10. `id` — a unique identifier for this log entry.

Find more explanations and examples in the [Chainstack API reference | eth\_getLogs](/reference/ethereum_node_api/logs_and_events/eth_getLogs).

## Understanding filters

Ethereum filters are essential tools for developers when working with Ethereum-based applications. They facilitate real-time monitoring and interaction with the Ethereum blockchain, offering developers the ability to listen for specific events, state changes, or transactions. Filters help create responsive, event-driven applications that continuously adapt to changes on the blockchain. There are three primary types of filters in Ethereum:

### Log filters

Log filters are essential for monitoring and reacting to events generated by smart contracts on the Ethereum blockchain. These events can include, but are not limited to, transferring tokens, updating contract states, or invoking specific contract functions. Log filters allow developers to fetch logs based on certain conditions, such as:

* **Contract address**: Monitor events emitted by a specific smart contract.
* **Event signature**: Filter logs by the event's unique signature, which is the Keccak-256 hash of the event definition (e.g., `Transfer(address,address,uint256)`).
* **Block range**: Limit the logs fetched to a specific range of blocks.
* **Indexed event parameters**: Filter logs based on specific indexed event parameters, which are used to narrow down the logs relevant to the application's needs.
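Because every field in a log is plain hex, the indexed and non-indexed parameters can be pulled apart with nothing beyond the Python standard library. Here is a sketch decoding the USDC `Transfer` log shown earlier; libraries such as web3 do this for you via their ABI decoders, this is only to illustrate the layout of topics and data:

```python
# Manually decode a Transfer log: indexed parameters sit in topics[1:]
# as 32-byte words (addresses right-aligned), while non-indexed
# parameters are ABI-encoded in `data`.

def topic_to_address(topic: str) -> str:
    # An address is the last 20 bytes (40 hex characters) of the topic
    return "0x" + topic[-40:]

log = {
    "topics": [
        "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",  # Transfer signature
        "0x00000000000000000000000037b2b98ea5d620a4064fc954e9f374c8a28e2125",  # from (indexed)
        "0x000000000000000000000000dc412bbc1875e588166211defa9c84ac195094cf",  # to (indexed)
    ],
    "data": "0x0000000000000000000000000000000000000000000000000000000047868c00",  # value (non-indexed)
}

sender = topic_to_address(log["topics"][1])
recipient = topic_to_address(log["topics"][2])
value = int(log["data"], 16)

print(sender)       # 0x37b2b98ea5d620a4064fc954e9f374c8a28e2125
print(recipient)    # 0xdc412bbc1875e588166211defa9c84ac195094cf
print(value / 1e6)  # 1200.0 (USDC has 6 decimals)
```

This is also why indexed parameters are filterable: they live in the fixed-position `topics` array that nodes index, while `data` is opaque until decoded.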
Logs contain valuable information, including the event signature, contract address, block number, transaction hash, and event parameters. By utilizing log filters, developers can track contract events and update their applications accordingly. ### Block filters Block filters focus on monitoring newly generated blocks on the Ethereum blockchain. These filters are useful for applications that need to react to new blocks, such as: * Fetching the latest gas prices * Verifying transactions included in a block * Analyzing block data, such as miner rewards, difficulty, and uncle blocks Block filters provide developers with real time updates on new blocks mined, allowing them to retrieve block header data. This data includes the block number, miner, timestamp, parent hash, and other pertinent information. By monitoring new blocks, developers can ensure their applications stay up-to-date with the latest blockchain data. ### Pending transaction filters Pending transaction filters are designed to monitor and track pending transactions in the Ethereum network. These filters are beneficial for applications that require information about unconfirmed transactions, such as: * Displaying pending transactions in a user's wallet * Analyzing transaction behavior, like gas price bidding or transaction propagation * Estimating transaction confirmation times Pending transaction filters notify developers when a new transaction enters the pending state, providing the transaction hash. Developers can then fetch the transaction data using the hash, which includes the sender, recipient, value, gas price, and other relevant transaction details. ## Accessing logs using filters ### Prerequisites Access an Ethereum node: 1. [Head over to Chainstack](https://console.chainstack.com/user/account/create) and create an account. 2. [Deploy an Ethereum Mainnet node](/docs/manage-your-networks#join-a-public-network) in Chainstack. Set up a Node project: 1. 
Install the following dependencies in your system:

* [Node (version ≥ 16)](https://nodejs.org/en/download) and the corresponding npm
* A code editor (VS Code, preferably)

Once you have everything, the next step is to set up a Node project and install the web3.js library.

2. Create a new directory.
3. Open a terminal in the directory.
4. Use the following command to initialize a Node project:

```shell Shell
npm init -y
```

5. Once the project is initialized, use the following command to install the web3.js package:

```shell Shell
npm install web3
```

6. Within the project, create a new JavaScript file, `fetchLogs.js`, and add the following lines of code:

```javascript fetchLogs.js
const Web3 = require('web3');

// Add your Chainstack node endpoint (HTTPS)
const chainstackRpcUrl = 'YOUR_CHAINSTACK_ENDPOINT';
const web3 = new Web3(new Web3.providers.HttpProvider(chainstackRpcUrl));

// USDC token contract address on Ethereum mainnet
const usdcContractAddress = '0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48';

// The Transfer event signature for ERC20 tokens (keccak256 hash of "Transfer(address,address,uint256)")
const transferEventSignature = '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef';

// Define the filter options for fetching USDC transfer logs
const filterOptions = {
  fromBlock: 'latest', // Fetch logs starting from the latest block
  address: usdcContractAddress,
  topics: [transferEventSignature],
};

// Fetch logs using the filter options
web3.eth.getPastLogs(filterOptions).then(logs => {
  // Define the Transfer event ABI
  const transferEventInputs = [
    { type: 'address', name: 'from', indexed: true },
    { type: 'address', name: 'to', indexed: true },
    { type: 'uint256', name: 'value', indexed: false },
  ];

  // Process each log
  logs.forEach(log => {
    // Decode the log data
    const decodedLog = web3.eth.abi.decodeLog(transferEventInputs, log.data, log.topics.slice(1));

    // Log the decoded transfer event data
    console.log('USDC Transfer:');
    console.log('  From:', decodedLog.from);
    console.log('  To:', decodedLog.to);
    console.log('  Value (in USDC):', parseInt(decodedLog.value) / 1e6);
  });
}).catch(error => {
  console.error('Error:', error);
});
```

In this code example, we demonstrate how to fetch USDC token transfer event logs using log filters in web3.js. The purpose of this code is to provide a clear and concise example for developers to understand how to interact with the Ethereum blockchain and obtain event logs, specifically for the USDC stablecoin. This information can be useful for applications that need to track token transfers or analyze token transaction history.

The code starts by importing the `Web3` library, which is the primary library for interacting with Ethereum nodes. We then provide the Chainstack node endpoint, which is used to connect to the Ethereum Mainnet. The USDC token contract address and the `Transfer` event signature are defined to be used in the log filter.

We create a `filterOptions` object, which contains the necessary filter parameters, such as the starting block (`fromBlock`), the contract address, and the event signature. This object is used as input for the `web3.eth.getPastLogs()` method, which fetches logs matching the filter options.

Upon successful retrieval of the logs, the function iterates through each log entry, decoding the log data using the `Transfer` event ABI. The decoded log information is then printed to the console, showing the sender, recipient, and value of each USDC transfer.

## Subscribing to logs

In the above-given code, we use `web3.eth.getPastLogs()` to perform a one-time query to the Ethereum node to retrieve logs matching the given filters. The result is a collection of logs from the specified block range that match the filter criteria. This approach is useful for obtaining historical event data or performing a one-time analysis of contract events. To monitor the logs in real time, we need to use the `web3.eth.subscribe()` method.
When we use the `subscribe` method, we create an ongoing subscription to a specific event, such as logs, new block headers, or pending transactions. The Ethereum node pushes new data to the subscriber in real time whenever the subscribed event occurs. This approach is useful for building event-driven applications that need to react to changes on the blockchain as they happen. To use the `subscribe` function, we need to get the WSS endpoint of your Chainstack node. You can find more examples and explanations about the subscriptions method in the [Chainstack API reference](/reference/ethereum-web3js-subscriptions-methods). Once you fetch the endpoint, use the following code to *subscribe* to event logs: ```javascript Javascript const Web3 = require('web3'); // Add your Chainstack node endpoint (WSS) const chainstackRpcUrl = 'YOUR_CHAINSTACK_ENDPOINT'; // Configure reconnect options for WebsocketProvider const websocketOptions = { clientConfig: { reconnect: { auto: true, // Automatically attempt to reconnect delay: 5000, // Reconnect after 5 seconds maxRetries: 10, // Max number of retries }, }, }; const web3 = new Web3(new Web3.providers.WebsocketProvider(chainstackRpcUrl, websocketOptions)); // USDC token contract address on Ethereum mainnet const usdcContractAddress = '0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48'; // The Transfer event signature for ERC20 tokens (keccak256 hash of "Transfer(address,address,uint256)") const transferEventSignature = '0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef'; // Define the filter options for fetching USDC transfer logs const filterOptions = { address: usdcContractAddress, topics: [transferEventSignature], }; // Subscribe to logs and process USDC transfer events web3.eth.subscribe('logs', filterOptions, (error, log) => { if (error) { console.error('Error:', error); return; } // Define the Transfer event ABI const transferEventInputs = [ { type: 'address', name: 'from', indexed: true }, { type: 'address', name: 
'to', indexed: true }, { type: 'uint256', name: 'value', indexed: false }, ]; // Decode the log data const decodedLog = web3.eth.abi.decodeLog(transferEventInputs, log.data, log.topics.slice(1)); // Log the decoded transfer event data console.log('USDC Transfer:'); console.log(' From:', decodedLog.from); console.log(' To:', decodedLog.to); console.log(' Value (in USDC):', parseInt(decodedLog.value) / 1e6); }); ```

The given code leverages the `subscribe` method available in the web3.js library to enable real-time monitoring of logs. The `subscribe` method sets up a WebSocket connection with the Ethereum node, allowing for efficient and continuous updates on new events without the need for polling. In this specific example, the code creates a subscription to the 'logs' event, applying a filter to focus on the USDC token contract's `Transfer` events. Upon receiving a new log event matching the filter criteria, the callback function is triggered, processing and decoding the event data to extract valuable information about the USDC transfers occurring on the Ethereum network.

The code is also designed to automatically reconnect to the Ethereum node if the WebSocket connection is lost or interrupted. This is important to ensure that the script continues to monitor USDC transfers without manual intervention in case of connection issues. The reconnection logic is implemented using the `clientConfig` option when creating a new instance of `Web3.providers.WebsocketProvider`. This option allows you to customize the behavior of the WebSocket client, enabling automatic reconnection attempts with the delay and maximum retries specified in the `websocketOptions`.

## Making the best out of Ethereum logs and filters

When developing Ethereum applications, it's important to ensure that your code is efficient, secure, and scalable.
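One recurring efficiency concern is the size of the block range in historical log queries: splitting a wide range into fixed-size windows keeps each request within a provider-friendly limit. A stdlib-only Python sketch of that pattern (the helper name is illustrative, not a library function):

```python
# Sketch: split a large block range into windows of at most `size` blocks,
# so each eth_getLogs-style query covers a manageable span.

def block_chunks(from_block: int, to_block: int, size: int = 5000):
    """Yield (start, end) pairs covering [from_block, to_block] inclusive."""
    start = from_block
    while start <= to_block:
        end = min(start + size - 1, to_block)
        yield (start, end)
        start = end + 1

# Example: a 12,000-block range split into 5,000-block windows
for start, end in block_chunks(18_000_000, 18_011_999):
    print(start, end)
# 18000000 18004999
# 18005000 18009999
# 18010000 18011999
```

Each `(start, end)` pair would then be passed as the `fromBlock`/`toBlock` of a separate logs query, optionally with a short pause between requests to avoid overwhelming the provider.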
To help you build reliable and performant applications, we have compiled a list of best practices for retrieving and using event logs and filters in Ethereum. These recommendations cover various aspects of development, such as safe block range, optimal coding practices, and more. By following these guidelines, you can improve your application's performance and ensure seamless interactions with the Ethereum network. Here are a few best practices that you can follow while using Ethereum logs and filters: 1. **Use the latest version of web3.js**. Make sure to always use the most recent version of the web3.js library, as it includes the latest features, optimizations, and security fixes. 2. **Choose a safe block range**. When retrieving event logs, it is essential to choose a reasonable block range. Requesting too large a range may lead to timeouts and slow response times. If you need to retrieve logs from a wide range, consider breaking the request into smaller chunks. Chainstack does not impose strict limitations on the block range you can query. However, for optimal performance, on the Ethereum network, we suggest keeping the block range within 5,000 blocks. 3. **Filter events by indexed parameters**. Using indexed parameters in your event filters can significantly improve efficiency by allowing you to target specific logs. Make use of indexed event parameters whenever possible. 4. **Monitor for new events**. Use the web3.js library's built-in event subscription feature to monitor new events in real-time. This can help you avoid the need to constantly poll for updates. 5. **Throttle your requests**. To avoid overwhelming your Ethereum provider, throttle your requests to a reasonable rate. You can use libraries like `lodash` to implement request throttling easily. 6. **Use the `fromBlock` parameter wisely**. When retrieving historical event logs, be mindful of the `fromBlock` parameter. 
It is often more efficient to start from a block that you know contains relevant events rather than querying from the genesis block.
7. **Cache logs locally**. Cache event logs locally to minimize the need for frequent API calls. This can improve your application's performance and reduce the load on your Ethereum provider.
8. **Monitor gas usage**. Keep track of gas usage for your contract interactions. Optimizing your contract's gas consumption can save your users money and improve the overall user experience.
9. **Secure your Ethereum provider**. For higher security, ensure that you use properly secured connections with API keys and other security measures in your Chainstack nodes.

## Conclusion

Logs and filters are essential components of Ethereum-based applications. Logs provide valuable information about contract events, while filters allow developers to monitor and interact with the blockchain in real time. By leveraging these tools, developers can build event-driven applications that respond to changes on the blockchain and provide a seamless user experience.

It's important to follow best practices when working with logs and filters to ensure that your code is efficient, secure, and scalable. By keeping these recommendations in mind, you can improve your application's performance and provide reliable interactions with the Ethereum network.

### About the author

Developer Advocate @ Chainstack

BUIDLs on Ethereum, NEAR, Graph Protocol and Oasis. Majored in computer science and technology.

[](https://github.com/SethuRamanOmanakuttan) [](https://twitter.com/Sethu_Raman_O) [](https://www.linkedin.com/in/sethuraman-omanakuttan)

# Ethereum methods

Source: https://docs.chainstack.com/docs/ethereum-methods

See also [Interactive Ethereum API call examples](/reference/ethereum-getting-started).
## Execution layer

| Method | Availability | Comment |
| ------ | ------------ | ------- |
| eth\_accounts | | Will always return empty |
| eth\_blobBaseFee | | |
| eth\_blockNumber | | |
| eth\_call | | |
| eth\_chainId | | |
| eth\_estimateGas | | |
| eth\_feeHistory | | |
| eth\_gasPrice | | |
| eth\_getAccount | | |
| eth\_getBalance | | |
| eth\_getBlockByHash | | |
| eth\_getBlockByNumber | | |
| eth\_getBlockReceipts | | |
| eth\_getBlockTransactionCountByHash | | |
| eth\_getBlockTransactionCountByNumber | | |
| eth\_getCode | | |
| eth\_getFilterChanges | | |
| eth\_getFilterLogs | | |
| eth\_getLogs | | |
| eth\_getProof | | |
| eth\_getStorageAt | | |
| eth\_getTransactionByBlockHashAndIndex | | |
| eth\_getTransactionByBlockNumberAndIndex | | |
| eth\_getTransactionByHash | | |
| eth\_getTransactionCount | | |
| eth\_getTransactionReceipt | | |
| eth\_getUncleCountByBlockHash | | |
| eth\_getUncleCountByBlockNumber | | |
| eth\_maxPriorityFeePerGas | | |
| eth\_newBlockFilter | | |
| eth\_newFilter | | |
| eth\_newPendingTransactionFilter | | |
| eth\_signTransaction | | |
| eth\_simulateV1 | | No archive data support |
| eth\_subscribe | | |
| eth\_syncing | | |
| eth\_uninstallFilter | | |
| eth\_unsubscribe | | |
| eth\_sendRawTransaction | | |
| net\_listening | | |
| net\_peerCount | | |
| net\_version | | |
| txpool\_content | | |
| txpool\_inspect | | |
| txpool\_contentFrom | | |
| txpool\_status | | |
| web3\_clientVersion | | |
| web3\_sha3 | | |
| erigon\_blockNumber | | |
| erigon\_forks | | |
| erigon\_getBlockByTimestamp | | |
| erigon\_getBlockReceiptsByBlockHash | | |
| erigon\_getHeaderByHash | | |
| erigon\_getHeaderByNumber | | |
| erigon\_getLatestLogs | | |
| erigon\_getLogsByHash | | |
| debug\_getBadBlocks | | |
| debug\_storageRangeAt | | |
| debug\_getTrieFlushInterval | | |
| debug\_traceBlock | | |
| debug\_traceBlockByHash | | |
| debug\_traceBlockByNumber | | |
| debug\_traceCall | | |
| debug\_traceTransaction | | |
| trace\_block | | |
| trace\_call | | |
| trace\_callMany | | |
| trace\_filter | | |
| trace\_rawTransaction | | |
| trace\_replayBlockTransactions | | |
| trace\_replayTransaction | | |
| trace\_transaction | | |
| admin\_addPeer | | |
| admin\_addTrustedPeer | | |
| admin\_datadir | | |
| admin\_exportChain | | |
| admin\_importChain | | |
| admin\_nodeInfo | | |
| admin\_peerEvents | | |
| admin\_peers | | |
| admin\_removePeer | | |
| admin\_removeTrustedPeer | | |
| admin\_startHTTP | | |
| admin\_startWS | | |
| admin\_stopHTTP | | |
| admin\_stopWS | | |

## Consensus layer (Beacon Chain)

| Method | Availability | Comment |
| ------ | ------------ | ------- |
| /eth/v1/beacon/blocks/\{block\_id}/attestations | | |
| /eth/v1/beacon/blocks/\{block\_id}/root | | |
| /eth/v1/beacon/blob\_sidecars/\{block\_id} | | |
| /eth/v1/beacon/genesis | | |
| /eth/v1/beacon/headers | | |
| /eth/v1/beacon/headers/\{block\_id} | | |
| /eth/v1/beacon/states/\{state\_id}/committees | | |
| /eth/v1/beacon/states/\{state\_id}/finality\_checkpoints | | |
| /eth/v1/beacon/states/\{state\_id}/fork | | |
| /eth/v1/beacon/states/\{state\_id}/root | | |
| /eth/v1/beacon/states/\{state\_id}/sync\_committees | | |
| /eth/v1/beacon/states/\{state\_id}/validator\_balances | | |
| /eth/v1/beacon/states/\{state\_id}/validators | | |
| /eth/v1/beacon/states/\{state\_id}/validators/\{validator\_id} | | |
| /eth/v1/beacon/rewards/sync\_committee/\{block\_id} | | |
| /eth/v1/beacon/rewards/blocks/\{block\_id} | | |
| /eth/v1/beacon/rewards/attestations/\{epoch} | | |
| /eth/v1/config/deposit\_contract | | |
| /eth/v1/config/spec | | |
| /eth/v1/events | | |
| /eth/v1/node/peer\_count | | |
| /eth/v1/node/peers | | |
| /eth/v1/node/syncing | | |
| /eth/v1/node/version | | |
| /eth/v1/validator/aggregate\_attestation | | |
| /eth/v1/validator/blinded\_blocks/\{slot} | | |
| /eth/v1/validator/duties/attester/\{epoch} | | |
| /eth/v1/validator/duties/proposer/\{epoch} | | |
| /eth/v1/validator/sync\_committee\_contribution | | |
| /eth/v2/beacon/blocks/\{block\_id} | | |
| /eth/v2/debug/beacon/states/\{state\_id} | | |

# Ethereum: How to set up a redundant event listener with Python

Source: https://docs.chainstack.com/docs/ethereum-redundant-event-listener-python-version

**TLDR**

* This tutorial shows how to create a redundant event listener in Python to reliably capture Ethereum events by connecting to multiple endpoints.
* Using web3.py, it continuously checks for WETH "Transfer" events, mitigating the risk of missing events due to node downtime or latency issues.
* Each event is tracked by a unique hash to avoid duplicates, ensuring only the first instance gets processed.
* The result is a more robust, fault-tolerant DApp setup that remains accurate and responsive even during endpoint failures.

## Main article

Event listeners play an important role in blockchain technology, enabling applications to respond to specific events emitted by smart contracts. These events are vital for decentralized applications (DApps), such as tracking token transfers and monitoring contract executions. In the Ethereum ecosystem, events are logged on the blockchain. They can be listened to by off-chain applications to trigger specific actions or update their state based on the latest blockchain activity.

This tutorial aims to build a resilient Ethereum event listener using Python. By leveraging multiple endpoints across different regions, we aim to achieve better consistency in catching events and ensure redundancy in case of endpoint failures. This approach enhances the reliability and robustness of the event listener, making it more effective in real-world scenarios where network issues or endpoint downtimes might occur.
This tutorial teaches you how to set up, configure, and run a resilient event listener that can handle events efficiently and reliably.

### JavaScript version

We also have a JavaScript version of this tutorial; find it here: [Ethereum: BUIDLing a redundant event listener with ethers and web3.js](/docs/ethereum-redundant-event-llstener-ethers-web3js).

## Prerequisites

Ensure you have the following set up before starting the tutorial:

* **Python:** Make sure Python is installed on your machine. You can download it from the [official Python website](https://www.python.org/).
* **Virtual environment:** Using a virtual environment to manage your project dependencies is a good practice. To set up and activate a virtual environment, run the following commands in the terminal in the directory where you want to create the project:

```bash Bash
python -m venv venv
source venv/bin/activate  # On Windows, use `venv\Scripts\activate`
```

* **web3.py library:** Install the web3.py library using pip. Run the following command from the terminal in the directory where you want to create the project:

```bash Bash
pip install web3
```

* **Ethereum node endpoints:**

### Get your own node endpoint today

[Start for free](https://console.chainstack.com/) and get your app to production levels immediately. No credit card required. You can sign up with your GitHub, X, Google, or Microsoft account.

You need at least a Chainstack paid plan to deploy multiple nodes. Check the [Chainstack pricing](https://chainstack.com/pricing/) page for a coupon.

## The code

Now that your environment is ready, let's go over the code.
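Before looking at the full script, here is the core idea in miniature: several endpoints deliver overlapping streams of events, and a shared set keeps only the first sighting of each. This is a toy sketch with made-up event IDs and endpoint names, not part of the tutorial script itself:

```python
# Toy illustration of the redundancy + deduplication idea:
# several endpoints report overlapping events; only the first
# sighting of each event ID is processed.
seen_events = set()
processed = []

def handle_event(event_id, endpoint):
    # Process the event only if no other endpoint delivered it first
    if event_id not in seen_events:
        seen_events.add(event_id)
        processed.append((event_id, endpoint))

# Three simulated endpoints delivering overlapping event streams
feeds = {
    "endpoint-1": ["0xaa-0", "0xbb-3"],
    "endpoint-2": ["0xaa-0", "0xcc-7"],   # 0xaa-0 is a duplicate
    "endpoint-3": ["0xbb-3", "0xcc-7"],   # both are duplicates
}

for endpoint, events in feeds.items():
    for event_id in events:
        handle_event(event_id, endpoint)

print(processed)
# [('0xaa-0', 'endpoint-1'), ('0xbb-3', 'endpoint-1'), ('0xcc-7', 'endpoint-2')]
```

Each event is processed exactly once and attributed to whichever endpoint saw it first — the same pattern the real listener below applies to live WETH transfer logs.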
In the directory where your project is, create a new file named `main.py` and paste the following code:

```python main.py
import os
import asyncio
import logging
from web3 import Web3
from web3.middleware import geth_poa_middleware

# Configure logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

# List of endpoints
endpoints = [
    "ENDPOINT_1",
    "ENDPOINT_2",
    "ENDPOINT_3"
]

# Filter for WETH transfer events
logs_filter = {
    'address': '0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2',  # WETH contract address
    'topics': ['0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef'],  # Transfer event signature
}

# Set to track seen event identifiers to prevent duplicates
seen_events = set()

async def handle_event(event, endpoint):
    event_id = f"{event['transactionHash'].hex()}-{event['logIndex']}"
    if event_id not in seen_events:
        seen_events.add(event_id)
        logging.info(f"Event received first from {endpoint}: {event_id}")

async def subscribe_to_logs(endpoint):
    while True:
        try:
            web3 = Web3(Web3.HTTPProvider(endpoint))
            web3.middleware_onion.inject(geth_poa_middleware, layer=0)
            if not web3.is_connected():
                logging.warning(f"Failed to connect to endpoint {endpoint}")
                await asyncio.sleep(5)
                continue
            logging.info(f"Connected to endpoint {endpoint}")
            event_filter = web3.eth.filter(logs_filter)
            while True:
                for event in event_filter.get_new_entries():
                    await handle_event(event, endpoint)
                await asyncio.sleep(1)
        except Exception as e:
            logging.error(f"Error when subscribing to logs from {endpoint}: {e}")
            await asyncio.sleep(5)  # Wait before retrying

async def main():
    tasks = [subscribe_to_logs(endpoint) for endpoint in endpoints]
    await asyncio.gather(*tasks)

if __name__ == "__main__":
    asyncio.run(main())
```

### Step-by-step code breakdown

Let's break down the code step-by-step to understand how it works:

1. **Import libraries**

The first step is to import the necessary libraries.
We import `os`, `asyncio`, and `logging` for general-purpose use, and `Web3` and `geth_poa_middleware` from the `web3` library to interact with the Ethereum blockchain.

```python Python
import os
import asyncio
import logging
from web3 import Web3
from web3.middleware import geth_poa_middleware
```

2. **Configure logging**

Logging is configured to display informational messages with a specific format, which includes the timestamp, log level, and message. This helps in tracking the application's behavior and debugging issues.

```python Python
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')
```

3. **Define endpoints**

A list of Ethereum node endpoints is defined. These endpoints should be URLs of your RPC nodes. Replace `ENDPOINT_1`, `ENDPOINT_2`, and `ENDPOINT_3` with the actual URLs of your Ethereum nodes.

```python Python
endpoints = [
    "ENDPOINT_1",
    "ENDPOINT_2",
    "ENDPOINT_3"
]
```

4. **Set up event filter**

The event filter is configured to listen for WETH transfer events. The `address` field specifies the WETH contract address and the `topics` field contains the signature for the Transfer event.

```python Python
logs_filter = {
    'address': '0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2',  # WETH contract address
    'topics': ['0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef'],  # Transfer event signature
}
```

5. **Track seen events**

A set named `seen_events` is used to track event identifiers that have already been processed. This helps in preventing duplicate handling of the same event.

```python Python
seen_events = set()
```

6. **Handle events**

The `handle_event` function processes the received events. It constructs a unique event identifier from the transaction hash and log index. If the event has not been seen before, it logs the event and adds the identifier to the `seen_events` set.
```python Python
async def handle_event(event, endpoint):
    event_id = f"{event['transactionHash'].hex()}-{event['logIndex']}"
    if event_id not in seen_events:
        seen_events.add(event_id)
        logging.info(f"Event received first from {endpoint}: {event_id}")
```

7. **Subscribe to logs**

The `subscribe_to_logs` function continuously tries to connect to an Ethereum node, sets up a filter for the specified events, and processes new entries as they arrive. If the connection fails, it waits for 5 seconds before retrying.

```python Python
async def subscribe_to_logs(endpoint):
    while True:
        try:
            web3 = Web3(Web3.HTTPProvider(endpoint))
            web3.middleware_onion.inject(geth_poa_middleware, layer=0)
            if not web3.is_connected():
                logging.warning(f"Failed to connect to endpoint {endpoint}")
                await asyncio.sleep(5)
                continue
            logging.info(f"Connected to endpoint {endpoint}")
            event_filter = web3.eth.filter(logs_filter)
            while True:
                for event in event_filter.get_new_entries():
                    await handle_event(event, endpoint)
                await asyncio.sleep(1)
        except Exception as e:
            logging.error(f"Error when subscribing to logs from {endpoint}: {e}")
            await asyncio.sleep(5)  # Wait before retrying
```

8. **Run the main function**

The `main` function creates tasks for each endpoint and runs them concurrently using `asyncio.gather`. The script starts by calling `asyncio.run(main())`.
```python Python
async def main():
    tasks = [subscribe_to_logs(endpoint) for endpoint in endpoints]
    await asyncio.gather(*tasks)

if __name__ == "__main__":
    asyncio.run(main())
```

### Run the code

To run the code, simply add your endpoints in the `endpoints` list and run the command in the terminal:

```
python3 main.py
```

It will start listening and logging; here is an example:

```
2024-05-16 17:07:09,452 - INFO - Connected to endpoint https://nd-974-620-518.p2pify.com/
2024-05-16 17:07:11,345 - INFO - Connected to endpoint https://nd-777-597-727.p2pify.com/
2024-05-16 17:07:12,776 - INFO - Connected to endpoint https://ethereum-mainnet.core.chainstack.com/
2024-05-16 17:07:13,805 - INFO - Event received first from https://ethereum-mainnet.core.chainstack.com/: 0xcab7994c8ff1495136db4966f4ed4556513540c6cf08dbd22e09fb3496acadef-1
2024-05-16 17:07:13,805 - INFO - Event received first from https://ethereum-mainnet.core.chainstack.com/: 0x649a1c6138ae7f3135d9ec2a24068ced7d1d2f00fd63781fa11153915d3f22b4-4
2024-05-16 17:07:13,805 - INFO - Event received first from https://ethereum-mainnet.core.chainstack.com/: 0x52c3c874bc8a8a7c34cdcfbf7e0adad89164b2069d1b445feb44504d350dee59-7
2024-05-16 17:07:13,805 - INFO - Event received first from https://ethereum-mainnet.core.chainstack.com/: 0x8755f753006a4bcf6b436bf1b377ca41d39c33219658426e8ae6a63b914279c3-30
2024-05-16 17:07:13,805 - INFO - Event received first from https://ethereum-mainnet.core.chainstack.com/: 0x4b5d3072d599cb4fc148a09c38b85bfa2729f5ad95e2b676cd15c8ebd4cff76d-43
2024-05-16 17:07:13,805 - INFO - Event received first from https://ethereum-mainnet.core.chainstack.com/: 0x54ebab7ec57c78c8182ab6797bc4a41f31c852e4d206ea13c59166592c44f41a-91
2024-05-16 17:07:13,805 - INFO - Event received first from https://ethereum-mainnet.core.chainstack.com/: 0x8f69c9aa219fcc6d29edec88d72de91d7ab4bbb2f3854e7d4c339deb313badf9-128
```

You can find the full event in `event` in `handle_event` and you can use it for further processing:
```python
async def handle_event(event, endpoint):
    event_id = f"{event['transactionHash'].hex()}-{event['logIndex']}"
    if event_id not in seen_events:
        seen_events.add(event_id)
        logging.info(f"Event received first from {endpoint}: {event_id}")
```

## Conclusion

Building a resilient Ethereum event listener is crucial for maintaining reliable and consistent decentralized applications. Using multiple endpoints across different regions can ensure better event capture and redundancy, making your event listener robust against network issues and endpoint downtimes.

This tutorial guided you through setting up, configuring, and running a resilient event listener using Python and web3.py. With this setup, you are well-equipped to handle blockchain events efficiently and reliably, enhancing the effectiveness of your DApps.

# Ethereum: BUIDLing a redundant event listener with ethers and web3.js

Source: https://docs.chainstack.com/docs/ethereum-redundant-event-llstener-ethers-web3js

* Monitoring Ethereum events is crucial for accurate data and a smooth user experience.
* Subscribing to multiple nodes in parallel prevents missed events during node downtime, ensuring reliability.
* Using web3.js or ethers.js with environment variables streamlines setup and protects your endpoints.
* Redundant event listeners are key for mission-critical DApps, providing robust real-time updates.

## Main article

When building on Ethereum, reliably monitoring and reacting to events is important. Events like token transfers often trigger downstream actions or application updates. Missing important events can lead to data inconsistencies, incorrect user balances, and poor user experience.

While Ethereum nodes provide access to the blockchain and its events, relying on a single node for event listening can be risky. Nodes can experience downtime, connectivity issues, or other failures, resulting in missed events and disruptions to application functionality.
This challenge is amplified in mission-critical applications or those handling high-value transactions, where missing events can have severe consequences.

### Python version

We also have a Python version of this tutorial; find it here: [How to set up a redundant Ethereum event listener with Python](/docs/ethereum-redundant-event-listener-python-version).

To address this issue, the concept of a redundant event listener emerges as a solution. By subscribing to multiple Ethereum nodes simultaneously, a redundant event listener increases the chances of receiving events even if one or more nodes fail. This approach introduces redundancy and fault tolerance, ensuring that important events are not missed and that the application remains responsive and up-to-date.

This tutorial explores how to BUIDL a redundant Ethereum event listener with Node.js and the popular web3.js and ethers.js libraries over WSS RPC nodes. Chainstack global nodes already have redundancy and fault tolerance built in, but we can improve the robustness of this event listener by using Chainstack [Trader Node](/docs/trader-node). This architecture allows your app to place event listeners in various regions worldwide, ensuring redundancy and accuracy.

Learn everything about event logs in [Tracking some Bored Apes: The Ethereum event logs tutorial](/docs/tracking-some-bored-apes-the-ethereum-event-logs-tutorial).

## Project overview

This tutorial shows you how to build a redundant Ethereum event listener using Node.js, web3.js, and ethers.js. This application aims to ensure reliable and fault-tolerant monitoring of events on the Ethereum blockchain, specifically the transfer events of the Wrapped Ether (WETH) contract.

The application achieves redundancy by establishing multiple WebSocket connections to Ethereum RPC nodes, using Chainstack global and regional infrastructure.
By subscribing to multiple nodes simultaneously, the application increases its chances of receiving events even if one or more nodes experience downtime or connectivity issues.

The logic behind the application is as follows:

1. It defines an array of WebSocket endpoints from various Ethereum node providers.
2. It creates a filter object that specifies the contract address of the WETH contract and the topic (event signature) for the "Transfer" event, which is the event we want to listen for.
3. It initializes a `Set` data structure to track unique event identifiers and prevent duplicate event processing.
4. The application defines a function called `subscribeToLogs` that takes an endpoint as input, creates a new Web3 instance with that endpoint, and sets up a WebSocket subscription to listen for logs (events) matching the defined filter.
5. When a new event is received, the function checks if the event identifier (a combination of the transaction hash and log index in web3.js, and of the transaction hash and block number in ethers.js) has been seen before. If not, it logs the event data to the console and adds the event identifier to the `Set` to mark it as processed.
6. The function also handles subscription errors by logging them to the console.

By implementing this redundant event listener, the application ensures that important events, such as WETH transfers, are not missed, even in the face of node failures or connectivity issues. This increased reliability and fault tolerance can be crucial for applications that rely heavily on monitoring and reacting to Ethereum events in real time.

## Prerequisites

To start with a JavaScript development project, you'll need to install Node.js, a powerful JavaScript runtime environment that enables developers to run JavaScript code outside a web browser. For this project, it's recommended to use at least version 18. You can [download it from their website](https://nodejs.org/en/download/).
With Node.js installed, you're ready to start using JavaScript. Now, you can set up your nodes. Keep in mind that you need at least a Chainstack paid plan to deploy multiple nodes. Check the [Chainstack pricing](https://chainstack.com/pricing/) page for a coupon.

For this project, it is recommended to deploy at least three RPC nodes: one [Global Node](/docs/global-elastic-node) and two [Trader Nodes](/docs/trader-node). Remember that there is no limit on how many nodes you can use in the project, and you can also mix different providers. Once the infrastructure is set up, you can work on a new project.

## Create a new JavaScript project

First, we'll create a new Node.js project and initialize it with a `package.json` file. Open your terminal, navigate to the desired directory, and run the following command:

```
npm init -y
```

This will create a `package.json` file with default settings. Next, we'll install the required dependencies: the `web3`, `ethers`, and `dotenv` packages, the last of which loads environment variables from a `.env` file:

```
npm install web3 ethers dotenv
```

After installing the dependencies, create a new file called `.env` in the root directory of your project. This file will store your environment variables, including the WebSocket endpoints for the Ethereum nodes. In the `.env` file, you can define the endpoints like this:

```
ENDPOINT_1=YOUR_CHAINSTACK_GLOBAL_NODE
ENDPOINT_2=YOUR_CHAINSTACK_TRADER_NODE
ENDPOINT_3=YOUR_CHAINSTACK_TRADER_NODE
```

Remember that we are using WSS endpoints.

In your Node.js script, you'll need to load the environment variables from the `.env` file using the `dotenv` package. At the top of your script, add the following line:

```javascript Javascript
require('dotenv').config();
```

This will load the environment variables from the `.env` file and make them accessible in your script using `process.env`.
Now, instead of hardcoding the endpoints in your script, you can read them from the environment variables:

```javascript Javascript
const endpoints = [
  process.env.ENDPOINT_1,
  process.env.ENDPOINT_2,
  process.env.ENDPOINT_3,
];
```

By following these steps, you'll have a basic Node.js project set up with the required dependencies, and you'll be able to store and access your WebSocket endpoints securely using environment variables in a `.env` file.

Check out [Web3 node.js: From zero to a full-fledged project](/docs/web3-nodejs-from-zero-to-a-full-fledged-project) to learn more about managing Node.js projects.

## Web3.js code

This section will guide you through this project using web3.js, and the following section will go over ethers.js. This way, you have options based on your favorite library.

The project is already set up, so let's create a new file named `index.js` and paste the following code:

```javascript index.js
const { Web3 } = require("web3");
require('dotenv').config(); // Add this in case you didn't earlier

// List of RPC endpoints
const endpoints = [
  process.env.ENDPOINT_1,
  process.env.ENDPOINT_2,
  process.env.ENDPOINT_3,
];

// Filter for WETH transfer events
const logsFilter = {
  address: "0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2", // WETH contract address
  topics: [
    "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
  ], // Transfer event signature
};

// Set to track seen event identifiers to prevent duplicates
const seenEvents = new Set();

async function subscribeToLogs(endpoint) {
  const web3 = new Web3(endpoint);
  try {
    const subscription = await web3.eth.subscribe("logs", logsFilter);
    console.log(`Subscription created with ID: ${subscription.id}`);
    subscription.on("data", (log) => {
      const eventId = `${log.transactionHash}-${log.logIndex}`;
      if (!seenEvents.has(eventId)) {
        seenEvents.add(eventId);
        console.log(`Event received first from ${endpoint.slice(0, 33)}:`, log);
      }
    });
    subscription.on("error", (error) => {
      console.error(`Error when subscribing to logs from ${endpoint}:`, error);
    });
  } catch (error) {
    console.error(`Error setting up subscription from ${endpoint}:`, error);
  }
}

// Initialize subscriptions for all endpoints
endpoints.forEach((endpoint) => {
  subscribeToLogs(endpoint);
});
```

### Web3 code explanation

This code sets up a redundant Ethereum event listener using the Web3.js library and environment variables for storing WebSocket endpoints. Here's what's happening:

1. The code starts by importing the `Web3` object from the `web3` library and configuring the `dotenv` package to load environment variables from a `.env` file.
2. The WebSocket endpoints for various Ethereum nodes are read from the environment variables `ENDPOINT_1`, `ENDPOINT_2`, and `ENDPOINT_3` and stored in the `endpoints` array.
3. The `logsFilter` object is defined, which specifies the contract address of the Wrapped Ether (WETH) contract and the topic (event signature) for the "Transfer" event. This filter will be used to subscribe to only the desired events.
4. A `Set` data structure called `seenEvents` is initialized to track unique event identifiers and prevent duplicate event processing.
5. The `subscribeToLogs` async function is defined, which takes an endpoint as input and sets up a WebSocket subscription to listen for logs (events) matching the `logsFilter`.
6. Inside the `subscribeToLogs` function, a new `Web3` instance is created using the provided endpoint, and the `web3.eth.subscribe("logs", logsFilter)` method is called to create a subscription.
7. When a new event is received through the subscription, the `subscription.on("data", ...)` callback is triggered. The callback generates a unique event identifier using the transaction hash and log index and checks if this identifier has been seen before in the `seenEvents` Set.
8. If the event identifier is new, it is added to the `seenEvents` Set and the event data is logged to the console along with the endpoint from which the event was received.
9.
The `subscription.on("error", ...)` callback is set up to handle any errors during the subscription process, logging the error to the console.
10. Finally, the `endpoints.forEach(subscribeToLogs)` line iterates over the `endpoints` array and calls the `subscribeToLogs` function for each endpoint, creating multiple WebSocket subscriptions to different Ethereum nodes.

By implementing this redundant event listener, the application can monitor WETH transfer events reliably and consistently, even if one or more Ethereum nodes experience downtime or connectivity issues. Using environment variables to store endpoints makes managing and updating endpoint configurations easier without modifying the code directly.

## Ethers.js code

Here, we'll cover the ethers.js version of this project; you can create an entirely new Node.js project or just make a new JavaScript file, then paste this code:

```javascript
const { ethers } = require("ethers");
require('dotenv').config();

// List of RPC endpoints
const endpoints = [
  process.env.ENDPOINT_1,
  process.env.ENDPOINT_2,
  process.env.ENDPOINT_3,
];

const contractAddress = "0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2"; // WETH contract address
const contractABI = [
  "event Transfer(address indexed from, address indexed to, uint amount)",
];

// Set to track seen event identifiers to prevent duplicates
const seenEvents = new Set();

async function subscribeToEvents(endpoint) {
  const provider = new ethers.WebSocketProvider(endpoint);
  const contract = new ethers.Contract(contractAddress, contractABI, provider);

  // Subscribe to the Transfer event
  contract.on("Transfer", (from, to, amount, event) => {
    const eventId = `${event.log.transactionHash}-${event.log.blockNumber}`;
    if (!seenEvents.has(eventId)) {
      seenEvents.add(eventId);
      console.log(`Event received first from ${endpoint.slice(0, 33)}:`);
      console.log(
        `Transfer from ${from} to ${to} of ${ethers.formatEther(
          amount.toString()
        )} ETH`
      );
      // console.log("Transaction details:", event);
    }
  });

  // Handle errors
  provider.on("error", (error) => {
    console.error(`WebSocket error from ${endpoint}:`, error);
  });
}

// Initialize subscriptions for all endpoints
endpoints.forEach((endpoint) => {
  subscribeToEvents(endpoint);
});
```

Ethers.js now supports `ChainstackProvider`; learn more about it by reading the [ethers ChainstackProvider documentation](/reference/ethersjs-chainstackprovider).

### Ethers.js code breakdown

This code sets up a redundant Ethereum event listener using the `ethers.js` library and environment variables for storing WebSocket endpoints. Here's what's happening:

1. The code starts by importing the `ethers` object from the `ethers.js` library and configuring the `dotenv` package to load environment variables from a `.env` file.
2. The WebSocket endpoints for various Ethereum nodes are read from the environment variables `ENDPOINT_1`, `ENDPOINT_2`, and `ENDPOINT_3` and stored in the `endpoints` array.
3. The contract address of the Wrapped Ether (WETH) contract and its contract ABI (Application Binary Interface) are defined as constants.
4. A `Set` data structure called `seenEvents` is initialized to track unique event identifiers and prevent duplicate event processing.
5. The `subscribeToEvents` async function is defined, which takes an endpoint as input and sets up an event subscription for the WETH contract's "Transfer" event.
6. Inside the `subscribeToEvents` function, a new `WebSocketProvider` instance is created using the provided endpoint, and a `Contract` instance is created using the contract address, ABI, and provider.
7. The `contract.on("Transfer", ...)` method is used to subscribe to the "Transfer" event of the WETH contract.
8. When a new "Transfer" event is received, the callback function is triggered, generating a unique event identifier using the transaction hash and block number.
9.
If the event identifier is new, it is added to the `seenEvents` Set, and the event details (sender, recipient, and amount transferred) are logged to the console along with the endpoint from which the event was received.
10. The `provider.on("error", ...)` callback is set up to handle any errors during the WebSocket connection, logging the error to the console.
11. Finally, the `endpoints.forEach(subscribeToEvents)` line iterates over the `endpoints` array and calls the `subscribeToEvents` function for each endpoint, creating multiple WebSocket connections and event subscriptions to different Ethereum nodes.

By implementing this redundant event listener with `ethers.js`, the application can monitor WETH transfer events reliably and consistently, even if one or more Ethereum nodes experience downtime or connectivity issues. Using environment variables to store endpoints makes managing and updating endpoint configurations easier without modifying the code directly.

## Conclusion

Building reliable and fault-tolerant systems is crucial when developing applications on the Ethereum blockchain. By implementing a redundant event listener using Node.js and libraries like web3.js or ethers.js, you can ensure that your application consistently receives important events, such as token transfers, without disruptions.

This approach mitigates the risks associated with relying on a single node. It introduces redundancy by subscribing to multiple nodes simultaneously, increasing the chances of receiving events even if some nodes fail.

The tutorial provided a step-by-step guide to setting up a project, configuring environment variables, and implementing the redundant event listener using web3.js and ethers.js. With this solution, your Ethereum-based application can remain responsive and up-to-date and provide a seamless user experience, even during node failures or connectivity issues.
# Ethereum tooling

Source: https://docs.chainstack.com/docs/ethereum-tooling

### Run nodes on Chainstack

[Start for free](https://console.chainstack.com/) and get your app to production levels immediately. No credit card required. You can sign up with your GitHub, X, Google, or Microsoft account.

## Wallets

Wallets allow the addition of custom RPC endpoints either by directly modifying the settings of an existing network (e.g., Ethereum) or by adding a custom network and specifying a custom RPC endpoint. To obtain the address of your RPC endpoint, navigate to the node details on the Chainstack console and locate the Execution client HTTPS endpoint in the Access and credentials section. When adding a new network, you need a chain ID, which you can find on resources like [chainlist.org](http://chainlist.org/) or [chainlist.wtf](http://chainlist.wtf).

### MetaMask

In the **Access and credentials** section of a Chainstack node, press **Add to MetaMask**. This will prompt you to confirm the new network details. To add a network manually, go to **Networks** and add a new network with the required chain ID and your Chainstack RPC endpoint.

### Trust Wallet

To add a custom RPC to Trust Wallet, open the wallet and navigate to the **Settings** section. Look for the **Network** section to add a custom network with your Chainstack RPC endpoint.

### Rainbow

To add a custom RPC to Rainbow Wallet, open the wallet and navigate to the **Settings** section. Look for **Networks** to add your Chainstack RPC endpoint.

### Rabby

To add a custom RPC to Rabby Wallet, open the wallet and navigate to the **Settings (More)** section. Look for **Modify RPC URL** to add your Chainstack RPC endpoint.

### Frame Desktop

To add a custom RPC to Frame Desktop, open the wallet and navigate to the **Chains** section. Click on the chain details to add your Chainstack RPC endpoint.
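Before adding a custom network to any of these wallets, it's worth confirming that your endpoint actually serves the chain ID you expect. The sketch below (using only the Python standard library; the endpoint URL is a placeholder you'd replace with your own) calls `eth_chainId` over JSON-RPC so you can compare the result against the ID listed on chainlist.org:

```python
import json
import urllib.request

def parse_chain_id(rpc_response: dict) -> int:
    # eth_chainId returns a hex string, e.g. "0x1" for Ethereum mainnet
    return int(rpc_response["result"], 16)

def fetch_chain_id(endpoint: str) -> int:
    # Standard JSON-RPC request for eth_chainId
    payload = json.dumps({
        "jsonrpc": "2.0",
        "method": "eth_chainId",
        "params": [],
        "id": 1,
    }).encode()
    req = urllib.request.Request(
        endpoint, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return parse_chain_id(json.load(resp))

# Live check (placeholder endpoint):
# print(fetch_chain_id("YOUR_CHAINSTACK_HTTPS_ENDPOINT"))
# Parsing a sample mainnet response:
print(parse_chain_id({"jsonrpc": "2.0", "id": 1, "result": "0x1"}))  # 1
```

If the printed chain ID doesn't match the network you intended (for example, 1 for Ethereum mainnet), double-check which node the endpoint belongs to before saving it in the wallet.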
## IDEs Cloud-based IDEs provide the flexibility to use injected providers. MetaMask is the most commonly used one. By adding a Chainstack RPC node in MetaMask and connecting to the wallet in your IDE, you can seamlessly interact with the network through a Chainstack node. ### Remix IDE To enable Remix IDE to interact with the network through a Chainstack node, follow these steps: 1. Install and set up MetaMask to use a Chainstack node for interaction. You can refer to the guide on Interacting through MetaMask for detailed instructions. 2. Open Remix IDE and go to the Deploy tab (or Deploy and run transactions tab). Here, select **Injected Provider - MetaMask** as your [environment](https://remix-ide.readthedocs.io/en/latest/run.html#environment). ### ChainIDE To enable ChainIDE to interact with the network through a Chainstack node, follow these steps: 1. Install and set up MetaMask to use a Chainstack node for interaction. You can refer to the guide on Interacting through MetaMask for detailed instructions. 2. Open ChainIDE and navigate to the [Connect Wallet](https://chainide.gitbook.io/chainide-english-1/ethereum-ide-1/1.-ethereum-ide/quick-demo#id-3.-connect-to-ethereum-evm) button. Here, select **Injected Web3 Provider** and then **MetaMask**. ## Programming languages and libraries ### Communication protocols WebSockets and HTTP are essential communication protocols in web applications. WebSockets enable two-way, persistent communication between a client and a server, useful for real-time price feeds, live transaction monitoring, and event notifications. In contrast, HTTP follows a one-way, request-response model, ideal for retrieving periodic price updates and transaction histories. ### web3.js Build DApps using [web3.js](https://github.com/web3/web3.js) and Ethereum nodes deployed with Chainstack. 
```Text Bash npm install web3 ``` ```javascript Javascript import { Web3, HttpProvider, WebSocketProvider } from 'web3'; // Using HTTP provider const httpProvider = new Web3(new HttpProvider("/*YOUR_HTTP_CHAINSTACK_ENDPOINT*/")); httpProvider.eth.getBlockNumber().then(console.log); // Using WebSocket provider const wsProvider = new Web3(new WebSocketProvider("/*YOUR_WS_CHAINSTACK_ENDPOINT*/")); wsProvider.eth.getBlockNumber().then((blockNumber) => { console.log(blockNumber); wsProvider.currentProvider.safeDisconnect(); }); ``` ### ethers.js Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js) and Ethereum nodes deployed with Chainstack. ```Text Bash npm install ethers ``` ```javascript Javascript import { ethers } from 'ethers'; // Using HTTP provider const httpProvider = new ethers.JsonRpcProvider("/*YOUR_HTTP_CHAINSTACK_ENDPOINT*/"); httpProvider.getBlockNumber().then(console.log); // Using WebSocket provider const wsProvider = new ethers.WebSocketProvider("/*YOUR_WS_CHAINSTACK_ENDPOINT*/"); wsProvider.getBlockNumber().then((blockNumber) => { console.log(blockNumber); wsProvider.destroy(); }); ``` ### web3.py Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Ethereum nodes deployed with Chainstack. ```Text Bash pip install web3 ``` ```python Python from web3 import Web3 # Using HTTP provider http_web3 = Web3(Web3.HTTPProvider("/*YOUR_HTTP_CHAINSTACK_ENDPOINT*/")) print(http_web3.eth.block_number) # Using WebSocket provider ws_web3 = Web3(Web3.WebsocketProvider("/*YOUR_WS_CHAINSTACK_ENDPOINT*/")) print(ws_web3.eth.block_number) ``` ### Nethereum (.NET) Build DApps using [Nethereum](https://github.com/Nethereum/Nethereum) and Ethereum nodes deployed with Chainstack. 
```csharp csharp using Nethereum.JsonRpc.WebSocketClient; using Nethereum.Web3; namespace TutorialWeb3 { internal class Program { static void Main(string[] args) { // Using HTTP provider var httpClient = new Web3("/*YOUR_HTTP_CHAINSTACK_ENDPOINT*/"); var httpBlockNumber = httpClient.Eth.Blocks.GetBlockNumber.SendRequestAsync().GetAwaiter().GetResult(); Console.WriteLine($"HTTP Block Number: {httpBlockNumber}"); // Using WebSocket provider using (var wsClient = new WebSocketClient("/*YOUR_WS_CHAINSTACK_ENDPOINT*/")) { var web3Ws = new Web3(wsClient); var wsBlockNumber = web3Ws.Eth.Blocks.GetBlockNumber.SendRequestAsync().GetAwaiter().GetResult(); Console.WriteLine("WS Block Number: " + wsBlockNumber); } } } } ``` ### viem Build DApps using [viem](https://github.com/wevm/viem) and Ethereum nodes deployed with Chainstack. ```typescript TypeScript import { createPublicClient, http, webSocket } from 'viem'; import { mainnet } from 'viem/chains'; // Using HTTP provider const httpClient = createPublicClient({ chain: mainnet, transport: http("/*YOUR_HTTP_CHAINSTACK_ENDPOINT*/"), }); const blockNumber = await httpClient.getBlockNumber(); console.log(blockNumber); // Using WebSocket provider const wsClient = createPublicClient({ chain: mainnet, transport: webSocket("/*YOUR_WS_CHAINSTACK_ENDPOINT*/"), }) const wsBlockNumber = await wsClient.getBlockNumber(); console.log(wsBlockNumber); await wsClient.transport.getRpcClient().then((rpcClient) => { rpcClient.close(); }); ``` ## Querying and indexing tools ### Chainstack subgraphs (GraphQL) You can use GraphQL on [Chainstack Subgraphs](/docs/chainstack-subgraphs-tutorials) to query indexed blockchain data. You can deploy your own subgraph or use already deployed subgraphs for major DeFi applications. There are multiple tools and libraries available for that. Please check our tutorials on subgraphs. 
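As a sketch of what querying a subgraph looks like: GraphQL over HTTP is a plain POST with a JSON body. The entity and field names below are hypothetical — they depend entirely on your subgraph's schema — and the query URL is a placeholder:

```javascript
// A GraphQL query for a hypothetical subgraph that indexes token
// transfers. Entity and field names depend on your subgraph's schema.
const query = `
  {
    transfers(first: 5, orderBy: timestamp, orderDirection: desc) {
      id
      from
      to
      value
    }
  }
`;

// GraphQL over HTTP: the query (and optional variables) go in a JSON body.
function graphqlBody(query, variables = {}) {
  return JSON.stringify({ query, variables });
}

// Usage (subgraph query URL is a placeholder):
// const res = await fetch('YOUR_SUBGRAPH_QUERY_URL', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: graphqlBody(query),
// });
// const { data } = await res.json();
// console.log(data.transfers);
```

The same request body works from any HTTP client or GraphQL library; the tutorials linked above cover richer tooling.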
## Development frameworks and toolkits ### Foundry Configure [Foundry](https://github.com/foundry-rs/foundry) to deploy contracts using Chainstack Ethereum nodes. ```Text Bash curl -L https://foundry.paradigm.xyz | bash foundryup ``` Create a new Foundry project: ```Text Bash forge init my_project cd my_project ``` Create a `.env` file in your project root and add your Chainstack endpoint and private key: ``` CHAINSTACK_MAINNET_ENDPOINT=https://your-chainstack-mainnet-url CHAINSTACK_DEVNET_ENDPOINT=https://your-chainstack-devnet-url PRIVATE_KEY_MAINNET=your-mainnet-private-key PRIVATE_KEY_DEVNET=your-devnet-private-key ``` You need to load the environment variables into your shell session: ```bash Bash source .env echo $CHAINSTACK_DEVNET_ENDPOINT echo $PRIVATE_KEY_DEVNET ``` Update your `foundry.toml` file to include the Chainstack network: ```toml TOML [profile.default] src = 'src' out = 'out' libs = ['lib'] [rpc_endpoints] mainnet = "${CHAINSTACK_MAINNET_ENDPOINT}" devnet = "${CHAINSTACK_DEVNET_ENDPOINT}" ``` ```bash Bash forge build ``` ```bash Bash forge create --rpc-url ${CHAINSTACK_DEVNET_ENDPOINT} --private-key ${PRIVATE_KEY_DEVNET} src/YourContract.sol:YourContract ``` Alternatively, you can create a deployment script in the `script` folder, for example `Deploy.s.sol`: ```solidity solidity // script/Deploy.s.sol // Specify the Solidity version pragma solidity ^0.8.20; // Import the Forge script library import "forge-std/Script.sol"; // Import your contract import "../src/YourContract.sol"; contract DeployScript is Script { function run() external { // Retrieve the deployer's private key from the environment variables uint256 deployerPrivateKey = vm.envUint("PRIVATE_KEY_DEVNET"); // Change to PRIVATE_KEY_MAINNET for mainnet // Start broadcasting transactions using the deployer's private key vm.startBroadcast(deployerPrivateKey); // Deploy the contract YourContract yourContract = new YourContract(); // Stop broadcasting transactions vm.stopBroadcast(); // Log the deployed 
contract address console2.log("Contract deployed at:", address(yourContract)); } } ``` Then run the script with: ```bash Bash forge script script/Deploy.s.sol:DeployScript --rpc-url devnet --broadcast ``` ```bash Bash cast balance ADDRESS --rpc-url $CHAINSTACK_DEVNET_ENDPOINT ``` Replace `ADDRESS` with the actual Ethereum address you want to query. ### Hardhat Configure [Hardhat](https://github.com/NomicFoundation/hardhat) to deploy contracts using Chainstack Ethereum nodes. ```Text Bash npm init -y npm install --save-dev hardhat @nomiclabs/hardhat-ethers ethers dotenv ``` Create a `.env` file in your project root and add your Chainstack endpoint and private key: ```Text Bash CHAINSTACK_MAINNET_ENDPOINT=https://your-chainstack-mainnet-url CHAINSTACK_DEVNET_ENDPOINT=https://your-chainstack-devnet-url PRIVATE_KEY_MAINNET=your-mainnet-private-key PRIVATE_KEY_DEVNET=your-devnet-private-key ``` Initialize a project: ```bash Bash npx hardhat init ``` Update your `hardhat.config.js` file to use the Chainstack endpoint: ```javascript Javascript require('@nomiclabs/hardhat-ethers'); require('dotenv').config(); module.exports = { solidity: "0.8.20", // Specify the required compiler version networks: { mainnet: { url: process.env.CHAINSTACK_MAINNET_ENDPOINT, accounts: [process.env.PRIVATE_KEY_MAINNET] // Mainnet private key }, devnet: { url: process.env.CHAINSTACK_DEVNET_ENDPOINT, accounts: [process.env.PRIVATE_KEY_DEVNET] // Devnet private key } } }; ``` ```javascript Javascript const hre = require("hardhat"); async function main() { // Retrieve the deployer account from the list of signers const [deployer] = await hre.ethers.getSigners(); console.log("Deploying contracts with the account:", deployer.address); // Load the contract factory from the contracts folder const Token = await hre.ethers.getContractFactory("MyToken"); // Replace "MyToken" with your contract name // Specify the initial supply for the token (adjust as necessary) const initialSupply = hre.ethers.utils.parseUnits("1000", 18); // 
Adjust the supply and decimals as needed // Deploy the contract and wait for it to be mined const token = await Token.deploy(initialSupply); await token.deployed(); // Output the contract address and transaction hash once deployed console.log("Contract deployed at address:", token.address); console.log("Transaction hash:", token.deployTransaction.hash); } // Execute the main function and handle potential errors main() .then(() => process.exit(0)) .catch((error) => { console.error("Error during deployment:", error); process.exit(1); }); ``` ```bash Bash npx hardhat compile ``` ```bash Bash npx hardhat run scripts/deploy.js --network devnet ``` ### Scaffold-ETH 2 Configure [Scaffold-ETH 2](https://github.com/scaffold-eth/scaffold-eth-2) to deploy contracts using Chainstack Ethereum nodes. ```bash Bash git clone https://github.com/scaffold-eth/scaffold-eth-2.git cd scaffold-eth-2 ``` ```bash Bash yarn install ``` ```bash Bash touch .env.local ``` ``` CHAINSTACK_MAINNET_ENDPOINT=https://your-chainstack-mainnet-url CHAINSTACK_DEVNET_ENDPOINT=https://your-chainstack-devnet-url DEPLOYER_PRIVATE_KEY=your-private-key-here ``` Note: by default, Scaffold-ETH 2 uses Alchemy RPC nodes and assumes the config contains an Alchemy API key. We can override this by setting the Chainstack RPC nodes and updating some other configs. ```typescript TypeScript const config: HardhatUserConfig = { solidity: "0.8.20", // Ensure to specify the correct Solidity version networks: { mainnet: { url: process.env.CHAINSTACK_MAINNET_ENDPOINT || `https://eth-mainnet.alchemyapi.io/v2/${providerApiKey}`, // Use the Chainstack mainnet endpoint // accounts: [process.env.DEPLOYER_PRIVATE_KEY_MAINNET as string], // Mainnet private key }, sepolia: { url: process.env.CHAINSTACK_DEVNET_ENDPOINT || `https://eth-sepolia.g.alchemy.com/v2/${providerApiKey}`, // Use the Chainstack devnet endpoint // accounts: [process.env.DEPLOYER_PRIVATE_KEY_DEVNET as string], // Devnet private key }, // ... 
other network configurations }, }; ``` ```bash Bash cd packages/hardhat yarn hardhat compile ``` ```Text Bash yarn deploy --network sepolia ``` ```Text Bash yarn start ``` ### Truffle (no longer maintained) Configure [Truffle](https://github.com/trufflesuite/truffle) to deploy contracts using Chainstack Ethereum nodes. ```Text Bash # Initialize a new npm project npm init -y # Install Truffle as a local development dependency npm install --save-dev truffle @truffle/hdwallet-provider dotenv ``` Initialize a new Truffle project: ```bash Bash npx truffle init ``` Create a `.env` file in your project root and add your Chainstack endpoint and private key: ```bash CHAINSTACK_MAINNET_ENDPOINT=https://your-chainstack-mainnet-url CHAINSTACK_DEVNET_ENDPOINT=https://your-chainstack-devnet-url PRIVATE_KEY_MAINNET=your-mainnet-private-key PRIVATE_KEY_DEVNET=your-devnet-private-key ``` Update your `truffle-config.js` file to use the Chainstack endpoint: ```javascript Javascript require('dotenv').config(); const HDWalletProvider = require('@truffle/hdwallet-provider'); module.exports = { networks: { // Configuration for Mainnet mainnet: { provider: () => new HDWalletProvider(process.env.PRIVATE_KEY_MAINNET, process.env.CHAINSTACK_MAINNET_ENDPOINT), network_id: 1, // Mainnet's network ID }, // Configuration for Devnet devnet: { provider: () => new HDWalletProvider(process.env.PRIVATE_KEY_DEVNET, process.env.CHAINSTACK_DEVNET_ENDPOINT), network_id: 11155111, // Sepolia }, }, compilers: { solc: { version: "0.8.20", // Specify the Solidity compiler version }, }, }; ``` ```javascript Javascript // Import your contract artifact const YourContract = artifacts.require("CONTRACT_NAME"); module.exports = function (deployer) { deployer.deploy(YourContract); }; ``` ```bash Bash npx truffle compile ``` ```bash Bash npx truffle migrate --network devnet ``` ### Ape Configure [Ape Framework](https://github.com/ApeWorX/ape) to deploy contracts using Chainstack Ethereum nodes. 
Before performing the following steps, you can set up and activate a virtual environment. ```Text Bash pip install eth-ape ``` Initialize a project: ```bash Bash ape init ``` Create or update your `ape-config.yaml` file in your project root: ```yaml yaml name: ape-1 contracts_folder: contracts # Default is contracts default_ecosystem: ethereum # Default is ethereum node: ethereum: mainnet: uri: CHAINSTACK_MAINNET_ENDPOINT sepolia: uri: CHAINSTACK_SEPOLIA_ENDPOINT dependencies: - name: OpenZeppelin github: OpenZeppelin/openzeppelin-contracts # Example version: 4.4.2 ``` ```bash Bash ape accounts import devnet_deployer ``` ```python Python from ape import accounts, project def main(): # Load the account for deployment # Change 'devnet_deployer' to 'mainnet_deployer' when deploying to mainnet account = accounts.load("devnet_deployer") # Deploy the contract contract = account.deploy(project.MyContract) # Replace 'MyContract' with your contract's name and add its arguments # Output the contract address print(f"Contract deployed at: {contract.address}") # Output the transaction hash print(f"Transaction hash: {contract.txn_hash.hex()}") if __name__ == "__main__": main() ``` ```bash Bash ape compile ``` ```Text Bash ape run scripts/deploy.py --network ethereum:sepolia ``` ### Brownie (no longer maintained) Configure [Brownie](https://github.com/eth-brownie/brownie) to deploy contracts using Chainstack Ethereum nodes. 
```bash python -m venv venv source venv/bin/activate # On Windows, use venv\Scripts\activate pip install eth-brownie ``` Create a new Brownie project (add `--force` if a folder is not empty): ```Text Bash brownie init ``` In your project directory, create a `.env` file and add your Chainstack endpoint and private key: ``` CHAINSTACK_MAINNET_ENDPOINT=https://your-chainstack-mainnet-url CHAINSTACK_DEVNET_ENDPOINT=https://your-chainstack-devnet-url PRIVATE_KEY_MAINNET=your-mainnet-private-key PRIVATE_KEY_DEVNET=your-devnet-private-key ``` Ensure that your terminal session loads these environment variables. You can manually source the .env file: ```bash Bash source .env ``` Update the `brownie-config.yaml` file in your project root: ```yaml yaml dotenv: .env networks: default: development chainstack-mainnet: host: ${CHAINSTACK_MAINNET_ENDPOINT} chainid: 1 explorer: https://etherscan.io chainstack-devnet: host: ${CHAINSTACK_DEVNET_ENDPOINT} chainid: 1337 # Example chain ID for devnet; adjust if different ``` Add a custom network to Brownie's network list: ```bash Bash brownie networks add Ethereum chainstack-mainnet host=${CHAINSTACK_MAINNET_ENDPOINT} chainid=1 brownie networks add Ethereum chainstack-devnet host=${CHAINSTACK_DEVNET_ENDPOINT} chainid=1337 ``` To use your private key for deployments, you can create a `scripts/deploy.py` file: ```python Python import os from brownie import accounts, Counter # Replace 'Counter' with your contract's class name def main(): # Load the deployer's account using the private key from the environment variable acct = accounts.add(os.getenv("PRIVATE_KEY_DEVNET")) # Use PRIVATE_KEY_MAINNET for mainnet # Deploy the contract (Replace 'Counter' with the actual name of your contract) contract = Counter.deploy({'from': acct}) # Output the contract address and transaction hash print(f"Contract deployed at: {contract.address}") print(f"Transaction hash: {contract.tx.txid}") ``` ```Text Bash brownie run scripts/deploy.py --network 
chainstack-devnet ``` # Ethereum Trader Nodes with Warp transactions Source: https://docs.chainstack.com/docs/ethereum-trader-nodes Warp transactions propagate through [bloXroute's high-speed transaction relay network](https://docs.bloxroute.com/bdn-architecture). This makes your transactions available for validators to pick up and include in blocks much faster than regular propagation. Each Warp transaction consumed (not request, just the transaction) is billed separately. For details, see [Pricing](https://chainstack.com/pricing/). ## Benefits with Chainstack Chainstack Trader nodes with Warp transactions combine reliable nodes for pre-transaction operations with bloXroute's high-speed propagation for the transaction itself. Switching to a Chainstack Trader node is as simple as changing the endpoint in your code. All calls go through Chainstack's infrastructure, except for `eth_sendRawTransaction`, which is routed directly to bloXroute. ## Sample use case 1. Liquidity for a token is deployed on Ethereum. 2. You receive a signal through your Ethereum node by monitoring the Uniswap contracts. 3. Your bot acts on the signal, sending a transaction via the Chainstack Trader node with Warp transactions enabled. 4. The transaction is quickly propagated, increasing the chances of winning the race. ## Availability and usage * Available from the [paid plans](https://chainstack.com/pricing). * Warp transactions (not requests, just the transactions) consumed are billed separately. For details, see [Pricing](https://chainstack.com/pricing/) with the [pay-as-you-go](/docs/manage-your-billing) setting enabled. * Only `eth_sendRawTransaction` calls are consumed as Warp transactions. * Deploy nodes close to your bot for best performance. See also [Sending Trader node warp transaction with web3.js, ethers.js, web3.py, and ethClient.go](/docs/sending-warp-transaction-with-web3js-ethersjs-web3py-and-ethclientgo). 
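To make the single-endpoint model concrete, the routing rule can be modeled as a toy function. This is purely illustrative — the real routing happens on Chainstack's side, and your code keeps using one endpoint:

```javascript
// Conceptual model of the Trader node routing rule: only raw
// transaction broadcasts take the Warp (bloXroute) path; every other
// JSON-RPC call is served by regular Chainstack infrastructure.
function routeMethod(method) {
  return method === 'eth_sendRawTransaction' ? 'warp-relay' : 'chainstack-node';
}

// Pre-transaction reads and simulations stay on the node...
console.log(routeMethod('eth_getBlockByNumber')); // → chainstack-node
console.log(routeMethod('eth_call'));             // → chainstack-node
// ...while the broadcast itself is what gets Warp-propagated (and billed).
console.log(routeMethod('eth_sendRawTransaction')); // → warp-relay
```

This split is also why only `eth_sendRawTransaction` calls count toward Warp transaction billing.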
# Ethereum: Academic Certificates with Truffle Source: https://docs.chainstack.com/docs/ethereum-tutorial-academic-certificates-with-truffle **TLDR:** * You’ll build and deploy a contract-based DApp that generates and verifies academic certificates on Ethereum. * You’ll use Truffle locally to create, test, and compile your contracts. * You’ll integrate with Chainstack to run your own Ethereum Sepolia node and migrate your contracts. * By the end, you’ll have a fully working certificate-issuing DApp ready for a testnet deployment. ## Main article In this tutorial, you will: * Create a DApp that generates an academic certificate. * Deploy the DApp on a public Ethereum node using Chainstack. The contract and the Truffle configuration are in the [GitHub repository](https://github.com/chainstack/ethereum-certificates-tutorial). ## Prerequisites * [Chainstack account](https://console.chainstack.com/user/login) to deploy an Ethereum node * [Truffle Suite](https://trufflesuite.com/docs/truffle/how-to/install/) to create and deploy contracts ## Overview To get from zero to a deployed DApp on the Ethereum Sepolia testnet, do the following: With Chainstack, create a public chain project. With Chainstack, join the Ethereum Sepolia testnet. With Chainstack, access your Ethereum node credentials. With Truffle, create and compile the DApp contract. With Truffle, deploy the contract to your local development network. With Truffle, interact with the contract on your local development network. With Truffle, create and run the contract test. With Truffle, deploy the contract to your Ethereum node running with Chainstack. ## Step-by-step ### Create a public chain project See [Create a project](/docs/manage-your-project#create-a-project). ### Join the Ethereum Sepolia testnet See [Join a public network](/docs/manage-your-networks#join-a-public-network). 
### Get your Ethereum node access and credentials See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials). ### Get Sepolia testnet ether from a faucet In your MetaMask, fund each account with Sepolia ether from our [Sepolia Ethereum faucet](https://faucet.chainstack.com). ### Create and compile the contracts On your machine, create a directory for the contract. Initialize Truffle in the directory: ```shell Shell truffle init ``` This will generate the Truffle boilerplate structure: ```shell Shell . ├── contracts │ └── .gitkeep ├── migrations │ └── .gitkeep ├── test │ └── .gitkeep └── truffle-config.js ``` Go to the `contracts` directory. In the directory, create two files: `Ownable.sol` and `DocStamp.sol`. ```sol Ownable.sol //SPDX-License-Identifier:MIT // Ownable.sol pragma solidity ^0.5.0; contract Ownable { address public owner; event OwnershipTransferred(address indexed previousOwner, address indexed newOwner); constructor() public { owner = msg.sender; } modifier onlyOwner() { require(msg.sender == owner); _; } function transferOwnership(address newOwner) onlyOwner public { require(newOwner != address(0)); emit OwnershipTransferred(owner, newOwner); owner = newOwner; } } ``` This is an ownable contract. The contract implementation is the following: * Only an authority can generate a certificate. On contract deployment, the authority is the account that deploys the contract. The authority is the contract owner. * The contract owner can transfer their authority. 
```sol DocStamp.sol //SPDX-License-Identifier:MIT // DocStamp.sol pragma solidity ^0.5.0; import './Ownable.sol'; contract DocStamp is Ownable { mapping (bytes32 => address) public records; event CertificateIssued(bytes32 indexed record, uint256 timestamp, bool returnValue); function issueCertificate(string calldata name, string calldata details) external onlyOwner { bytes32 certificate = keccak256(abi.encodePacked(name, details)); require(certificate != keccak256(abi.encodePacked(""))); records[certificate] = msg.sender; emit CertificateIssued(certificate, block.timestamp, true); } function owningAuthority() external view returns (address) { return owner; } function verifyCertificate(string calldata name, string calldata details, bytes32 certificate) external view returns (bool) { bytes32 certificate2 = keccak256(abi.encodePacked(name, details)); // are certificates the same? if (certificate == certificate2) { // does the certificate exist on the blockchain? if (records[certificate] == owner) { return true; } } return false; } } ``` This is the main contract. The contract handles the generation and verification of certificates. * `issueCertificate()` — generates a certificate by calculating a hash of the student name and details. * Can be called only by the owner. * Emits a certificate generation event with the timestamp. * The issuer puts the certificate on the blockchain by storing it in the `records` mapping with `records[certificate] = msg.sender`. * `owningAuthority()` — returns the address of the issuer/authority. * `verifyCertificate()` — calculates a hash of the student name and details, and checks if the certificate is on the blockchain. * Can be called by anyone. Create `2_deploy_contracts.js` in the `migrations` directory. 
```javascript Javascript var DocStamp = artifacts.require("./DocStamp.sol"); module.exports = function(deployer) { deployer.deploy(DocStamp); }; ``` ### Deployment details Since DocStamp inherits from Ownable, Ownable will be deployed together with DocStamp. Compile the contracts: ```shell Shell truffle compile ``` This will compile the contracts and put them in your `build/contracts` directory in the `.json` format. If the contract does not compile, check the compiler version in your `truffle-config.js` file and ensure that your compiler version matches the pragma solidity version of the contract. ### Deploy the contract to your local development network Start the development network on your machine: ```shell Shell truffle develop ``` Without exiting the Truffle console, deploy the contract to the local development network: ```shell Shell truffle(develop)> migrate ``` This will deploy the contract to the development network as specified in the `truffle-config.js`. ### Interact with the contract on your local development network In your Truffle console, create an instance of the deployed contract: ```javascript Javascript let instance = await DocStamp.deployed() ``` You can run `instance` to see the contract object ABI, bytecode, and methods. Declare the contract owner: ```javascript Javascript let owner = await instance.owningAuthority() ``` You can run `owner` to see the account that deployed the contract and owns the contract. Issue the certificate: ```javascript Javascript let result = await instance.issueCertificate("John", "graduate", {from: owner}) ``` This issues the certificate. Run `result.logs` to view the full certificate details. ### Getting certificate details Running `result` will not print the certificate details in the Truffle console. You must run `result.logs`. See also [Processing transaction results](https://www.trufflesuite.com/docs/truffle/getting-started/interacting-with-your-contracts#processing-transaction-results). 
Example output: ```shell Shell logIndex: 0, transactionIndex: 0, transactionHash: '0xb3ef241d76bd4d3a3d92ad4fd382785589033a4f561baa2895136a3315b3561b', blockHash: '0x29343b9fc5b88bb8c85287463a37a00e8fecce36553880365ca5395d9fb18eeb', blockNumber: 7, address: '0x3113Aa54D455142a254b43b83FB16c18eD30ba33', type: 'mined', id: 'log_dbbbec7e', event: 'CertificateIssued', args: Result { '0': '0x837e31a66aa8eec0d7adfd41f84175803ddcae69afd451598f2672f652b2c153', '1': [BN], '2': true, __length__: 3, record: '0x837e31a66aa8eec0d7adfd41f84175803ddcae69afd451598f2672f652b2c153', timestamp: [BN], returnValue: true ``` Note the `record` value in the output. This is the hash of the certificate values: name and details. You will need this hash to create the contract test later in this tutorial. Run the certificate verification: ```javascript Javascript let verified = await instance.verifyCertificate("NAME", "DETAILS", "CERTIFICATE_HASH", {from: owner}) ``` where * NAME — the student name on the certificate used on the issuing step. * DETAILS — any details * CERTIFICATE\_HASH — the hash of DETAILS and NAME. You should have received this hash in the record field at the previous step by running `result.logs`. Example: ```javascript Javascript let verified = await instance.verifyCertificate("John", "graduate", "0x837e31a66aa8eec0d7adfd41f84175803ddcae69afd451598f2672f652b2c153", {from: owner}) ``` Running `verified` will print `true` if there is a match, and `false` if there is no match. ### Test the contract Navigate to the `test` directory. 
Create a `test.js` file: ```javascript Javascript const DocStamp = artifacts.require('./DocStamp.sol') contract('DocStamp', function(accounts) { it('should issue a certificate', async function() { const account = accounts[0] try { const instance = await DocStamp.deployed() await instance.issueCertificate("NAME", "DETAILS") const authority = await instance.owningAuthority() assert.equal(authority, account) } catch(error) { assert.equal(error, undefined) } }) it('should verify a certificate', async function() { const account = accounts[0] try { const instance = await DocStamp.deployed() const verified = await instance.verifyCertificate("NAME", "DETAILS", "CERTIFICATE_HASH") assert.equal(verified, true) } catch(error) { assert.equal(error, undefined) } }) }) ``` where * NAME — the student name on the certificate used on the issuing step. * DETAILS — any details * CERTIFICATE\_HASH — the hash of DETAILS and NAME. You should have received this hash in the record field at one of the previous steps by running `result.logs`. Example: ```javascript Javascript const DocStamp = artifacts.require('./DocStamp.sol') contract('DocStamp', function(accounts) { it('should issue a certificate', async function() { const account = accounts[0] try { const instance = await DocStamp.deployed() await instance.issueCertificate("John", "graduate") const authority = await instance.owningAuthority() assert.equal(authority, account) } catch(error) { assert.equal(error, undefined) } }) it('should verify a certificate', async function() { const account = accounts[0] try { const instance = await DocStamp.deployed() const verified = await instance.verifyCertificate("John", "graduate", "0x837e31a66aa8eec0d7adfd41f84175803ddcae69afd451598f2672f652b2c153") assert.equal(verified, true) } catch(error) { assert.equal(error, undefined) } }) }) ``` Run the test: ```shell Shell truffle test ``` The test run output should be `Passing`. 
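Rather than hard-coding CERTIFICATE\_HASH, you may prefer to pull the record hash out of the issuing transaction's result programmatically. A minimal, illustrative helper that assumes the `result.logs` shape shown earlier:

```javascript
// Given the logs array from a Truffle transaction result, find the
// CertificateIssued event and return its record hash (or null).
function extractRecord(logs) {
  const issued = logs.find((log) => log.event === 'CertificateIssued');
  return issued ? issued.args.record : null;
}

// Example with the shape shown in the output above:
const logs = [{
  event: 'CertificateIssued',
  args: { record: '0x837e31a66aa8eec0d7adfd41f84175803ddcae69afd451598f2672f652b2c153' },
}];
console.log(extractRecord(logs));
```

In a test, you would call `issueCertificate()`, run `extractRecord(result.logs)`, and feed the returned hash straight into `verifyCertificate()`.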
### See also [Truffle: Writing Tests in JavaScript](https://trufflesuite.com/docs/truffle/how-to/debug-test/write-tests-in-javascript/) ### Deploy the contract to your Ethereum node Install `HDWalletProvider`. [HDWalletProvider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider) is Truffle's separate npm package used to sign transactions. Run: ```shell Shell npm install @truffle/hdwallet-provider ``` Edit `truffle-config.js` to add: * HDWalletProvider * Your Ethereum node access and credentials. Example: ```javascript Javascript const HDWalletProvider = require("@truffle/hdwallet-provider"); const mnemonic = "misery walnut expose ..."; module.exports = { networks: { development: { host: "127.0.0.1", port: 9545, network_id: "5777" }, sepolia: { provider: () => new HDWalletProvider(mnemonic, "YOUR_CHAINSTACK_ENDPOINT"), network_id: 11155111, gas: 4500000, gasPrice: 10000000000 } } }; ``` where * `sepolia` — any network name that you will pass to the `truffle migrate --network` command. * `HDWalletProvider` — Truffle's custom provider to sign transactions. * `mnemonic` — your mnemonic that generates your accounts. You can also generate a mnemonic online with [Mnemonic Code Converter](https://iancoleman.io/bip39/). Make sure you generate a 15-word mnemonic. * YOUR\_CHAINSTACK\_ENDPOINT — your Chainstack node endpoint. See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials) and [Ethereum tooling](/docs/ethereum-tooling). * `network_id` — the Ethereum Sepolia testnet network ID: `11155111`. Run: ```shell Shell truffle migrate --network sepolia ``` This will engage `2_deploy_contracts.js` and deploy the contract to the Ethereum Sepolia testnet as specified in `truffle-config.js`. ### Get testnet ether You must get the Sepolia testnet ether to deploy the contract to the testnet. 
See also: * [Ethereum: Asset tokenization with Embark](/docs/ethereum-tutorial-asset-tokenization-with-embark) * [Ethereum: Trust fund account with Remix](/docs/ethereum-tutorial-trust-fund-account-with-remix) ### About the author Director of Developer Experience @ Chainstack Talk to me all things Web3 20 years in technology | 8+ years in Web3 full time Trusted advisor helping developers navigate the complexities of blockchain infrastructure [](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake) # Ethereum: Asset Tokenization with Embark Source: https://docs.chainstack.com/docs/ethereum-tutorial-asset-tokenization-with-embark * You’ll build a simple contract in Embark that tokenizes assets with a fixed supply of 1,000 tokens. * You’ll set a price of 0.1 ether per token and let anyone exchange ether for asset tokens on Sepolia. * You’ll use Chainstack to deploy and run your own Sepolia node, and Geth to manage your accounts. * You’ll confirm everything works by testing the contract with Embark’s Cockpit and by sending ether from MetaMask. ## Main article This tutorial will guide you through creating a tokenized asset contract and deploying it on the Sepolia testnet. For illustration purposes, this contract does the following: * Creates a total supply of 1,000 tokens specific to the asset. * Sets the token price to 0.1 ether. * Lets anyone exchange ether for asset tokens. This tutorial uses [Embark](https://framework.embarklabs.io/) to test and deploy the contract. ## Prerequisites * [Chainstack account](https://console.chainstack.com/) to deploy a Sepolia testnet node. * [Embark](https://framework.embarklabs.io/) to test and deploy the contract. * [Geth](https://geth.ethereum.org/) to create an Ethereum account that will deploy the contract. * [MetaMask](https://metamask.io/) to interact with the contract. 
## Overview

To get from zero to an asset tokenization contract running on Sepolia, do the following:

1. With Chainstack, create a public chain project.
2. With Chainstack, deploy a Sepolia testnet node.
3. With Embark, create a project and the contract.
4. With Geth, create a new account.
5. Import the account in MetaMask and fund the account with Sepolia ether.
6. With Embark, deploy the contract through the Sepolia node.
7. With Embark, check the contract with Cockpit.
8. Interact with the contract through MetaMask.

## Step-by-step

### Create a public chain project

See [Create a project](/docs/manage-your-project#create-a-project).

### Deploy a Sepolia node

See [Join a public network](/docs/manage-your-networks#join-a-public-network).

### Get Sepolia testnet ether from a faucet

In your MetaMask, fund each account with Sepolia ether using [Chainstack's Sepolia faucet](https://faucet.chainstack.com).

### Create an Embark project and the contract

Create a new Embark project:

```bash Shell
embark new asset
```

This will create an Embark directory called `asset`. Change to the `contracts` directory of the Embark project. Create an `AssetTokenized.sol` file in the `contracts` directory:

```solidity solidity
pragma solidity =0.5.0;

contract AssetTokenized {
  uint public supply;
  uint public pricePerEth;
  mapping(address => uint) public balance;

  constructor() public {
    supply = 1000;                    // There are a total of 1,000 tokens for this asset
    pricePerEth = 100000000000000000; // One token costs 0.1 ether
  }

  function check() public view returns(uint) {
    return balance[msg.sender];
  }

  function () external payable {
    balance[msg.sender] += msg.value/pricePerEth; // credits asset tokens in proportion to the ether sent
    supply -= msg.value/pricePerEth;              // subtracts the purchased tokens from the total supply
  }
}
```

### Create an Ethereum account

You will use this account to deploy the contract.
Create the account:

```bash Shell
geth account new
```

Check the path to the keystore file created for the new account:

```bash Shell
geth account list
```

### Import the account in MetaMask and fund the account

In MetaMask, click **Import Account** > **JSON File**. Select the keystore file that you created earlier. Fund the account with [Sepolia ether](https://faucet.paradigm.xyz/).

### Deploy the contract

In your Embark project directory, change to `config`. Append `contracts.js` with the following configuration:

```js JavaScript
chainstack: {
  deployment: {
    accounts: [
      {
        privateKeyFile: "//root/.ethereum/keystore/UTC--2019-08-01T07-24-17.754471456Z--73236c8d8aaee5263e8a32c71171030dd7a3e8e6",
        password: "PASSWORD"
      }
    ],
    host: "YOUR_CHAINSTACK_ENDPOINT",
    port: false,
    protocol: "https",
    type: "rpc"
  },
  dappConnection: [
    "$WEB3", // uses pre-existing Web3 object if available (e.g. in Mist)
    "ws://localhost:8546",
    ""
  ],
  gas: "auto",
}
```

where

* `//root/.ethereum/keystore/...` — the location of the keystore file.
* PASSWORD — the password you provided when creating the Ethereum account with Geth.
* YOUR\_CHAINSTACK\_ENDPOINT — your Chainstack node endpoint. See also [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials).

Deploy the contract with Embark:

```bash Shell
embark run chainstack
```

where `chainstack` is the argument from the configuration file `contracts.js`. This will deploy the contract on Sepolia.

### Check the contract with Cockpit

On contract deployment, Embark runs Cockpit, which is a front-end application to test the contract. In your browser, open: `` where CONTRACT\_NAME — the name of your contract. In this tutorial, the path is ``. This will also display the contract address in the `Deployed at` line.

Test the contract by calling:

* `supply()` — to check the remaining supply of tokens on the contract.
* `check()` — to check the amount of tokens owned by the Ethereum address you are using to call the contract.
* `pricePerEth()` — to check the token price in wei.

### Interact with the contract

1. In MetaMask, send an amount of Sepolia ether to the contract address.
2. In Cockpit, call the contract functions `supply()` and `check()` after a few seconds to see a change in the values returned.

### See also

# Ethereum: Trust Fund Account with Remix

Source: https://docs.chainstack.com/docs/ethereum-tutorial-trust-fund-account-with-remix

**TLDR:**

* You’ll build a trust fund contract that can accept deposits from anyone, but only the owner can withdraw.
* Ownership can be transferred to another address, making it useful for simple DeFi scenarios.
* You’ll rely on Chainstack for an Ethereum Sepolia node, and Remix + MetaMask for contract compilation and deployment.
* By the end, you’ll have a fully functional trust fund contract deployed to Sepolia and ready to receive or withdraw ether.

## Main article

Unlike legacy finance systems, where you need to rely on a well-established third party, you can build your own financial instrument on Ethereum. The objective of this tutorial is to show how easy it is to build and run your own instance of a simple decentralized finance (DeFi) example.

In this tutorial, you will:

* Create a basic Trust Fund account smart contract with the following interaction options:
  * Fund the account from any Ethereum address
  * Withdraw all funds from the account only from the account owner's address
  * Withdraw partial funds from the account only from the account owner's address
  * Transfer the account ownership to any Ethereum address
* Compile the smart contract with Remix IDE.
* Deploy the smart contract to the Ethereum Sepolia testnet through a Chainstack node.
* Interact with the smart contract through Remix IDE and a Chainstack node.

### Ethereum Sepolia testnet

For illustration purposes, this tutorial uses the Ethereum Sepolia testnet. For the Ethereum mainnet, the steps are exactly the same, except you need to use mainnet ether.

## Prerequisites

* [Chainstack account](https://console.chainstack.com/) to deploy an Ethereum node.
* [Remix IDE](https://remix.ethereum.org/) to compile the contract and deploy it through MetaMask.
* [MetaMask](https://metamask.io/) to deploy the contract through the Chainstack node and interact with the contract.

## Overview

To get from zero to a deployed Trust Fund account on the Ethereum Sepolia testnet, do the following:

1. With Chainstack, create a public chain project.
2. With Chainstack, join the Ethereum Sepolia testnet.
3. With Chainstack, access your Ethereum node credentials.
4. Set up your MetaMask to work through a Chainstack node.
5. With Remix IDE, create and compile the Trust Fund smart contract.
6. Set up your Remix IDE to work through a Chainstack node.
7. With Remix IDE, deploy the contract to the Ethereum Sepolia testnet.
8. With Remix IDE, interact with the contract on the Ethereum Sepolia testnet.

## Step-by-step

### Create a public chain project

See [Create a project](/docs/manage-your-project#create-a-project).

### Join the Ethereum Sepolia testnet

See [Join a public network](/docs/manage-your-networks#join-a-public-network).

### Get your Ethereum node access and credentials

See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials).

### Set up MetaMask

See [Ethereum tooling: MetaMask](/docs/ethereum-tooling#metamask).

Having set up your MetaMask to interact through a Chainstack node, your Remix IDE will also interact with the network through a Chainstack node. Create at least two accounts in MetaMask.
You need two accounts to transfer the contract ownership from one to another. In your MetaMask, fund each account with Sepolia ether using [Chainstack's Sepolia faucet](https://faucet.chainstack.com).

### Create and compile the Trust Fund smart contract

Open [Remix IDE](https://remix.ethereum.org/). On the home page, click **Environments** > **Solidity**. On the left pane, click **File explorers** > **+**. In the modal, give any name to your contract. For example, `transferableTrustFund.sol`. Put in the contract code:

```sol transferableTrustFund.sol
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

/**
 * @title TransferableTrustFundAccount
 * @dev A trust fund account that can be transferred to a new owner
 */
contract TransferableTrustFundAccount {
    address payable public owner; // The current owner of the contract

    /**
     * @dev Modifier to restrict access to owner-only functions
     */
    modifier onlyOwner() {
        require(msg.sender == owner, "Caller is not the owner");
        _;
    }

    /**
     * @dev Initializes the contract by setting the initial owner
     */
    constructor() {
        owner = payable(msg.sender);
    }

    /**
     * @dev Withdraws the entire contract balance to the owner
     */
    function withdrawAll() public onlyOwner {
        uint256 balance = address(this).balance;
        require(balance > 0, "Insufficient funds");
        owner.transfer(balance);
    }

    /**
     * @dev Withdraws a specified amount to the owner
     * @param amount The amount to withdraw
     */
    function withdrawAmount(uint256 amount) public onlyOwner {
        uint256 balance = address(this).balance;
        require(balance >= amount, "Insufficient funds");
        owner.transfer(amount);
    }

    /**
     * @dev Transfers ownership of the contract to a new account
     * @param newOwner The address of the new owner
     */
    function transferAccount(address payable newOwner) public onlyOwner {
        require(newOwner != address(0), "Invalid address");
        owner = newOwner;
    }

    /**
     * @dev Gets the current balance of the contract
     * @return The current balance of the contract
     */
    function getBalance() public view returns (uint256) {
        return
        address(this).balance;
    }

    /**
     * @dev Gets the current owner of the contract
     * @return The address of the current owner
     */
    function getOwner() public view returns (address) {
        return owner;
    }

    /**
     * @dev Fallback function to accept incoming transfers
     */
    receive() external payable {}
}
```

This is your Trust Fund account smart contract:

* The contract belongs to the Ethereum account that deploys the contract through:

  ```sol transferableTrustFund.sol
  contract TransferableTrustFundAccount {
      address payable public owner;
      ...
  }
  ```

* The `onlyOwner` modifier is used to restrict access to certain functions in the smart contract to only the owner of the contract. A modifier is like a wrapper that can be applied to a function, and it modifies the behavior of the function in some way:

  ```sol transferableTrustFund.sol
  modifier onlyOwner() {
      require(msg.sender == owner, "Caller is not the owner");
      _;
  }
  ```

* Only the contract owner can withdraw all funds from the contract through:

  ```sol transferableTrustFund.sol
  function withdrawAll() public onlyOwner {
      uint256 balance = address(this).balance;
      require(balance > 0, "Insufficient funds");
      owner.transfer(balance);
  }
  ```

* Only the contract owner can withdraw partial funds from the contract through:

  ```sol transferableTrustFund.sol
  function withdrawAmount(uint256 amount) public onlyOwner {
      uint256 balance = address(this).balance;
      require(balance >= amount, "Insufficient funds");
      owner.transfer(amount);
  }
  ```

* Anyone can deposit funds to the contract through:

  ```sol transferableTrustFund.sol
  receive() external payable {}
  ```

* Only the contract owner can transfer the contract ownership to any other Ethereum account through:

  ```sol transferableTrustFund.sol
  function transferAccount(address payable newOwner) public onlyOwner {
      require(newOwner != address(0), "Invalid address");
      owner = newOwner;
  }
  ```

* Retrieve the current balance on the smart contract and the current owner with:

  ```sol transferableTrustFund.sol
  function getBalance()
  public view returns (uint256) {
      return address(this).balance;
  }

  function getOwner() public view returns (address) {
      return owner;
  }
  ```

Compile the contract. On the left pane, click **Solidity compiler** > **Compile**.

### Compilation failed?

If the contract does not compile, make sure to change the compiler version to 0.8.0 or higher.

### Set up Remix IDE to work through your Chainstack node

On the left pane, click **Deploy** and switch to **Injected Provider - MetaMask**. This will engage your MetaMask and interact with the network through the Chainstack node provided in MetaMask.

### Deploy the Trust Fund smart contract

On the left pane, click **Deploy & run transactions** > **Deploy**. This will engage your MetaMask to deploy the contract to the Ethereum Sepolia testnet through your currently selected MetaMask account. Click **Confirm** in the MetaMask modal.

### Interact with the Trust Fund smart contract

Once the contract is deployed, fund the contract:

1. Copy the contract address under **Deployed Contracts**.
2. Open your MetaMask and send Sepolia ether to the copied contract address.

Now that your contract is funded, you can interact with it. Expand the contract under **Deployed Contracts**:

* `withdrawAmount` — enter any amount that is less than the current contract balance to withdraw partial funds.
* `withdrawAll` — click to withdraw all funds from the contract.
* `transferAccount` — enter any Ethereum address to transfer the contract ownership. For this example, enter the address of your second account in MetaMask.
* `getBalance` — click to display the current ether balance in the smart contract; the balance is shown in wei.
* `getOwner` — click to display the address currently owning the Trust Fund smart contract.
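Since `getBalance` reports the balance in wei, you may want to convert it for display when you work with the contract outside Remix. A dependency-free JavaScript sketch of the conversion (the helper name is illustrative):

```javascript
// Convert a wei balance (as returned by getBalance) to a human-readable ether string
function weiToEther(wei) {
  const ETHER = 10n ** 18n; // 1 ether = 10^18 wei
  const whole = wei / ETHER;
  // Pad the remainder to 18 digits, then drop trailing zeros
  const frac = (wei % ETHER).toString().padStart(18, "0").replace(/0+$/, "");
  return frac ? `${whole}.${frac}` : `${whole}`;
}

console.log(weiToEther(1500000000000000000n)); // "1.5"
console.log(weiToEther(2000000000000000000n)); // "2"
```

BigInt keeps the arithmetic exact; converting wei through floating-point `Number` would lose precision for large balances.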
# ethers.js ChainstackProvider: How to BUIDL a multi-chain wallet balance aggregator

Source: https://docs.chainstack.com/docs/ethersjs-chainstackprovider-how-to-multi-chain-wallet-balance-aggregator

**TLDR**

* `ChainstackProvider` in ethers.js lets you connect to multiple blockchains (Ethereum, Polygon, Arbitrum, etc.) with minimal setup.
* This Next.js DApp demonstrates fetching wallet balances from multiple chains via a simple API endpoint using ChainstackProvider for each network.
* Keeping providers on the backend helps protect your endpoints, with the frontend only calling your API endpoint.
* Easily scale to more EVM chains or extend functionality by adjusting the provider config and adding UI elements.

## Main article

ethers.js recently added support for `ChainstackProvider`. This allows you to leverage Chainstack nodes in ethers out of the box. This tutorial will guide you through setting up a Next.js project, integrating ethers.js with `ChainstackProvider`, and building a fully functional DApp that retrieves wallet balances from multiple blockchain networks.

Learn more about `ChainstackProvider` and the supported chains in the [ethers ChainstackProvider documentation](/reference/ethersjs-chainstackprovider).

### What is ethers.js?

ethers.js is a JavaScript library designed to interact with EVM blockchains. It provides a lightweight and easy-to-use API for connecting to various blockchains, making transactions, interacting with smart contracts, and handling cryptographic functions.
Due to its simplicity, flexibility, and robust features, ethers.js is popular among developers, making it a go-to choice for building Ethereum-based applications.

## Prerequisites

Spinning up a DApp with `ChainstackProvider` is so simple that you only need an IDE and Node.js installed. Install Node.js from the [official website](https://nodejs.org/en) if you don't have it yet.

Find the repository for this DApp on the [Chainstack Labs GitHub](https://github.com/chainstacklabs/ethers-chainstackprovider-demo-nextjs) and see a functioning and deployed version of the DApp: [Multi-chain wallet balance aggregator DApp](https://ethers-chainstackprovider-demo-nextjs.vercel.app/)

At this stage, let's go ahead and set up the Next.js project; create a new directory for your DApp and run the command to initialize Next.js:

```
npx create-next-app@14.2.3
```

This project uses Next.js 14; select the following options during the setup:

```
✔ What is your project named? … balances
✔ Would you like to use TypeScript? … No
✔ Would you like to use ESLint? … Yes
✔ Would you like to use Tailwind CSS? … Yes
✔ Would you like to use `src/` directory? … Yes
✔ Would you like to use App Router? (recommended) … Yes
? Would you like to customize the default import alias (@/*)? › No
```

Then, move into the newly created project and install the ethers library. To have access to `ChainstackProvider`, you need at least ethers v6.12.0:

```
npm i ethers
```

At this point, you are ready to code.

## How to use ethers.js' `ChainstackProvider`?

Let's briefly talk about `ChainstackProvider`. It provides a very convenient way to spin up DApps and prototypes, but keep in mind that the default endpoints are heavily throttled and will not be suitable for a production environment.
The good news is that `ChainstackProvider` supports your own Chainstack nodes as well; you just need to deploy a [Global Node](/docs/global-elastic-node) from the Chainstack console and initialize the provider with the node authorization key. Follow these steps to sign up on Chainstack, deploy a node, and find your endpoint credentials.

You must deploy a Global Node to use the authorization key in `ChainstackProvider`. Once deployed, your node RPC URL will look like this:

> [https://ethereum-mainnet.core.chainstack.com/AUTH\_KEY](https://ethereum-mainnet.core.chainstack.com/AUTH_KEY)

Now you can add the `AUTH_KEY` to the `ChainstackProvider` instance:

```javascript ethers
const ethers = require("ethers");

// Create a ChainstackProvider instance for Ethereum mainnet
const chainstack = new ethers.ChainstackProvider("mainnet", "AUTH_KEY");
```

Check out the supported chains in the [ethers ChainstackProvider documentation](/reference/ethersjs-chainstackprovider).

## Project overview

### Core DApp functionality

1. **Address input**: Users enter an Ethereum wallet address into a text field.
2. **Fetch balances**: Upon submitting the address, the DApp communicates with a backend API to fetch the wallet balances from various blockchain networks.
3. **Display results**: The DApp displays the retrieved balances in a user-friendly format.

To achieve this functionality, you work with two main components:

1. **Frontend UI (`page.js`)**:
   * This file handles the user interface, including the input field for the wallet address and the button to trigger the balance check.
   * It also displays the fetched balances and handles loading and error states.
2. **Backend API (`route.js`)**:
   * This file manages the logic for connecting to your blockchain networks using ChainstackProvider.
   * It defines an API endpoint the frontend can call to fetch the balances.
* Since you use the backend to communicate with a chain, you don't expose any endpoints on the front end, preserving good security practices.

Focusing on these two components streamlines the development process and demonstrates the simplicity of integrating ethers.js with `ChainstackProvider` in a Next.js project.

## The front-end code

Let's first work on the UI. Once your project is initialized, go to `...src/app/page.js` and paste the following code:

```javascript page.js
"use client";
import { useState } from "react";
import { ethers } from "ethers";

export default function Home() {
  const [walletAddress, setWalletAddress] = useState("");
  const [balances, setBalances] = useState(null);
  const [loading, setLoading] = useState(false);
  const [error, setError] = useState(null);

  const handleCheckBalances = async () => {
    if (!ethers.isAddress(walletAddress)) {
      setError("Invalid Ethereum address");
      return;
    }
    setLoading(true);
    setError(null);
    setBalances(null); // Clear previous balances
    try {
      const response = await fetch(`/api/balances`, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
        },
        body: JSON.stringify({ walletAddress }),
      });
      if (!response.ok) throw new Error("Failed to fetch balances");
      const data = await response.json();
      setBalances(data);
    } catch (err) {
      setError(err.message);
    }
    setLoading(false);
  };

  return (

    <main>
      {/* Wrapper markup and styling are representative; adjust to taste */}
      <p>Chainstack Powered</p>
      <h1>Multi-chain Wallet Balance Aggregator</h1>
      <p>
        This simple DApp uses the ChainstackProvider from ethers.js to
        interact with those chains.
      </p>
      <p>
        Learn more about the ChainstackProvider from ethers.js on the{" "}
        <a href="https://docs.chainstack.com/reference/ethersjs-chainstackprovider">
          Chainstack documentation
        </a>
        .
      </p>
      <input
        type="text"
        placeholder="Enter a wallet address"
        value={walletAddress}
        onChange={(e) => setWalletAddress(e.target.value)}
        className="w-full p-2 mb-4 border rounded bg-gray-700 border-gray-600 text-white"
      />
      <button onClick={handleCheckBalances} disabled={loading}>
        {loading ? "Checking..." : "Check balances"}
      </button>
      {error && <p>{error}</p>}
      {balances && (
        <div>
          <p>
            Ethereum:{" "}
            {parseFloat(balances.ethereum).toFixed(5)} ETH
          </p>
          <p>
            Polygon:{" "}
            {parseFloat(balances.polygon).toFixed(5)} MATIC
          </p>
          <p>
            BNB Smart Chain:{" "}
            {parseFloat(balances.bnb).toFixed(5)} BNB
          </p>
          <p>
            Arbitrum:{" "}
            {parseFloat(balances.arbitrum).toFixed(5)} ETH
          </p>
        </div>
      )}
    </main>
  );
}
```
### Breakdown of `page.js`

The `page.js` file is responsible for the user interface of your DApp. It allows users to input their Ethereum wallet address and fetches the wallet balances from multiple blockchain networks by interacting with your backend API. Here’s a simple breakdown of the important parts:

#### Import required modules

```javascript Javascript
"use client";
import { useState } from "react";
import { ethers } from "ethers";
```

You import `useState` from React for managing state within the component and `ethers` from the ethers.js library to validate Ethereum addresses.

#### State management

```javascript Javascript
const [walletAddress, setWalletAddress] = useState("");
const [balances, setBalances] = useState(null);
const [loading, setLoading] = useState(false);
const [error, setError] = useState(null);
```

You define several state variables:

* `walletAddress` to store the user-inputted wallet address.
* `balances` to store the fetched balances.
* `loading` to indicate the loading state while fetching data.
* `error` to display any errors that occur during the fetching process.

#### Handle balance check

```javascript Javascript
const handleCheckBalances = async () => {
  if (!ethers.isAddress(walletAddress)) {
    setError("Invalid Ethereum address");
    return;
  }
  setLoading(true);
  setError(null);
  setBalances(null); // Clear previous balances
  try {
    const response = await fetch(`/api/balances`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ walletAddress }),
    });
    if (!response.ok) throw new Error("Failed to fetch balances");
    const data = await response.json();
    setBalances(data);
  } catch (err) {
    setError(err.message);
  }
  setLoading(false);
};
```

This function handles the balance-checking process and calls the API that uses ethers and `ChainstackProvider` (you take care of that in the following section):

1. **Validation**: Checks if the input is a valid Ethereum address.
2. **Loading state**: Sets the loading state to true and clears previous balances and errors.
3. **API interaction**: Sends a `POST` request to the `/api/balances` endpoint with the wallet address.
4. **Response handling**: Processes the response, updating the `balances` state or setting an error message if the request fails.
5. **Loading state**: Resets the loading state to false after the request completes.

#### User interface

```javascript Javascript
return (

  <main>
    <p>Chainstack Powered</p>
    <h1>Multi-chain Wallet Balance Aggregator</h1>
    <p>
      This simple DApp uses the ChainstackProvider from ethers.js to
      interact with those chains.
    </p>
    <p>
      Learn more about the ChainstackProvider from ethers.js on the{" "}
      <a href="https://docs.chainstack.com/reference/ethersjs-chainstackprovider">
        Chainstack documentation
      </a>
      .
    </p>
    <input
      type="text"
      placeholder="Enter a wallet address"
      value={walletAddress}
      onChange={(e) => setWalletAddress(e.target.value)}
      className="w-full p-2 mb-4 border rounded bg-gray-700 border-gray-600 text-white"
    />
    <button onClick={handleCheckBalances} disabled={loading}>
      {loading ? "Checking..." : "Check balances"}
    </button>
    {error && <p>{error}</p>}
    {balances && (
      <div>
        <p>
          Ethereum:{" "}
          {parseFloat(balances.ethereum).toFixed(5)} ETH
        </p>
        <p>
          Polygon:{" "}
          {parseFloat(balances.polygon).toFixed(5)} MATIC
        </p>
        <p>
          BNB Smart Chain:{" "}
          {parseFloat(balances.bnb).toFixed(5)} BNB
        </p>
        <p>
          Arbitrum:{" "}
          {parseFloat(balances.arbitrum).toFixed(5)} ETH
        </p>
      </div>
    )}
  </main>
);
```
The UI consists of:

* A header with titles and descriptions.
* An input field for the wallet address.
* A button to trigger the balance check.
* Conditional rendering to display errors or balances.

This simple breakdown focuses on the key functionality: interacting with the backend API to fetch and display wallet balances. Next, you get into the backend logic in `route.js` to understand how the balances are retrieved from multiple blockchain networks using ChainstackProvider.

## The backend API

Now let's work on the heart of the DApp, the API endpoint using `ChainstackProvider` to fetch data from the chains. In `src/app`, create a new directory for the API route. The final path will be `...src/app/api/balances/route.js`. Create each directory, `api`, `balances`, and then the `route.js` file in it. Once you have `route.js`, paste this code into it:

```javascript route.js
const ethers = require("ethers");

const providers = {
  ethereum: new ethers.ChainstackProvider("mainnet"),
  polygon: new ethers.ChainstackProvider("matic"),
  bnb: new ethers.ChainstackProvider("bnb"),
  arbitrum: new ethers.ChainstackProvider("arbitrum"),
};

async function getBalance(network, walletAddress) {
  const provider = providers[network];
  const balance = await provider.getBalance(walletAddress);
  return ethers.formatEther(balance); // Convert balance from wei to ether
}

async function getAllBalances(walletAddress) {
  const balances = {};
  for (const network in providers) {
    balances[network] = await getBalance(network, walletAddress);
  }
  return balances;
}

export async function POST(req) {
  const { walletAddress } = await req.json();

  if (!walletAddress) {
    return new Response(
      JSON.stringify({ error: "Wallet address is required" }),
      { status: 400 }
    );
  }

  try {
    const balances = await getAllBalances(walletAddress);
    return new Response(JSON.stringify(balances), { status: 200 });
  } catch (error) {
    console.error("Error fetching balances:", error);
    return new Response(JSON.stringify({ error: "Internal Server Error" }), {
      status: 500,
    });
  }
}
```

### Breakdown of `route.js`

The `route.js` file handles the backend logic for your DApp. It interacts with multiple blockchain networks using `ChainstackProvider` to fetch the wallet balances. Here’s a detailed breakdown of the important parts of the code:

#### Import ethers.js

```javascript Javascript
const ethers = require("ethers");
```

You import the ethers library, which provides the necessary tools to interact with Ethereum and other blockchain networks.

#### Initialize providers

```javascript Javascript
// Initialize a provider for each supported chain
const providers = {
  ethereum: new ethers.ChainstackProvider("mainnet"),
  polygon: new ethers.ChainstackProvider("matic"),
  bnb: new ethers.ChainstackProvider("bnb"),
  arbitrum: new ethers.ChainstackProvider("arbitrum"),
};
```

You create an object `providers` that holds instances of `ChainstackProvider` for each supported blockchain network. This allows you to connect to the Ethereum, Polygon, BNB Smart Chain, and Arbitrum networks.

#### Fetch balance for a single network

```javascript Javascript
async function getBalance(network, walletAddress) {
  const provider = providers[network];
  const balance = await provider.getBalance(walletAddress);
  return ethers.formatEther(balance); // Convert balance from wei to ether
}
```

The `getBalance` function takes a network name and a wallet address as parameters. It uses the corresponding provider to fetch the balance of the given wallet address and converts the balance from wei to ether using `ethers.formatEther`.

#### Fetch balances for all networks

```javascript Javascript
async function getAllBalances(walletAddress) {
  const balances = {};
  for (const network in providers) {
    balances[network] = await getBalance(network, walletAddress);
  }
  return balances;
}
```

The `getAllBalances` function takes a wallet address as a parameter.
It iterates over all the networks defined in the `providers` object, fetches the balance for each network using the `getBalance` function, and stores the results in the `balances` object.

#### Handle the POST request

```javascript Javascript
export async function POST(req) {
  const { walletAddress } = await req.json();

  if (!walletAddress) {
    return new Response(
      JSON.stringify({ error: "Wallet address is required" }),
      { status: 400 }
    );
  }

  try {
    const balances = await getAllBalances(walletAddress);
    return new Response(JSON.stringify(balances), { status: 200 });
  } catch (error) {
    console.error("Error fetching balances:", error);
    return new Response(JSON.stringify({ error: "Internal Server Error" }), {
      status: 500,
    });
  }
}
```

The `POST` function handles incoming POST requests to your API endpoint:

1. **Parse the request**: It extracts the `walletAddress` from the request body.
2. **Validate**: If the `walletAddress` is missing, it returns a 400 status with an error message.
3. **Fetch balances**: It calls the `getAllBalances` function to fetch the balances for all networks.
4. **Return the response**: It returns the balances with a 200 status if successful. If an error occurs, it logs it and returns a 500 status with an error message.

In short, `route.js` connects to multiple blockchain networks using `ChainstackProvider`, fetches the wallet balances, and returns the results to the front end. This is a good way to keep your endpoints secure, as the front end does not call the RPC node directly but calls something similar to a proxy server instead.

## Run the DApp

And now you have a complete DApp; it was indeed super easy with `ChainstackProvider`. To run it, simply start the development server:

```
npm run dev
```

Your DApp is now running on `http://localhost:3000`. You can use this address to test it: `0x95aD61b0a150d79219dCF64E1E6Cc01f0B64C4cE`.
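The frontend relies on `ethers.isAddress`, which also validates the EIP-55 checksum of mixed-case addresses. If you ever need a quick shape-only pre-check without pulling in ethers, a dependency-free sketch (a weaker check than `ethers.isAddress`, shown only for illustration):

```javascript
// Shape-only check: "0x" followed by exactly 40 hex characters.
// Unlike ethers.isAddress, this does NOT validate the EIP-55 checksum.
function looksLikeAddress(value) {
  return /^0x[0-9a-fA-F]{40}$/.test(value);
}

console.log(looksLikeAddress("0x95aD61b0a150d79219dCF64E1E6Cc01f0B64C4cE")); // true
console.log(looksLikeAddress("0x123")); // false
```

A check like this is handy for rejecting obviously malformed input early, while the full checksum validation stays in the frontend.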
## Conclusion

In this tutorial, you've walked through creating a multi-chain wallet balance aggregator DApp using ethers.js and `ChainstackProvider`. By integrating these powerful tools into a Next.js project, you've seen for yourself how easy it is to fetch and display wallet balances from multiple blockchain networks. This project showcases the simplicity and efficiency of using ChainstackProvider to interact with various blockchain networks, making it an excellent choice for developers looking to build decentralized applications.

# Expanding your blockchain horizons: The eth_getBlockReceipts emulator

Source: https://docs.chainstack.com/docs/expanding-your-blockchain-horizons-the-eth_getblockreceipts-emulator

**TLDR**

* Demonstrates building a stand-in for the eth\_getBlockReceipts method (available natively on Erigon and newer Geth versions) using the eth\_getBlockByNumber and eth\_getTransactionReceipt calls on any EVM chain.
* Uses Node.js and web3.js to fetch all transaction receipts for a specified block, grouping them into a single array – effectively replicating the one-call convenience of eth\_getBlockReceipts.
* Highlights the parsing of receipt logs for more user-friendly output and includes a helper function to selectively extract specific fields (e.g. gasUsed) from each receipt.
* Offers a practical solution to unify transaction + receipt data retrieval on clients that lack the native eth\_getBlockReceipts endpoint.

## Main article

The `eth_getBlockReceipts` method is a powerful tool in the Ethereum ecosystem that offers a window into the inner workings of the network. Retrieving the receipts of all transactions within a block provides insight into the status and outcome of each transaction.
Whether you're developing a decentralized exchange, a contract auditing tool, or are just curious about the Ethereum network, the `eth_getBlockReceipts` method is an essential resource that can help you uncover the details and outcomes of transactions on the blockchain. It's like having an x-ray of the Ethereum network, revealing what's happening beneath the surface and providing a clear picture of the network's activity.

The caveat is that this method is natively available only by querying nodes running the [Erigon client](/docs/protocols-clients). This guide will show you how you can emulate this method on essentially any EVM-compatible network, even if the node is not running Erigon. Read the [Erigon vs. Geth](https://chainstack.com/ethereum-clients-geth-and-erigon/) guide to get a better understanding of these two popular Ethereum clients. This article is code-centered; for more of the theory, read the [Uncovering the power of eth\_getBlockReceipts](/docs/uncovering-the-power-of-ethgetblockreceipts) guide.

### Update on eth\_getBlockReceipts

As of September 2023, the `eth_getBlockReceipts` method is also available on the Geth client from version [1.13.0](https://github.com/ethereum/go-ethereum/releases/tag/v1.13.0).

## Prerequisites

To start with a JavaScript development project, you'll need to install Node.js, a powerful JavaScript runtime environment that enables developers to run JavaScript code outside a web browser. For this project, it's recommended to use at least version 16. You can [download it from the official website](https://nodejs.org/en/download/).

With Node.js installed, you're ready to start using JavaScript. However, you'll need access to a blockchain node to query the data. Here's where Chainstack comes in to save the day.
Simply follow these steps to sign up and deploy your own blockchain node with Chainstack for free: ### Disclaimer This tool was developed and tested using an Avalanche endpoint, but you can choose any EVM-compatible network. ## The project Now that you have all the tools required, you are ready to create your `eth_getBlockReceipts` emulator. ## Initialize an npm project An npm (Node Package Manager) project is a software project managed using the `npm` platform. npm is the default package manager for the JavaScript runtime environment node.js and is needed to manage the dependencies and packages used in a project. At the heart of an `npm` project is the `package.json` file, a blueprint that outlines the packages and dependencies your project requires. This file acts as a roadmap, guiding npm in managing your project's dependencies. `npm` makes it easy for developers to share and reuse code, and its vast library of packages can be easily installed and used in any project. This allows developers to focus on writing their own code and let `npm` manage external dependencies. To initialize an `npm` project, open your terminal in the root directory of your project and run the following command: ```bash Shell npm init ``` This command will prompt a few questions; answer them, and it will create a `package.json` file. Note that these answers do not affect the functionality of your project, so don’t stress too much about what you input. You can also run the command with the `-y` flag, which will skip all questions. ```bash Shell npm init -y ``` You have now created an `npm` project and are ready to start development. ## Install the dependencies As previously mentioned, creating this tool requires some npm packages. Specifically, the `dotenv` and `web3` packages.
During development, we used the following versions: * dotenv ^16.0.3 * web3 ^1.8.1 Install those packages by running the following: ```bash Shell npm i web3 dotenv ``` [web3.js](https://web3js.readthedocs.io/en/v1.8.2/) is a tool for interacting with an EVM-compatible blockchain, while dotenv is a package for managing configuration settings in a secure and convenient manner. Together, these two packages provide a powerful and flexible toolkit for developing DApps. ## The code This `eth_getBlockReceipts` emulator will be developed in a single JavaScript file, so go ahead and create a file named `index.js` in the root directory of your project. The tool will do the following: * Create a provider instance and import packages. * Generate an eth\_getBlockReceipts-like object with a function. * Loop over the object and extract specific fields with a function. * Run the program with the main function. ### Import packages and create a provider instance The first step of any tool using the web3.js library is to import the necessary packages and create a provider instance. So add the following code at the top of your `index.js` file. ```js index.js const Web3 = require("web3"); require('dotenv').config(); // Create provider instance const NODE_URL = process.env.CHAINSTACK_NODE_URL; const web3 = new Web3(NODE_URL); ``` This creates a `web3` instance that can access a high-level API for interacting with a blockchain; essentially, it’s your connection to the blockchain node. This tool uses an environment variable to import the node URL to keep sensitive information safe, so it is more difficult to inadvertently share it. This is done using the `dotenv` package. Create a file named `.env` in the root directory and add a variable holding your Chainstack node URL. ```js .env CHAINSTACK_NODE_URL="YOUR_ENDPOINT_URL_HERE" ``` It is important that the environment variable has the same name as the variable imported in the `index.js` file.
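As a small safety net, you can also fail fast when the variable is missing instead of letting web3.js silently connect to `undefined`. A minimal sketch (the `requireEnv` helper name is hypothetical, not part of the `dotenv` package):

```js JavaScript
// Fail fast if the endpoint variable was not loaded from .env.
// requireEnv is a hypothetical helper, not part of the dotenv package.
function requireEnv(name, env = process.env) {
  const value = env[name];
  if (!value) {
    throw new Error(`Missing environment variable: ${name}`);
  }
  return value;
}

// Usage sketch:
// const NODE_URL = requireEnv("CHAINSTACK_NODE_URL");
```

This way a missing or misspelled `.env` entry surfaces immediately at startup rather than as an obscure connection error later.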
### Generate an eth\_getBlockReceipts-like object Now, let’s start working on the bulk of the logic for this project. Create a new `async` function named `getBlockReceipts` which takes two parameters: the `provider` and the target `block` for retrieving receipts. Then paste the following code into the `index.js` file: ```js index.js async function getBlockReceipts(provider, block) { const extractTransactions = await provider.eth.getBlock(block, false) let receipts = []; for (const transaction of extractTransactions.transactions) { // Get the transaction receipt const txReceipt = await web3.eth.getTransactionReceipt(transaction); for (let log of txReceipt.logs) { let logData = { address: log.address, topics: log.topics, data: log.data, blockNumber: log.blockNumber, transactionHash: log.transactionHash, transactionIndex: log.transactionIndex, blockHash: log.blockHash, logIndex: log.logIndex, removed: log.removed, id: log.id } txReceipt.logs = logData; } // replace the original logs array with the logData receipts.push(txReceipt); } return receipts; } ``` Let’s go over the logic here. The first step is to retrieve all of the transaction hashes from the desired block. This can be achieved by utilizing the [`eth_getBlockByNumber`](/reference/ethereum_node_api/blocks_info/eth_getBlockByNumber) method from the Ethereum JSON-RPC API. When calling this method, set the *full transactions* flag to `false` to ensure that only the transaction hashes are returned in an array format. The transaction hashes retrieved in this step will be stored in the `extractTransactions` constant for further processing. ```js JavaScript const extractTransactions = await provider.eth.getBlock(block, false) ``` The next step will loop through the array of hashes and call the [`eth_getTransactionReceipt`](/reference/ethereum_node_api/transaction_info/eth_getTransactionReceipt) method on each hash, do some parsing, and store the result in an array called `receipts`.
```js JavaScript let receipts = []; for (const transaction of extractTransactions.transactions) { // Get the transaction receipt const txReceipt = await web3.eth.getTransactionReceipt(transaction); for (let log of txReceipt.logs) { let logData = { address: log.address, topics: log.topics, data: log.data, blockNumber: log.blockNumber, transactionHash: log.transactionHash, transactionIndex: log.transactionIndex, blockHash: log.blockHash, logIndex: log.logIndex, removed: log.removed, id: log.id } txReceipt.logs = logData; } // replace the original logs array with the logData receipts.push(txReceipt); } ``` The second loop, `for (let log of txReceipt.logs)`, extracts the log data and replaces the `logs` object in the original receipt. This step aims to provide a more user-friendly representation of the logs data. Without this processing, the `logs` data in the transaction receipt would only be returned as an array of objects, `[object, object]`, which may require additional steps from the end-user to extract the relevant information. By replacing the original logs object with the `logData` object, the final response includes a more intuitive representation of the log data, eliminating the need for additional processing. Note that because the assignment happens inside the loop, only the last log of a transaction ends up in the final receipt; keep this in mind for transactions that emit several logs. Once all the logs in the transaction receipt have been processed, the code proceeds to the next iteration of the loop and repeats the same process for the next transaction. The final step pushes the unpacked transaction receipt into the `receipts` array, and after all of the transactions have been processed, the function returns the `receipts` array. ### Support function to isolate fields This script also includes a support function called `getElement` which takes the receipts array returned by `getBlockReceipts` and the name of a field as the input parameters. Paste the following function into the `index.js` file.
```js index.js // Extract a specific field and return an array async function getElement(receipts, field) { return receipts.map(receipt => receipt[field]); } ``` The function returns an array that contains the values of the specified field from each object in the `receipts` array. This is achieved through the `map` method, which iterates over each element in the `receipts` array and returns the specified field's value for that element. In essence, the `getElement` function provides a convenient way to extract a specific field from a collection of objects and returns the extracted values as an array. For example: ```js JavaScript const logData = await getElement(receipts, 'gasUsed'); // Response = [ 396022, 1442653, 664107 ] ``` ### Main function and full code The final step of the script is the `main()` function, which runs the full code and demonstrates the logic. Paste the following at the bottom of the file. ```js index.js async function main() { const blockNumber = 25792736; // Retrieve the transactions receipts const receipts = await getBlockReceipts(web3, blockNumber); console.log(receipts); // Use the getElement function to extract a specific field from each receipt. const logData = await getElement(receipts, 'to'); console.log(logData); } main(); ``` This function can be seen as the execution of the program where: * A block number is specified as input to the function. * The `getBlockReceipts` function is invoked within the function. * The results of the `getBlockReceipts` function are output to the console. * The second part of the function calls the `getElement` function. * The `to` field is extracted from each receipt using the `getElement` function. ### Full code The previous sections show the script broken down into all its components to make it easier to understand. 
Here, you can find the full code: ```js index.js const Web3 = require("web3"); require('dotenv').config(); // Create provider instance const NODE_URL = process.env.CHAINSTACK_NODE_URL; const web3 = new Web3(NODE_URL); // Create receipts object async function getBlockReceipts(provider, block) { // Extract the transactions from the block const extractTransactions = await provider.eth.getBlock(block, false) let receipts = []; for (const transaction of extractTransactions.transactions) { // Get the transaction receipt const txReceipt = await web3.eth.getTransactionReceipt(transaction); // Parse the logs information for (let log of txReceipt.logs) { let logData = { address: log.address, topics: log.topics, data: log.data, blockNumber: log.blockNumber, transactionHash: log.transactionHash, transactionIndex: log.transactionIndex, blockHash: log.blockHash, logIndex: log.logIndex, removed: log.removed, id: log.id } txReceipt.logs = logData; } // replace the original logs array with the logData receipts.push(txReceipt); } return receipts; } // Extract a specific field and return an array async function getElement(receipts, field) { return receipts.map(receipt => receipt[field]); } async function main() { const blockNumber = 25792736; const receipts = await getBlockReceipts(web3, blockNumber); console.log(receipts); // Use the getElement function to extract a specific field from each receipt. 
const logData = await getElement(receipts, 'to'); console.log(logData); } main(); ``` ## Run the program Save the file and run the program by executing the following command: ```bash Shell node index ``` If this program is run on block `25792736` on the Avalanche mainnet, the response will be the following: ```bash Shell [ { blockHash: '0x451155957eee73e4ea17edd5a26e4aeaff30cc828b2a4a81f2197d7d980cd00e', blockNumber: 25792736, contractAddress: null, cumulativeGasUsed: 396022, effectiveGasPrice: 27500000000, from: '0x6e752dcb0acb921c1fa446992c590a28661f27ca', gasUsed: 396022, logs: { address: '0xB31f66AA3C1e785363F0875A1B74E27b85FD66c7', topics: [Array], data: '0x0000000000000000000000000000000000000000000000027eaff286463ffb12', blockNumber: 25792736, transactionHash: '0x3708f39015b7816a156d0fc24ab7f658bcac130ebdd63e62ae93b9c8093ad41e', transactionIndex: 0, blockHash: '0x451155957eee73e4ea17edd5a26e4aeaff30cc828b2a4a81f2197d7d980cd00e', logIndex: 10, removed: false, id: 'log_e7f37c48' }, logsBloom: '0x00000000000000000020000000000001000000040000000020000000000000000000000000400040004000000000000100000000000020020000420000200000040080000000080800000008000000008000000000400000000000000408020000000011000008400000000000004000000000000000040000000010000800010008000002000000080000000000080000000010000000000000000000000000020400000000000000000000000004000000000000004100000000000000000080000002021004000000000001000000000000000000008000001802000000000010000000000004000400000008000000000000000200020000000000000100', status: true, to: '0x1111111254eeb25477b68fb85ed929f73a960582', transactionHash: '0x3708f39015b7816a156d0fc24ab7f658bcac130ebdd63e62ae93b9c8093ad41e', transactionIndex: 0, type: '0x0' }, { blockHash: '0x451155957eee73e4ea17edd5a26e4aeaff30cc828b2a4a81f2197d7d980cd00e', blockNumber: 25792736, contractAddress: null, cumulativeGasUsed: 1838675, effectiveGasPrice: 26500000000, from: '0x0d6c6017b639c3ee31c79f8a300acd5cbd1ab866', gasUsed: 1442653, logs: { address: 
'0x83a283641C6B4DF383BCDDf807193284C84c5342', topics: [Array], data: '0x00000000000000000000000000000000000000000000003291a743cf2f536ff5', blockNumber: 25792736, transactionHash: '0x182eab9950cf7d6711222a437331995e9484b15126abd575c529bda13cc26017', transactionIndex: 1, blockHash: '0x451155957eee73e4ea17edd5a26e4aeaff30cc828b2a4a81f2197d7d980cd00e', logIndex: 15, removed: false, id: 'log_98091197' }, logsBloom: '0x00000000000000000000000000040000001000000000000000000000000000000080001008020000000000000040800000000000000000000000000000000000000040000000000000000008020000000000000000000000000000000000008000000000000008000000008010020001000000000000000000000110000000000000000000020000100000000000080000000090000000000000000000002800000000008000000000000040000000000000000000000010000000020400000000040002000000000000000000000000200000010000100000000080000000000000040000040000008000800000000000000000002000000000000000000000', status: true, to: '0x0efc8ef83d7318121449e9c5dbdf7135bcc1fa90', transactionHash: '0x182eab9950cf7d6711222a437331995e9484b15126abd575c529bda13cc26017', transactionIndex: 1, type: '0x2' }, { blockHash: '0x451155957eee73e4ea17edd5a26e4aeaff30cc828b2a4a81f2197d7d980cd00e', blockNumber: 25792736, contractAddress: null, cumulativeGasUsed: 2502782, effectiveGasPrice: 26500000000, from: '0xab1d3dd66e0f0799d09ca530c30e8f0b90d87f85', gasUsed: 664107, logs: { address: '0xc8cEeA18c2E168C6e767422c8d144c55545D23e9', topics: [Array], data: '0x0000000000000000000000000000000000000000000000034f3202e0e51db900', blockNumber: 25792736, transactionHash: '0x9cdcd3970e0666dabbbdbc386425f1760830087215be4b495c7a4d4a27a596a7', transactionIndex: 2, blockHash: '0x451155957eee73e4ea17edd5a26e4aeaff30cc828b2a4a81f2197d7d980cd00e', logIndex: 40, removed: false, id: 'log_a3adc356' }, logsBloom: 
'0x000080002000000000000000000000000080001000004000000000000001010200000000000080000000000002000000000000000000000000040000002400000001000000000000000000080000000000000000008400100000000080008000000004000200000000000000000008000000000000080000000000100000000100400010800000000000000000000000000000010000000000000800000000800200000000000000000000000000000000000200000000000002000040000000000400020000000000000000000200010000000000000000000000000000600001100c0000000000000000000004000000200000040000400000000000000000', status: true, to: '0xc8ceea18c2e168c6e767422c8d144c55545d23e9', transactionHash: '0x9cdcd3970e0666dabbbdbc386425f1760830087215be4b495c7a4d4a27a596a7', transactionIndex: 2, type: '0x2' } ] [ '0x1111111254eeb25477b68fb85ed929f73a960582', '0x0efc8ef83d7318121449e9c5dbdf7135bcc1fa90', '0xc8ceea18c2e168c6e767422c8d144c55545d23e9' ] ``` ## Conclusion In conclusion, this script effectively leverages the power of the Web3 library and its methods. With this function, you can now use `eth_getBlockReceipts` even if your node is not running the Erigon client. At the same time, you just learned that, if a specific method is not available, you can always build it yourself. ### See also ### About the author Developer Advocate @ Chainstack BUIDLs on EVM, The Graph protocol, and Starknet Helping people understand Web3 and blockchain development [](https://github.com/soos3d) [](https://twitter.com/web3Dav3) [](https://www.linkedin.com/in/davide-zambiasi/) # Exploring Bitcoin transactions with `getrawtransaction` Source: https://docs.chainstack.com/docs/exploring-bitcoin-transactions-with-getrawtransaction **TLDR** * `getrawtransaction` retrieves raw or decoded Bitcoin transaction data by `txid`, optionally with a specific block hash. * Verbose mode (`true`) returns a detailed JSON, while omitting it yields the raw hex string. * Transactions typically remain in the mempool for around 14 days; dropped transactions require resubmission with adequate fees. 
* Bitcoin Core runs many operations single-threaded, so complex or parallel requests may slow under heavy load. ## Main article In Bitcoin blockchain development, the **`getrawtransaction`** method serves as a fundamental tool for retrieving detailed information about specific transactions. This method provides developers with insights into transaction details, offering a closer look into the intricacies of Bitcoin's decentralized ledger. Before you begin testing the **`getrawtransaction`** method, it's essential to have access to a functioning Bitcoin node. Chainstack offers a convenient and efficient way to deploy and manage Bitcoin nodes. Follow these detailed steps to sign up on Chainstack, deploy your node, and access your endpoint credentials: ## Exploring **`getrawtransaction`** The **`getrawtransaction`** method is a crucial tool in Bitcoin blockchain development, allowing developers to access detailed information about transactions. This method is used to retrieve data either in a raw format or as a decoded, more readable JSON object, depending on the parameters provided. ### Parameters * `txid`: The transaction ID, a unique identifier for the transaction. * `verbose` (optional): A boolean flag. When set to `true`, the method returns a JSON object with detailed transaction data. If set to `false` or omitted, it returns the raw transaction data in hexadecimal format. #### Example Usage To fetch the detailed transaction data using cURL: ```bash Bash curl --location 'YOUR_CHAINSTACK_ENDPOINT' \ --header 'content-type: text/plain;' \ --data '{"jsonrpc": "1.0", "id": "1", "method": "getrawtransaction", "params": ["dd3025c6da8f546fcdb059428b74bf560efe0b360e90e46bd428de0905fdb3f2", true]}' ``` Replace `YOUR_CHAINSTACK_ENDPOINT` with the endpoint from your [Console](/docs/manage-your-node#view-node-access-and-credentials). 
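The same request can also be issued from code. Below is a minimal Node.js sketch (the helper and function names are hypothetical; the built-in `fetch` requires Node.js 18+):

```js JavaScript
// Build the JSON-RPC body used by the cURL example above.
// buildGetRawTransactionPayload is a hypothetical helper name.
function buildGetRawTransactionPayload(txid, verbose = true, blockhash) {
  // blockhash is optional; include it only when provided.
  const params = blockhash ? [txid, verbose, blockhash] : [txid, verbose];
  return { jsonrpc: "1.0", id: "1", method: "getrawtransaction", params };
}

// Send the payload to your Chainstack endpoint (Node.js 18+ built-in fetch).
async function getRawTransaction(endpoint, txid) {
  const response = await fetch(endpoint, {
    method: "POST",
    headers: { "content-type": "text/plain;" },
    body: JSON.stringify(buildGetRawTransactionPayload(txid)),
  });
  return (await response.json()).result;
}
```

Splitting payload construction from the HTTP call keeps the request shape easy to reuse for the other parameter combinations shown below.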
### Response The response object varies based on the `verbose` parameter: * If `verbose` is `false` or omitted, the response is a string containing the raw transaction data in hexadecimal format. * If `verbose` is `true`, the response is a JSON object with detailed transaction information, including: * `txid`: The transaction ID. * `hash`: The transaction hash; differs from `txid` for segwit transactions. * `version`: The version of the transaction. * `size`: The size of the transaction in bytes. * `vsize`: The virtual size of the transaction (considering segwit discount). * `weight`: A metric for the transaction's size. * `locktime`: The locktime of the transaction. * `vin`: An array of input objects, each containing details like scripts and sequences. * `vout`: An array of output objects, detailing where the bitcoins are going, including value and scripts. * Additional fields like `blockhash`, `confirmations`, and `time` are included if the transaction is confirmed. ### Understanding the parameters #### 1. **Detailed Transaction Information: `verbose` Parameter** The **`verbose`** parameter is optional and determines the format of the response. When set to **`true`**, it returns a JSON object with detailed transaction information, including inputs, outputs, and transaction metadata. If set to **`false`** or omitted, the response is the raw hexadecimal transaction data. ##### Example without `verbose`: ```bash Bash curl --location 'YOUR_CHAINSTACK_ENDPOINT' \ --header 'content-type: text/plain;' \ --data '{"jsonrpc": "1.0", "id": "1", "method": "getrawtransaction", "params": ["dd3025c6da8f546fcdb059428b74bf560efe0b360e90e46bd428de0905fdb3f2"]}' ``` This command returns the raw transaction data in hexadecimal format. To decode it, use the `decoderawtransaction` method. 
```bash Bash curl --location 'YOUR_CHAINSTACK_ENDPOINT' \ --header 'content-type: text/plain;' \ --data '{"jsonrpc": "1.0", "id": "1", "method": "decoderawtransaction", "params": ["020000000001010000000000000000000000000000000000000000000000000000000000000000ffffffff31031f960c04affc9b652f466f756e6472792055534120506f6f6c202364726f70676f6c642f2a9e8f693dfb000000000000ffffffff023df0692a0000000016001435f6de260c9f3bdee47524c473a6016c0c055cb90000000000000000266a24aa21a9edc3d6ba5653cb4443ee0ef0f6778e5cee98e4e2eac4282120750f91f41a1e087a0120000000000000000000000000000000000000000000000000000000000000000000000000"]}' ``` #### 2. **Fetch Transaction by Block: `blockhash` Parameter** The **`blockhash`** parameter is used to retrieve a transaction by specifying the hash of the block in which it's included. This is particularly useful for accessing transactions within a specific block. #### Example using `blockhash`: ```bash Bash curl --location 'YOUR_CHAINSTACK_ENDPOINT' \ --header 'content-type: text/plain;' \ --data '{"jsonrpc": "1.0", "id": "1", "method": "getrawtransaction", "params": ["dd3025c6da8f546fcdb059428b74bf560efe0b360e90e46bd428de0905fdb3f2", true, "000000000000000000015c7c5ce593387df1adbb494cf2a9d261bb56095d1769"]}' ``` This command fetches detailed information about the specified transaction within the given block. ### Real-World Applications of `getrawtransaction` The `getrawtransaction` method is not just a theoretical tool; it finds practical applications in various real-world scenarios. For instance: 1. **Transaction Auditing**: Financial institutions and auditors use `getrawtransaction` to verify the authenticity of transactions. They can inspect details such as input and output addresses, transaction values, and fees to ensure compliance with regulatory standards. 2. **Wallet Functionality**: Cryptocurrency wallet applications leverage this method to fetch transaction details.
When a user receives or sends Bitcoin, the wallet can display comprehensive transaction information, enhancing user transparency and trust in the wallet's operations. 3. **Blockchain Analytics**: Companies specializing in blockchain analytics use `getrawtransaction` to gather data for analysis. By decoding transaction details, they can identify patterns, track asset flows, and detect suspicious activities, contributing to anti-money laundering (AML) efforts. 4. **Network Fee Estimation**: Services that provide fee estimation for Bitcoin transactions often analyze past transactions using `getrawtransaction`. By understanding the fee structures of recent transactions, they can more accurately suggest optimal transaction fees for users. ### Limited Lifespan of Transactions in the Mempool It's crucial to understand that Bitcoin's mempool, which is the collection of all unconfirmed transactions waiting to be included in a block, does not retain transactions indefinitely. Typically, transactions remain in the mempool for about 14 days, though this can vary based on node configuration and network conditions. Detailed information about this can be found on our Mempool Configuration Page at [Chainstack Mempool Configuration](/docs/mempool-configuration). Unconfirmed transactions that linger for an extended period may be dropped from the mempool. This is particularly relevant during times of network congestion or if the transaction fee is too low. To prevent transactions from expiring, developers should ensure they are either confirmed in a timely manner or resubmitted with an appropriate fee. ### Single-Threaded Nature of Bitcoin Core An important architectural detail of Bitcoin Core, the reference implementation of the Bitcoin protocol, is its single-threaded nature for processing many of its critical functions. This includes the validation of transactions and blocks. 
Consequently, Bitcoin Core handles operations sequentially, which may lead to limitations in processing multiple requests concurrently. This design choice impacts how quickly the node can process transactions and blocks, especially under heavy load. It's a factor to consider when building applications that interact with Bitcoin Core, as response times can vary depending on the node's current workload. ### Conclusion The `getrawtransaction` method is an invaluable tool in Bitcoin nodes, providing developers with the ability to retrieve specific transaction information. When used with its parameters (`txid` for the transaction ID, `verbose` for detailed transaction data, and `blockhash` for specifying a transaction within a particular block), it offers comprehensive insights into individual Bitcoin transactions. However, developers must be mindful of the nuances of Bitcoin's network and node operation, such as the mempool's transaction retention policy and the single-threaded nature of Bitcoin Core. These factors underscore the importance of efficient transaction management and a thorough understanding of the underlying system for effective blockchain development. # Fantom methods Source: https://docs.chainstack.com/docs/fantom-methods See also [interactive Fantom API call examples](/reference/getting-started-fantom).
| Method | Availability | Comment | | ---------------------------------------- | --------------------------------------------- | ------- | | eth\_accounts | | | | eth\_blockNumber | | | | eth\_call | | | | eth\_chainId | | | | eth\_estimateGas | | | | eth\_feeHistory | | | | eth\_gasPrice | | | | eth\_getAccount | | | | eth\_getBalance | | | | eth\_getBlockByHash | | | | eth\_getBlockByNumber | | | | eth\_getBlockReceipts | | | | eth\_getBlockTransactionCountByHash | | | | eth\_getBlockTransactionCountByNumber | | | | eth\_getCode | | | | eth\_getFilterChanges | | | | eth\_getFilterLogs | | | | eth\_getLogs | | | | eth\_getProof | | | | eth\_getStorageAt | | | | eth\_getTransactionByBlockHashAndIndex | | | | eth\_getTransactionByBlockNumberAndIndex | | | | eth\_getTransactionByHash | | | | eth\_getTransactionCount | | | | eth\_getTransactionReceipt | | | | eth\_getUncleCountByBlockHash | | | | eth\_getUncleCountByBlockNumber | | | | eth\_maxPriorityFeePerGas | | | | eth\_newBlockFilter | | | | eth\_newFilter | | | | eth\_newPendingTransactionFilter | | | | eth\_signTransaction | | | | eth\_subscribe | | | | eth\_syncing | | | | eth\_uninstallFilter | | | | eth\_unsubscribe | | | | eth\_sendRawTransaction | | | | net\_listening | | | | net\_peerCount | | | | net\_version | | | | web3\_clientVersion | | | | web3\_sha3 | | | | debug\_getBadBlocks | | | | debug\_storageRangeAt | | | | debug\_traceBlock | | | | debug\_traceBlockByHash | | | | debug\_traceBlockByNumber | | | | debug\_traceCall | | | | debug\_traceTransaction | | | | ftm\_currentEpoch | | | | trace\_block | | | | trace\_transaction | | | | trace\_filter | | | | trace\_get | | | | dag\_getEvent | | | | dag\_getEventPayload | | | | dag\_getHeads | | | | admin\_addPeer | | | | admin\_addTrustedPeer | | | | admin\_datadir | | | | admin\_exportChain | | | | admin\_importChain | | | | admin\_nodeInfo | | | | admin\_peerEvents | | | | admin\_peers | | | | admin\_removePeer | | | | admin\_removeTrustedPeer | | | 
| admin\_startHTTP | | | | admin\_startWS | | | | admin\_stopHTTP | | | | admin\_stopWS | | | # Fantom tooling Source: https://docs.chainstack.com/docs/fantom-tooling ## Node.js You can build a web app to query data using Node.js and [axios](https://www.npmjs.com/package/axios): ```javascript Javascript const axios = require('axios'); const main = async () => { try { const result = await axios.post( 'YOUR_CHAINSTACK_ENDPOINT', { query: ` QUERY ` } ); console.log(result.data); } catch(error) { console.error(error); } } main(); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node GraphQL endpoint * QUERY — your GraphQL query See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). See also [Using GraphQL with EVM-compatible nodes](https://support.chainstack.com/hc/en-us/articles/4409604331161-Using-GraphQL-with-EVM-compatible-nodes). ## MetaMask On [node access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**. ## Truffle Configure [Truffle Suite](https://truffleframework.com) to deploy contracts to your Fantom nodes. Install [Truffle Suite](https://truffleframework.com), [HD Wallet-enabled Web3 provider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider), and create a project. Create a new environment in `truffle-config.js`, add your mnemonic phrase generated by [a wallet](https://ethereum.org/en/wallets/find-wallet/) and the Fantom endpoint instead of YOUR\_CHAINSTACK\_ENDPOINT: ```javascript Javascript const HDWalletProvider = require("@truffle/hdwallet-provider"); const mnemonic = 'pattern enroll upgrade ...'; ... module.exports = { networks: { chainstack: { provider: () => new HDWalletProvider(mnemonic, "YOUR_CHAINSTACK_ENDPOINT"), network_id: "*" }, } }; ``` ## Hardhat Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your Fantom nodes. Install [Hardhat](https://hardhat.org/) and create a project.
Create a new environment in `hardhat.config.js`: ```javascript Javascript require("@nomiclabs/hardhat-waffle"); ... module.exports = { solidity: "0.7.3", networks: { chainstack: { url: "YOUR_CHAINSTACK_ENDPOINT", accounts: ["YOUR_PRIVATE_KEY"] }, } }; ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). * YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract Run `npx hardhat run scripts/deploy.js --network chainstack` and Hardhat will deploy using Chainstack. See also [Forking EVM-compatible mainnet with Hardhat](https://support.chainstack.com/hc/en-us/articles/900004242406). ## Remix IDE To make Remix IDE interact with the network through a Chainstack node: Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Interacting through MetaMask](#metamask). In Remix IDE, navigate to the **Deploy** tab. Select **Injected Provider - MetaMask** in **Environment**. This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node. ## web3.js Build DApps using [web3.js](https://github.com/ethereum/web3.js/) and Fantom nodes deployed with Chainstack. Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP or WebSocket. ### HTTP Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. 
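Note that `getBlockNumber` returns a regular number, but the underlying `eth_blockNumber` JSON-RPC call responds with a 0x-prefixed hex quantity. If you ever query the endpoint directly instead of through web3.js, a decoding sketch (the helper name is hypothetical):

```javascript Javascript
// JSON-RPC quantities such as eth_blockNumber results are hex strings, e.g. "0x10d4f".
// parseHexQuantity is a hypothetical helper name.
function parseHexQuantity(hex) {
  // Validate the 0x-prefixed hex shape before parsing.
  if (typeof hex !== "string" || !/^0x[0-9a-fA-F]+$/.test(hex)) {
    throw new Error(`Not a hex quantity: ${hex}`);
  }
  return parseInt(hex, 16);
}

// parseHexQuantity("0x10d4f") === 68943
```

web3.js performs this conversion for you, which is one reason to prefer the library over hand-rolled JSON-RPC calls.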
### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node WSS endpoint protected either with the key or password. ## web3.py Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Fantom nodes deployed with Chainstack. Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `HTTPProvider` to connect to your node endpoint and get the latest block number. ```python Key Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.blockNumber) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.blockNumber) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * HOSTNAME — your node HTTPS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). ### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number. 
```python Key Protected
from web3 import Web3

web3 = Web3(Web3.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT'))
print(web3.eth.blockNumber)
```

```python Password Protected
from web3 import Web3

web3 = Web3(Web3.WebsocketProvider('wss://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME")))
print(web3.eth.blockNumber)
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password
* HOSTNAME — your node WSS endpoint hostname
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)

See also [WebSocket connection to an EVM node](https://support.chainstack.com/hc/en-us/articles/900001918763-WebSocket-connection-to-an-Ethereum-node).

## web3.php

Build DApps using [web3.php](https://github.com/web3p/web3.php) and Fantom nodes deployed with Chainstack.

Install [web3.php](https://github.com/web3p/web3.php).

Connect over HTTP:

```php Php
<?php

require_once "vendor/autoload.php";
use Web3\Web3;

$web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT');
?>
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password.

Use [JSON-RPC methods](https://eth.wiki/json-rpc/API) to interact with the node.

Example to get the latest block number:

```php Php
<?php

require_once "vendor/autoload.php";
use Web3\Web3;

$web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT');
$eth = $web3->eth;

$eth->blockNumber(function ($err, $data) {
    print "$data \n";
});
?>
```

## web3j

Build DApps using [web3j](https://github.com/web3j/web3j) and Fantom nodes deployed with Chainstack.

Use the `HttpService` object to connect to your node endpoint.
Example to get the latest block number:

```java Java
package getLatestBlock;

import java.io.IOException;
import java.util.logging.Level;
import java.util.logging.Logger;

import org.web3j.protocol.Web3j;
import org.web3j.protocol.core.DefaultBlockParameterName;
import org.web3j.protocol.core.methods.response.EthBlock;
import org.web3j.protocol.exceptions.ClientConnectionException;
import org.web3j.protocol.http.HttpService;

import okhttp3.Authenticator;
import okhttp3.Credentials;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;
import okhttp3.Route;

public final class App {

  private static final String USERNAME = "USERNAME";
  private static final String PASSWORD = "PASSWORD";
  private static final String ENDPOINT = "ENDPOINT";

  public static void main(String[] args) {
    try {
      OkHttpClient.Builder clientBuilder = new OkHttpClient.Builder();
      clientBuilder.authenticator(new Authenticator() {
          @Override
          public Request authenticate(Route route, Response response) throws IOException {
            String credential = Credentials.basic(USERNAME, PASSWORD);
            return response.request().newBuilder().header("Authorization", credential).build();
          }
      });
      HttpService service = new HttpService(ENDPOINT, clientBuilder.build(), false);
      Web3j web3 = Web3j.build(service);
      EthBlock.Block latestBlock = web3.ethGetBlockByNumber(DefaultBlockParameterName.LATEST, false).send().getBlock();
      System.out.println("Latest Block: #" + latestBlock.getNumber());
    } catch (IOException | ClientConnectionException ex) {
      Logger.getLogger(App.class.getName()).log(Level.SEVERE, null, ex);
    }
  }
}
```

where

* ENDPOINT — your node HTTPS endpoint
* USERNAME — your node access username
* PASSWORD — your node access password

See also [the full code on GitHub](https://github.com/chainstack/web3j-getLatestBlock).

## ethers.js

Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Fantom nodes deployed with Chainstack.

Install [ethers.js](https://www.npmjs.com/package/ethers).
Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket).

### HTTP

Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number:

```javascript Key Protected
const { ethers } = require("ethers");

var urlInfo = {
    url: 'YOUR_CHAINSTACK_ENDPOINT'
};
var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID);

provider.getBlockNumber().then(console.log);
```

```javascript Password Protected
const { ethers } = require("ethers");

var urlInfo = {
    url: 'YOUR_CHAINSTACK_ENDPOINT',
    user: 'USERNAME',
    password: 'PASSWORD'
};
var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID);

provider.getBlockNumber().then(console.log);
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)
* NETWORK\_ID — Fantom network ID:
  * Mainnet: `250`
  * Testnet: `4002`

See [node access details](/docs/manage-your-node#view-node-access-and-credentials).

### WebSocket

Use the `WebSocketProvider` object to connect to your node WSS endpoint and get the latest block number:

```javascript Javascript
const { ethers } = require("ethers");

const provider = new ethers.providers.WebSocketProvider('YOUR_CHAINSTACK_ENDPOINT', NETWORK_ID);

provider.getBlockNumber().then(console.log);
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password
* NETWORK\_ID — Fantom network ID:
  * Mainnet: `250`
  * Testnet: `4002`

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).

## Brownie

Install [Brownie](https://eth-brownie.readthedocs.io/en/stable/install.html).
Use the `brownie networks add` command with the node endpoint:

```shell Shell
brownie networks add Fantom ID name="NETWORK_NAME" host=YOUR_CHAINSTACK_ENDPOINT chainid=NETWORK_ID
```

where

* ID — any name that you will use as the network tag to run a deployment. For example, `chainstack-mainnet`.
* NETWORK\_NAME — any name that you want to identify the network by in the list of networks. For example, **Mainnet (Chainstack)**.
* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password
* NETWORK\_ID — Fantom network ID:
  * Mainnet: `250`
  * Testnet: `4002`

Example to run the deployment script:

```shell Shell
brownie run deploy.py --network chainstack-mainnet
```

## Foundry

Install [Foundry](https://github.com/gakonst/foundry/).

Use `--rpc-url` to run the operation through your Chainstack node.

### Forge

Use `forge` to develop, test, and deploy your smart contracts.

To deploy a contract:

```shell Shell
forge create CONTRACT_NAME --contracts CONTRACT_PATH --private-key YOUR_PRIVATE_KEY --rpc-url YOUR_CHAINSTACK_ENDPOINT
```

where

* CONTRACT\_NAME — name of the contract in the Solidity source code
* CONTRACT\_PATH — path to your smart contract
* YOUR\_PRIVATE\_KEY — the private key to your funded account that you will use to deploy the contract
* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password

### Cast

Use `cast` to interact with the network and the deployed contracts.

To get the latest block number:

```shell Shell
cast block-number --rpc-url YOUR_CHAINSTACK_ENDPOINT
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password.

# Fantom: ERC-721 Collection Contract with Truffle & OpenZeppelin

Source: https://docs.chainstack.com/docs/fantom-tutorial-erc-721-collection-contract-with-truffle-and-openzeppelin

**TLDR**

* This tutorial shows how to build and deploy a straightforward ERC-721 contract on the Fantom testnet.
* You’ll leverage Truffle and OpenZeppelin to set up and compile the NFT collection, which can be minted by anyone. * Once deployed, you can verify the contract on FTMScan for easy interactions like minting or querying balances. * You can eventually take the same approach for Fantom mainnet and list your collection on an NFT marketplace such as Artion. ## Main article ERC-721 is the non-fungible token (NFT) standard for smart contracts. In this tutorial, you will: * Create a simple ERC-721 collection contract that allows anyone to mint new non-fungible tokens in the collection. * Deploy the contract on the Fantom testnet through a node deployed with Chainstack. * Interact with the deployed contract. * See that you can register your collection on an NFT market. ## Prerequisites * [Chainstack account](https://console.chainstack.com/) to deploy a Fantom node. * [Truffle Suite](https://trufflesuite.com/) to create and deploy contracts. * [OpenZeppelin Contracts](https://docs.openzeppelin.com/contracts/4.x/) to use the audited [ERC-721 libraries](https://docs.openzeppelin.com/contracts/4.x/erc721) to create your ERC-721 collection contract. ## Overview To get from zero to a deployed ERC-721 contract on the Fantom testnet, do the following: With Chainstack, create a public chain project. With Chainstack, join the Fantom testnet. With Chainstack, access your Fantom node credentials. With OpenZeppelin, create an ERC-721 contract. With Truffle, compile and deploy the contract through your Fantom node. With FTMScan, verify the deployed contract. ## Step-by-step ### Create a public chain project See [Create a project](/docs/manage-your-project#create-a-project). ### Join the Fantom testnet See [Join a public network](/docs/manage-your-networks#join-a-public-network). ### Get your Fantom node access and credentials See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials). 
### Install OpenZeppelin Contracts

See [OpenZeppelin Contracts](https://docs.openzeppelin.com/contracts/4.x/).

### Install Truffle Suite

See [Truffle Suite: Installation](https://trufflesuite.com/docs/truffle/how-to/install/).

### Create the contract

1. On your machine, in the contract directory, initialize Truffle:

   ```bash Shell
   truffle init
   ```

   This will generate the Truffle boilerplate structure:

   ```
   .
   ├── contracts
   │   └── .gitkeep
   ├── migrations
   │   └── .gitkeep
   ├── test
   │   └── .gitkeep
   └── truffle-config.js
   ```

2. Go to the `contracts` directory. In the directory, create your ERC-721 contract `Fantom721Collection.sol`.

   ```solidity solidity
   //SPDX-License-Identifier: MIT
   pragma solidity ^0.8;

   import "@openzeppelin/contracts/token/ERC721/extensions/ERC721URIStorage.sol";

   contract Fantom721Collection is ERC721URIStorage {

       uint256 public tokenCounter;

       constructor () ERC721 ("COLLECTION_NAME", "COLLECTION_TICKER") {
           tokenCounter = 0;
       }

       function createCollectible(string memory tokenURI) public returns (uint256) {
           uint256 newItemId = tokenCounter;
           _safeMint(msg.sender, newItemId);
           _setTokenURI(newItemId, tokenURI);
           tokenCounter = tokenCounter + 1;
           return newItemId;
       }
   }
   ```

   The contract implementation is the following:

   * The contract uses OpenZeppelin audited [ERC-721 contract templates](https://docs.openzeppelin.com/contracts/4.x/erc721).
   * The contract is a mintable collection. Anyone can add a token to the collection through `createCollectible`.
   * COLLECTION\_NAME — any name to give to your collection
   * COLLECTION\_TICKER — any ticker for your collection

3. Create `2_deploy_contracts.js` in the `migrations` directory.

   ```js JavaScript
   module.exports = function(deployer) {
     var Fantom721Collection = artifacts.require("./Fantom721Collection.sol");
     deployer.deploy(Fantom721Collection);
   };
   ```

   This will create the contract deployment instructions for Truffle.

### Compile and deploy the contract

1. Install `HDWalletProvider`.
[HDWalletProvider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider) is Truffle's separate npm package used to sign transactions. Run: ```bash Shell npm install @truffle/hdwallet-provider ``` 2. Edit `truffle-config.js` to add: * `HDWalletProvider` * Your Fantom node access and credentials * Your Fantom account that you will use to deploy the contract. ```js JavaScript const HDWalletProvider = require("@truffle/hdwallet-provider"); const private_key = 'YOUR_PRIVATE_KEY'; module.exports = { networks: { testnet: { provider: () => new HDWalletProvider(private_key, "YOUR_CHAINSTACK_ENDPOINT"), network_id: 4002 } }, compilers: { solc: { version: "0.8.9", } } }; ``` where * `testnet` — any network name that you will pass to the `truffle migrate --network` command. * `HDWalletProvider` — Truffle's custom provider to sign transactions. * YOUR\_PRIVATE\_KEY — the private key of your Fantom account that will deploy the contract. The account must have enough FTM funds to run the deployment. See also [Fantom testnet faucet](https://faucet.fantom.network/). * YOUR\_CHAINSTACK\_ENDPOINT — your Fantom node HTTPS endpoint deployed with Chainstack. See also [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials) and [Tools](/docs/fantom-tooling). * `network_id` — the network ID of the Fantom network: mainnet is `250`, testnet is `4002`. * `solc` — the Solidity compiler version that Truffle must use. 3. Run: ```bash Shell truffle migrate --network testnet ``` This will engage `2_deploy_contracts.js` and deploy the contract to the Fantom testnet as specified in `truffle-config.js`. ### Interact with the contract Once your contract is deployed, you can view it online at [FTMScan testnet](https://testnet.ftmscan.com/). For an easy way to interact with your deployed contract, verify it on FTMScan. 
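Before heading to FTMScan, you can also confirm programmatically that the deployment landed on-chain. A minimal web3.py sketch — `contract_is_deployed` is a hypothetical helper, and the endpoint and contract address are placeholders you substitute with your own values:

```python
def contract_is_deployed(endpoint: str, contract_address: str) -> bool:
    """Return True if there is bytecode at the address, i.e. deployment succeeded.

    Pass your node HTTPS endpoint and the checksummed contract address
    printed by Truffle at the end of the deployment.
    """
    from web3 import Web3  # requires `pip install web3`
    w3 = Web3(Web3.HTTPProvider(endpoint))
    # eth_getCode returns empty bytes for externally owned or empty accounts.
    return len(w3.eth.get_code(contract_address)) > 0

# Example (placeholders):
# print(contract_is_deployed("YOUR_CHAINSTACK_ENDPOINT", "YOUR_CONTRACT_ADDRESS"))
```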
### Flatten your contract code

Since your ERC-721 contract uses imported OpenZeppelin libraries, you must put all the imports into one `.sol` file so that FTMScan can verify it.

1. Install Truffle Flattener. Run:

   ```bash Shell
   npm install truffle-flattener
   ```

2. Flatten the contract. In the `contracts` directory, run:

   ```bash Shell
   npx truffle-flattener Fantom721Collection.sol > FlatFantom721Collection.sol
   ```

3. Clean up the licensing information.

   The flattened contract will have the same licensing note imported from each of the files. Multiple licensing notes in one file break the FTMScan verification, so you have to leave one licensing note for the entirety of the flattened contract.

   The easiest way to clean up is to search for the `SPDX` mentions in the file and remove all of them except for the very first one.

### Verify the deployed contract on FTMScan

At this point, you have your flattened and cleaned-up contract ready for the FTMScan verification.

1. Go to [FTMScan testnet](https://testnet.ftmscan.com/).
2. Find your deployed contract. The address of your contract should have been printed by Truffle at the end of the deployment in the `contract address` field.
3. On the contract page on FTMScan, click **Contract** > **Verify and Publish**.
4. In **Compiler Type**, select **Solidity (Single file)**.
5. In **Compiler Version**, select **v0.8.9**. This is the version this tutorial used to compile the contract.
6. In **Open Source License Type**, select **MIT License (MIT)**.
7. Click **Continue**.
8. Keep the **Optimization** option set to **No** as Truffle does not use optimization by default.
9. Paste the entirety of your flattened `.sol` contract in the **Enter the Solidity Contract Code below** field.
10. Click **Verify and Publish**.

FTMScan will take a few seconds to compile your contract, verify, and publish it.

### Interact with the contract

Now that your ERC-721 contract is verified, FTMScan is effectively a front-end instance for your contract.
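Since the **Read Contract** tab maps onto plain JSON-RPC calls, the same reads can be done from web3.py as well. A hedged sketch — the minimal ABI below covers only the two view functions used in this tutorial, and the endpoint, contract, and owner addresses are placeholders:

```python
# Minimal ABI covering only the two read calls used in this tutorial.
MINIMAL_ABI = [
    {"name": "balanceOf", "type": "function", "stateMutability": "view",
     "inputs": [{"name": "owner", "type": "address"}],
     "outputs": [{"name": "", "type": "uint256"}]},
    {"name": "tokenCounter", "type": "function", "stateMutability": "view",
     "inputs": [],
     "outputs": [{"name": "", "type": "uint256"}]},
]

def read_collection_stats(endpoint, contract_address, owner):
    """Query balanceOf(owner) and tokenCounter() through a Chainstack node."""
    from web3 import Web3  # requires `pip install web3`
    w3 = Web3(Web3.HTTPProvider(endpoint))
    contract = w3.eth.contract(address=contract_address, abi=MINIMAL_ABI)
    return (
        contract.functions.balanceOf(owner).call(),
        contract.functions.tokenCounter().call(),
    )

# Example (placeholders):
# balance, minted = read_collection_stats("YOUR_CHAINSTACK_ENDPOINT",
#                                         "YOUR_CONTRACT_ADDRESS",
#                                         "YOUR_ACCOUNT_ADDRESS")
```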
### Mint an NFT in the collection

You can use any account to call the `createCollectible` function. Make sure you have:

* MetaMask installed and unlocked as you will need it to call the contract. See [Fantom tooling: MetaMask](/docs/fantom-tooling#metamask).
* Testnet FTM on the account to pay for the transaction. See [Fantom testnet faucet](https://faucet.fantom.network/).

1. On FTMScan, on your contract, click **Contract**.
2. Click **Write Contract**.
3. Click **Connect to Web3**.
4. Under **createCollectible**, in the **tokenURI** field, provide any string to serve as metadata for this specific NFT. See also [OpenZeppelin ERC-721](https://docs.openzeppelin.com/contracts/4.x/erc721) for a metadata example.
5. Click **Write**.

This will send a transaction to mint an NFT in your contract collection and distribute the token to the account that called `createCollectible`.

### Check the balances

Check the NFT balance of an address:

1. On FTMScan, on your contract, click **Contract**.
2. Click **Read Contract**.
3. Scroll to the **balanceOf** field.
4. In the **owner (address)** field, provide the address of the account you used to deploy the contract.
5. Click **Query**.

Check the number of minted NFTs:

1. On FTMScan, on your contract, click **Contract**.
2. Click **Read Contract**.
3. Check the **tokenCounter** field.

### Listing on an NFT market

Having gone through the tutorial to understand the basics of creating an NFT collection, you can amend the contract to your needs and deploy it on the Fantom mainnet.

Once deployed, you can list the collection at an NFT marketplace—[Artion](https://artion.io/). See [Artion: Register Collection](https://artion.io/collection/register).

## Conclusion

This tutorial guided you through the basics of creating and deploying a contract in the ERC-721 non-fungible token standard.

The contract that you created is a collection that anyone on the Fantom network can interact with to add their tokens to the collection.
When you are ready, you can also deploy your own ERC-721 contract on the Fantom mainnet and list the collection on a Fantom NFT marketplace. This tutorial uses the testnet; however, the exact same instructions and sequence work on the mainnet.

Director of Developer Experience @ Chainstack

Talk to me all things Web3

20 years in technology | 8+ years in Web3 full time

Trusted advisor helping developers navigate the complexities of blockchain infrastructure

[](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake)

# Faucet

Source: https://docs.chainstack.com/docs/faucets

Get your testnet tokens for various networks at the [Chainstack faucet](https://faucet.chainstack.com/).

# Features availability

Source: https://docs.chainstack.com/docs/features-availability-across-subscription-plans

Overview of Chainstack features availability across different subscription plans, including node customization options, subgraph capabilities, and enterprise-specific offerings.

## Trader node transactions

[Trader node transactions](/docs/warp-transactions) (ex. Warp transactions) are available starting from the [paid plans](https://chainstack.com/pricing/) for the following protocols:

* Ethereum
* Solana
* BNB Smart Chain

## Debug and trace APIs

[Debug and trace APIs](/docs/debug-and-trace-apis) are available starting from the [paid plan](https://chainstack.com/pricing/). For per protocol availability, see [Available clouds, regions, and locations](/docs/nodes-clouds-regions-and-locations).

## Mempool

See [Mempool configurations](/docs/mempool-configuration).

## Subgraphs

Configure your own subgraphs and deploy them to The Graph instance maintained by Chainstack. Charged in request units and compute hours: 20 request units per 1 request + \$0.10 per hour.

For the availability by subscription plans, see the [pricing page](https://chainstack.com/pricing/).
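To make the billing arithmetic above concrete, here is a minimal sketch; `subgraph_usage` is a hypothetical helper that applies the figures stated here (20 request units per request, \$0.10 per compute hour) — request-unit dollar pricing itself depends on your plan:

```python
def subgraph_usage(requests: int, compute_hours: float) -> tuple:
    """Return (request units consumed, compute cost in USD) for a subgraph.

    Uses the rates stated above: 20 request units per request,
    $0.10 per compute hour.
    """
    request_units = requests * 20
    compute_cost_usd = compute_hours * 0.10
    return request_units, compute_cost_usd

# For example, 1,000 requests and 5 compute hours consume
# 20,000 request units and $0.50 of compute time.
```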
### Dedicated subgraph indexer Your own dedicated The Graph instance maintained by Chainstack. A dedicated indexer is available on the [Enterprise plan](https://chainstack.com/pricing/). A dedicated indexer can deploy multiple subgraphs across different chains. **Interested in a dedicated indexer?** To learn more about or request a dedicated indexer, contact [Chainstack support](https://support.chainstack.com). The available dedicated indexer types and their hardware requirements are: | Setup | Postgres (CPUs) | Postgres (memory, GB) | Postgres (disk space, TB) | VMs (CPUs) | VMs (memory, GB) | | -------- | ---------------- | --------------------- | ------------------------- | ---------- | ---------------- | | Standard | 8 | 30 | 1 | 12 | 46 | | Medium | 16 | 64 | 2 | 32 | 64 | | Large | 72 | 468 | 3.5 | 48 | 184 | ## Private networking Private networking is available on the [Enterprise plan](https://chainstack.com/pricing/). Private networking is an exclusive functionality available to nodes hosted on AWS. It leverages the advanced technology of [AWS PrivateLink](https://aws.amazon.com/privatelink/) to achieve a significant reduction in latency. To set up private networking for your node, your DApp must be located in the same region as your AWS node, e.g. us-east-1. When using private networking, node interaction will be performed through AWS internal infrastructure and will not be exposed to the public internet. This approach enhances security measures and minimizes processing time, delivering optimal performance. ## Node customization Node customization is available on the [Enterprise plan](https://chainstack.com/pricing/) for [dedicated nodes](/docs/dedicated-node). Below you can find the current customizations that can be used for your node. Feel free to check with us for any additional customization that's not on the list, as these are just examples. 
### Transaction settings

* **txpool.pricebump** — the minimum increase in the transaction fee required to replace a pending transaction with the same nonce. The default value is `10%`. Applicable for EVM-based protocols.
* **rpc.txfeecap** — a limit on the transaction fee. The default value is `100 base currency`. Applicable for EVM-based protocols.
* **rpc.gascap** — a limit on the gas that can be used to run `eth_call` and `eth_estimateGas`. The default value is `0.5 base currency`. Applicable for EVM-based protocols.
* **txlookuplimit** — the maximum number of recent blocks whose transaction indices are retained. By default, for Ethereum and Polygon, the number is `2350000`.

**Base currency** is the native token of a protocol.

### Node and network customizations

* **Dedicated gateway** — provides a [dedicated connection to the node](/docs/trader-node#dedicated-gateways), eliminating some limitations, such as the need to reconnect.
* **Custom EVM tracers** — we can enable [custom JavaScript tracers](/reference/custom-js-tracing-ethereum) for your dedicated EVM node.
* **Scaling node resources** — options to scale up node resources like storage, CPU, and RAM.
* **Tailored load balancing** — a dedicated gateway that balances the request load between multiple dedicated nodes.
* **Private networking** — connect your application running on AWS to a Chainstack node running on AWS through a private network. This eliminates the need for your application and the node to communicate over the internet, reducing latency and increasing speed.
* **Indexing** for Bitcoin nodes.

For any customizations not listed here, contact [Chainstack support](https://support.chainstack.com/). We are always happy to explore your case and help you.
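As a concrete illustration of the `txpool.pricebump` setting above, a minimal sketch of the arithmetic a node applies when deciding whether a replacement transaction is acceptable; `min_replacement_gas_price` is a hypothetical helper, not a Chainstack API:

```python
def min_replacement_gas_price(pending_gas_price_wei: int, price_bump_percent: int = 10) -> int:
    """Smallest gas price accepted as a replacement for a same-nonce pending tx.

    With the default txpool.pricebump of 10%, a replacement transaction must
    raise the gas price by at least 10%, or the node rejects it.
    Integer wei arithmetic, rounding down.
    """
    return pending_gas_price_wei * (100 + price_bump_percent) // 100

# A pending transaction at 30 gwei needs at least 33 gwei to be
# replaced under the default 10% price bump.
```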
# Fetching transactions to and from a specific address with eth_getBlockByNumber Source: https://docs.chainstack.com/docs/fetching-transactions-to-and-from-a-specific-address-with-eth_getblockbynumber **TLDR** * Fetching Ethereum transactions for a specific address can be done with Python, the Web3 library, and a Chainstack node. * Use eth\_getBlockByNumber to inspect each block’s transactions, filter by from and to addresses, then compile relevant tx hashes. * You can speed up retrieval by parallelizing block fetches – multi-threading can significantly cut down the total runtime. * This approach is crucial for auditing, real-time monitoring, or any scenario requiring granular, address-based blockchain data. ## Introduction In the rapidly evolving world of blockchain technology, Ethereum has emerged as a leading platform for decentralized applications (DApps), smart contracts, and token transactions. As the ecosystem grows, so does the need for tools and methods that provide insights into network activity. One such tool is the ability to fetch transactions to and from specific Ethereum addresses, an invaluable capability for a range of applications—from auditing and analytics to monitoring and compliance. Whether you're a DApp developer looking to debug contract interactions, a financial analyst tracking asset flows, or an Ethereum user wanting to keep tabs on your transactions, understanding how to fetch transaction data programmatically is essential. This article aims to fill that knowledge gap by providing a step-by-step guide on how to fetch transactions for a specific Ethereum address using Python and Chainstack. We'll walk you through the entire process, from setting up your development environment to customizing and optimizing your code for various use cases. ## Setting up the environment Before diving into the code, it's crucial to set up a proper development environment. 
This ensures you have all the necessary tools and configurations to run the code smoothly. Below are the components you'll need and the steps to set them up. ### Python and Web3 library 1. **Install Python**. If you haven't already, download and install Python from the [official website](https://www.python.org/downloads/). 2. **Create a virtual environment**. In your project directory, create a new virtual environment: ```shell Shell python3 -m venv tx_monitor ``` Then activate it: ```shell Shell source tx_monitor/bin/activate ``` 3. **Install the Web3 library**. Open your terminal and run the following command to install the Web3 library: ```shell Shell pip install web3 ``` ### Blockchain RPC endpoint with Chainstack You'll need access to an RPC node to interact with a blockchain. Chainstack provides a straightforward way to deploy and manage blockchain nodes. 1. **Sign up for Chainstack**. Visit the [Chainstack signup page](https://console.chainstack.com/user/account/create) and follow the instructions to create an account. Try out the new [social login](https://twitter.com/ChainstackHQ/status/1709988403893223599) feature. 2. **Deploy a node**. Once your account is set up, you can deploy an Ethereum node. Follow the [Chainstack documentation](/docs/manage-your-networks#join-a-public-network) for a step-by-step guide on how to do this. 3. **Access node credentials**. After deploying your node, you'll need the RPC URL to interact with the Ethereum network. You can find this information in the Chainstack dashboard. For more details, see [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials). In this article, we use a BNB Smart Chain node as an example. ## The code’s logic Now, let’s go over the logic we implement in this project. The code will take the following steps: 1. **Initialization**. Set up the BNB Smart Chain node connection and specify the block range and address to monitor. 2. 
**Core functionality**. Use the `eth_getBlockByNumber` method indirectly through Web3's `get_block` function to fetch each block within the specified range. Learn more about [eth\_getBlockByNumber](/reference/ethereum_node_api/blocks_info/eth_getBlockByNumber).
3. **Transaction filtering**. For each fetched block, scan through the list of transactions and identify those that involve the specified Ethereum address. It does this by checking the `from` and `to` fields in each transaction.
4. **Result compilation**. All transactions involving the specified address are compiled into a list printed out at the end.

### Importance of `eth_getBlockByNumber`

The `eth_getBlockByNumber` method is crucial here as it allows the script to fetch the entire block data, including the full list of transactions within each block. By doing so, the script can then iterate through these transactions to identify those that involve the specified address. This method provides a way to access historical data on the blockchain, making it a valuable tool for auditing and monitoring activities.

In a new file, paste the following code:

```python Python
from web3 import Web3
from web3.middleware import geth_poa_middleware

rpc_url = "YOUR_CHAINSTACK_ENDPOINT"  # Change it to your Chainstack node URL
from_block = 32720146  # Define the block interval you are interested in
to_block = 32720209
your_address = "0x2D4C407BBe49438ED859fe965b140dcF1aaB71a9"  # Define the address of interest

w3 = Web3(Web3.HTTPProvider(rpc_url))
# The PoA middleware is needed for some specific protocols given some
# extra bytes in the response — for Polygon, for example, this is needed.
w3.middleware_onion.inject(geth_poa_middleware, layer=0)

def get_transactions_for_address(address, from_block, to_block):
    transactions = []
    for block_number in range(from_block, to_block + 1):
        print(f'Inspecting block {block_number}')
        block = w3.eth.get_block(block_number, full_transactions=True)
        if block is not None and 'transactions' in block:
            for tx in block['transactions']:
                if address in [tx['from'], tx['to']]:
                    transactions.append(tx['hash'].hex())
                    print(f'TX involving {address} here')
    return transactions

if __name__ == "__main__":
    print(f'Scanning for TXs involving {your_address}')
    transactions = get_transactions_for_address(your_address, from_block, to_block)
    if transactions:
        print(f"Transactions involving address {your_address}:")
        for tx_hash in transactions:
            print(tx_hash)
    else:
        print(f"No transactions found for address {your_address} in the specified block range.")
```

## Fetching transactions with precision

At the heart of our script is the `get_transactions_for_address` function, a meticulously designed piece of logic that scans the blockchain to identify transactions involving a specific address within a user-defined range of blocks. Here's an overview of its operation:

* The function embarks on a block-by-block journey, traversing from the starting block number defined in `from_block` to the ending block number in `to_block`.
* The function scrutinizes every transaction within each block, examining the sender (`from`) and recipient (`to`) addresses. The transaction hash is captured and stored in a list if it identifies a transaction where the specified Ethereum address is either the sender or the recipient.

This function is a robust yet straightforward mechanism for compiling a transaction history for a given Ethereum address. Doing so offers invaluable insights for auditing, monitoring, or any other use cases requiring a detailed transaction history.
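The per-transaction check at the core of the function can also be factored out and extended — for example, with a minimum-value filter. A hedged sketch; `tx_matches` is a hypothetical helper, not part of the script above:

```python
def tx_matches(tx: dict, address: str, min_value_wei: int = 0) -> bool:
    """True if the transaction involves `address` and moves at least `min_value_wei`.

    Mirrors the from/to check inside get_transactions_for_address,
    with an optional value filter added on top.
    `tx` is a transaction dict as returned by get_block(full_transactions=True).
    """
    involves = address in (tx.get('from'), tx.get('to'))
    return involves and tx.get('value', 0) >= min_value_wei
```

Swapping this helper into the inner loop keeps the traversal logic unchanged while letting you tighten the filter without touching the rest of the script.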
## The significance of transaction tracking ### Transaction monitoring: A multi-faceted utility The capability to accurately retrieve transactions associated with a particular address is not just a feature—it's an essential tool with diverse applications, such as: * **For wallet users**. This functionality enables wallet users to meticulously monitor incoming and outgoing transactions, thereby providing a reliable way to verify that all transactions have been executed as expected. * **For developers**. This tool is especially beneficial for developers who must keep tabs on interactions with their deployed smart contracts. It serves as a real-time monitoring system and an auditing mechanism to ensure the contracts operate as designed. ### Real-world applications To further illustrate the utility of this tool, let's dive into some scenarios where it can be particularly impactful: * Token asset management — the ability to track token transfers to and from a specific address is invaluable for managing your digital assets effectively. It provides a transparent view of your token portfolio's activity. * Transaction verification — for businesses and individual users, confirming the successful receipt of funds is paramount. This tool simplifies that process by providing a straightforward way to verify transaction completion. * Smart contract oversight — for developers and auditors alike, this tool is important for continuously monitoring and auditing smart contract interactions. It provides a granular view of transactional data, aiding in both development and compliance efforts. ## Fine-tuning and performance enhancement ### Strategic block range selection One cornerstone of maximizing this script's utility is the judicious selection of the `from_block` and `to_block` parameters. These values define the scope of your transaction search, and their optimal setting depends on your specific use case. 
Whether you're interested in a brief snapshot of recent activity or an exhaustive historical audit, adjusting these parameters allows you to focus your query accordingly. ### Advanced transaction filtering For those looking to go beyond basic transaction tracking, the script can be further customized by implementing additional filters. For example, you could refine your search by filtering transactions based on their value or by isolating only those transactions that interact with a particular smart contract. This level of granularity enables you to extract precisely the data you're interested in, making your monitoring efforts more targeted and efficient. ## Leveraging parallel computing for efficiency Performance optimization becomes a key consideration because the `get_block` method can be resource-intensive—especially when dealing with a large range of blocks. One effective strategy for speeding up the data retrieval process is through parallelization. By employing multi-threading, you can distribute the workload across multiple threads, reducing overall runtime. The script includes a parallelized version of the core function, demonstrating how multi-threading can significantly improve performance. Learn more about multithreading in Python in [Mastering multithreading in Python for Web3 requests: A comprehensive guide](/docs/mastering-multithreading-in-python-for-web3-requests-a-comprehensive-guide). 
```python Python
from web3 import Web3
from web3.middleware import geth_poa_middleware
import concurrent.futures
import time

# Use the same RPC URL as in your original code
rpc_url = "YOUR_CHAINSTACK_ENDPOINT"

# Use the same block range as in your original code
from_block = 32720146
to_block = 32720209

# Use the same Ethereum address as in your original code
your_address = "0x2D4C407BBe49438ED859fe965b140dcF1aaB71a9"

w3 = Web3(Web3.HTTPProvider(rpc_url))
w3.middleware_onion.inject(geth_poa_middleware, layer=0)

def get_transactions_for_block_range(address, from_block, to_block, block_number, num_threads):
    transactions = []
    for block_num in range(from_block + block_number, to_block + 1, num_threads):
        block = w3.eth.get_block(block_num, full_transactions=True)
        if block is not None and 'transactions' in block:
            for tx in block['transactions']:
                if address in [tx['from'], tx['to']]:
                    transactions.append(tx['hash'].hex())
                    print(f'TX involving {address} found')
    return transactions

if __name__ == "__main__":
    latencies = []
    for num_threads in [1, 2, 4, 8]:
        start_time = time.time()
        with concurrent.futures.ThreadPoolExecutor(max_workers=num_threads) as executor:
            futures = []
            for i in range(num_threads):
                futures.append(
                    executor.submit(
                        get_transactions_for_block_range,
                        your_address, from_block, to_block, i, num_threads
                    )
                )
            transactions = []
            for future in concurrent.futures.as_completed(futures):
                transactions.extend(future.result())
        end_time = time.time()
        latency = end_time - start_time
        latencies.append((num_threads, latency))
        if transactions:
            print(f"Transactions involving address {your_address} with {num_threads} threads:")
            for tx_hash in transactions:
                print(tx_hash)
        else:
            print(f"No transactions found for address {your_address} in the specified block range.")

    print("\nPerformance Metrics:")
    for num_threads, latency in latencies:
        print(f"Threads: {num_threads}, Latency: {latency:.2f} seconds")
```

This code will output the difference found while simultaneously using 1, 2, 4,
and 8 threads. Here is the result we found:

```python Python
Performance Metrics:
Threads: 1, Latency: 22.53 seconds
Threads: 2, Latency: 12.35 seconds
Threads: 4, Latency: 6.62 seconds
Threads: 8, Latency: 3.33 seconds
```

The code tests the performance with 1, 2, 4, and 8 threads, allowing us to observe how the system scales with increasing threads. Here's a breakdown of the results:

* **Single-threaded (1 thread)**. When running the code with a single thread, it took 22.53 seconds to complete the task. This serves as our baseline for performance comparison.
* **Dual-threaded (2 threads)**. With two threads, the latency dropped to 12.35 seconds. This is almost a 45% reduction in time compared to the single-threaded execution, showcasing the benefits of parallelization.
* **Quad-threaded (4 threads)**. Utilizing four threads further reduced the latency to 6.62 seconds. This is roughly a 70% reduction compared to the single-threaded baseline.
* **Octa-threaded (8 threads)**. Finally, with eight threads, the latency was minimized to 3.33 seconds. This is an astonishing reduction of about 85% compared to the single-threaded execution.

These results demonstrate the power of multithreading in optimizing computational tasks. The more threads we use, the lower the latency, up to a point. This is a classic example of how parallel computing can significantly speed up inherently parallelizable tasks, like fetching transactions from different blocks in a blockchain.

### Hardware constraints

The number of threads you can effectively use is often limited by the hardware on which the code is running. Most modern CPUs have multiple cores, and each core can run one or more threads. It's generally a good idea to align the number of threads with the number of available CPU cores for optimal performance.

### Scalability testing

The numbers 1, 2, 4, and 8 were chosen to provide a broad spectrum for scalability testing.
Starting with a single thread provides a baseline performance metric. Doubling the number of threads at each step (1 to 2, 2 to 4, and 4 to 8) allows us to observe how the system scales with increasing threads. This is a common approach to gauge the "speedup" factor and to understand if the application benefits from parallelization.

### Diminishing returns

It's essential to note that adding more threads doesn't always result in linear performance improvement. Due to factors like thread management overhead and resource contention, there's a point beyond which adding more threads may yield diminishing returns or even degrade performance. That's why it's useful to test with various numbers of threads to find the "sweet spot."

### Task granularity

The granularity of the task at hand also influences the optimal number of threads. If the task can be broken down into smaller, independent sub-tasks (as is the case with fetching transactions from different blocks), it's more likely to benefit from multi-threading. However, if the task involves many shared states or resources, adding more threads might lead to issues like race conditions or deadlocks.

## Conclusion

The capacity to retrieve transactions associated with a particular address is an indispensable tool for anyone engaged in the blockchain space. This utility transcends roles, offering valuable insights for wallet users, developers, and auditors alike. It not only enhances your understanding of transactional flows but also fortifies the transparency and accountability that are the cornerstones of blockchain technology.

Through thoughtful customization and optimization, you can fine-tune this code to serve many applications. Whether you're tracking fund movements, auditing smart contracts, or monitoring network activity, the code provides a robust foundation upon which you can build.
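Picking up the hardware constraints note above, the worker count can also be derived from the machine's core count rather than hard-coded. A minimal sketch; `pick_thread_count` and the cap of 8 (mirroring the largest pool tested here) are illustrative assumptions:

```python
import os

def pick_thread_count(requested, cap=8):
    # os.cpu_count() may return None on some platforms, so fall back to 1
    cores = os.cpu_count() or 1
    return max(1, min(requested, cores, cap))

# Ask for 16 workers; get at most min(available cores, 8)
print(pick_thread_count(16))
```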
The introduction of multi-threading options showcases how performance can be significantly improved, allowing for more efficient data retrieval and analysis. This adaptability underscores the code's versatility, making it a reliable real-time monitoring and historical data analysis solution.

# Fetching transfer events with web3.js getPastEvents for a BAYC NFT

Source: https://docs.chainstack.com/docs/fetching-transfer-events-with-getpastevents-for-a-bayc-nft

**TLDR**

* Set up a node.js script using web3.js to connect to Ethereum via a Chainstack endpoint.
* Use the `getPastEvents` function to retrieve historical events emitted by a smart contract.
* Apply best practices such as limited block ranges, filters, pagination, and error handling to keep queries efficient.
* Walk through fetching `Transfer` events for a specific Bored Ape Yacht Club NFT.

## Main article

As the Ethereum ecosystem matures and sees more advanced DApps and smart contract interactions, efficiently retrieving historical data becomes paramount. One of the tools developers can leverage for this purpose is the `getPastEvents` function from the web3.js library. This tool, while powerful, has nuances and limitations that developers must be aware of to ensure efficient application development.

## About `getPastEvents`

The `getPastEvents` function is a part of the web3.js library, which provides an interface for developers to interact with the Ethereum blockchain. It's specifically tailored to fetch past events emitted by Ethereum smart contracts, making it a go-to method for DApp developers needing historical contract event data.

Read [Tracking some Bored Apes: The Ethereum event logs tutorial](/docs/tracking-some-bored-apes-the-ethereum-event-logs-tutorial) to learn more about event logs.

## Best practices when using `getPastEvents`

To maximize the efficiency and reliability of your event data retrieval, follow these guidelines:

1. **Limit the block range**.
Always adhere to recommended block range limits. This minimizes the risk of time-consuming queries or receiving excessively large data sets.
2. **Use filters**. `getPastEvents` offers filtering options. Leverage them to refine the events you want, making queries more efficient.
3. **Pagination**. If you must retrieve many events, consider breaking down your requests—query in chunks to prevent potential timeouts or massive responses.
4. **Error handling**. Be prepared for potential failures due to network glitches or other unforeseen issues. Implement robust error handling and a retry mechanism.

By following these guidelines and understanding the intricacies of `getPastEvents`, developers can efficiently and effectively integrate historical event data retrieval into their DApps. Check out the [Understanding eth\_getLogs limitations](/docs/understanding-eth-getlogs-limitations) guide to learn more about event retrieval best practices.

## Real-world example: Retrieving `Transfer` events for a specific Bored Ape

Bored Ape Yacht Club (BAYC) is a well-known collection of NFTs on the Ethereum blockchain. Each Bored Ape is a unique digital asset; ownership transfers are recorded as `Transfer` events on the blockchain. In this example, we'll demonstrate using the `getPastEvents` function to retrieve all `Transfer` events for a Bored Ape.

## Prerequisites

Read [Web3 node.js: From zero to a full-fledged project](/docs/web3-nodejs-from-zero-to-a-full-fledged-project) to learn how to manage a node.js project.

## Get an Ethereum node

Follow these steps to deploy an Ethereum node:

1. [Sign up with Chainstack](https://console.chainstack.com/user/account/create).
2. [Deploy a node](/docs/manage-your-networks#join-a-public-network).
3. [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials).

## Setup

For this example, we will be using node.js, so let's set up a project.

### Step 1.
Setup

Initialize a new node.js project by running:

```shell Shell
npm init -y
```

Install the web3.js library:

```shell Shell
npm install web3
```

### Step 2. Writing the script

In the directory where you created the new project, create a new file named `index.js` and paste the following code into it. This will set up the BAYC ABI, initialize the contract, and fetch `Transfer` events.

```jsx index.js
const { Web3 } = require("web3");

const url = "YOUR_CHAINSTACK_NODE"; // Replace with your Chainstack Ethereum node endpoint
const web3 = new Web3(new Web3.providers.HttpProvider(url));

// ABI for the Bored Ape Yacht Club contract, only including the Transfer event
const BAYC_ABI = [
  {
    anonymous: false,
    inputs: [
      {
        indexed: true,
        internalType: "address",
        name: "from",
        type: "address",
      },
      {
        indexed: true,
        internalType: "address",
        name: "to",
        type: "address",
      },
      {
        indexed: true,
        internalType: "uint256",
        name: "tokenId",
        type: "uint256",
      },
    ],
    name: "Transfer",
    type: "event",
  },
];

const BAYC_CONTRACT_ADDRESS = "0xBC4CA0EdA7647A8aB7C2061c2E118A18a936f13D";
const contract = new web3.eth.Contract(BAYC_ABI, BAYC_CONTRACT_ADDRESS);

async function fetchTransfersForTokenId(tokenId) {
  try {
    // Fetch the latest block number and calculate the target block
    const latestBlock = await web3.eth.getBlockNumber();
    const target = Number(latestBlock) - 10000;

    // Fetch Transfer events for the given token ID
    const events = await contract.getPastEvents("Transfer", {
      filter: { tokenId: tokenId },
      fromBlock: target,
      toBlock: "latest",
    });

    console.log(
      `Total transfers for Bored Ape ${tokenId}: ${events.length} transfers`
    );

    // Iterate through the events and log details
    for (let event of events) {
      console.log(
        `From: ${event.returnValues.from} To: ${event.returnValues.to} at block: ${event.blockNumber}`
      );
    }
  } catch (error) {
    console.error(`Error fetching transfers for token ${tokenId}:`, error);
  }
}

// Fetch transfers for BAYC NFT ID 7924
fetchTransfersForTokenId(7924);
```

### Step 3.
Run the script

Execute the script with:

```shell Shell
node index.js
```

This will display the transfer history of the past 10,000 blocks for the BAYC token with ID 7924, detailing from which address to which address the token was transferred and the block at which the transfer occurred. Note that there might not be any transfers as we are only querying the past 10,000 blocks.

## Understanding the code

This code provides a practical example of using web3.js to fetch and analyze specific contract events on the Ethereum blockchain, demonstrating a common pattern used in blockchain development and analysis.

1. **Defining BAYC ABI**. The ABI (application binary interface) for the BAYC contract is defined, focusing on the `Transfer` event. The ABI is a critical component that enables the script to interact with the smart contract's functions and events.
2. **Defining BAYC contract address**. The address of the BAYC contract on the Ethereum blockchain is specified.
3. **Initializing contract instance**. A contract instance is created using the ABI and contract address. This instance provides methods to interact with the contract, including fetching past events.
4. **Defining the main function (`fetchTransfersForTokenId`)**:
   * **Fetching latest block number**. The script fetches the latest block number from the Ethereum blockchain.
   * **Calculating target block**. The target block is calculated by subtracting 10,000 from the latest block number. This defines the range of blocks to query for past events.
   * **Fetching transfer events**. The `getPastEvents` method is called on the contract instance to fetch all `Transfer` events related to the specified token ID within the block range.
   * **Logging transfers**. The script logs the total number of transfers for the specified Bored Ape and iterates through the events, logging details such as the sender, receiver, and block number.
   * **Error handling**. If an error occurs during the process, it's caught and logged to the console.
5. **Calling the main function**. Finally, the `fetchTransfersForTokenId` function is called with a specific token ID (7924) to fetch and log the transfer events for that Bored Ape.

Note how we are checking the past 10,000 blocks, not the entire chain. If you want to index the events from the entire chain, you will need multiple rounds or a [Subgraph](/docs/subgraphs-tutorial-a-beginners-guide-to-getting-started-with-the-graph).

## Conclusion

Fetching historical event data from the Ethereum blockchain is a common requirement for developers working with DApps and smart contracts. The `getPastEvents` function in the web3.js library offers a powerful and flexible way to retrieve such data, but it also comes with nuances and limitations that must be carefully managed.

In this guide, we explored the `getPastEvents` function, focusing on its application in retrieving `Transfer` events for a specific Bored Ape Yacht Club NFT. We discussed best practices for `getPastEvents`, including limiting the block range, using filters, implementing pagination, and handling errors. We also provided a step-by-step example to demonstrate how to set up a node.js project, initialize the web3.js library, and write a script to fetch and display transfer events for a BAYC NFT.

### About the author

Technical Support Engineer @ Chainstack

JUST BUIDL IT!

[](https://github.com/0x6564) [](https://twitter.com/edeenn22) [](https://www.linkedin.com/in/edindr/)

# Filecoin tooling

Source: https://docs.chainstack.com/docs/filecoin-tooling

### No Filecoin support

Chainstack deprecated support for Filecoin nodes. This page is kept for legacy reasons in case you find it useful.

## MetaMask

You can set your [MetaMask](https://metamask.io/) to interact through your Filecoin nodes.

Open your MetaMask and click the network selector.

In the network selector, click **Custom RPC**.

In the **New RPC URL** field, enter the node endpoint.
In the **Chain ID** field, enter the ID of the network:

* Calibration Testnet: `314159`

Click **Save**.

## Truffle

Configure [Truffle Suite](https://truffleframework.com) to deploy contracts to your Filecoin nodes.

Install [Truffle Suite](https://truffleframework.com), [HD Wallet-enabled Web3 provider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider), and create a project.

Create a new environment in `truffle-config.js`, add your mnemonic phrase generated by [a wallet](https://ethereum.org/en/wallets/find-wallet/) and the Filecoin endpoint instead of FILECOIN\_ENDPOINT:

```javascript Javascript
const HDWalletProvider = require("@truffle/hdwallet-provider");
const mnemonic = 'pattern enroll upgrade ...';
...
module.exports = {
  networks: {
    chainstack: {
      provider: () => new HDWalletProvider(mnemonic, "FILECOIN_ENDPOINT"),
      network_id: "*"
    },
  }
};
```

## Remix IDE

To make Remix IDE interact with the network through a Filecoin node:

Get [MetaMask](https://metamask.io/) and set it to interact through a Filecoin node. See [Interacting through MetaMask](#metamask).

In Remix IDE, navigate to the **Deploy** tab.

Select **Injected Provider - MetaMask** in **Deploy & run transactions**.

This will engage MetaMask and make Remix IDE interact with the network through a Filecoin node.

## web3.js

Build DApps using [web3.js](https://github.com/web3/web3.js) and Filecoin nodes.

Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP or WebSocket.

### HTTP

Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number:

```javascript Javascript
const Web3 = require('web3');

const web3 = new Web3(new Web3.providers.HttpProvider('FILECOIN_ENDPOINT'));
web3.eth.getBlockNumber().then(console.log);
```

where FILECOIN\_ENDPOINT is your node HTTPS endpoint.
### WebSocket

Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number:

```javascript Javascript
const Web3 = require('web3');

const web3 = new Web3(new Web3.providers.WebsocketProvider('FILECOIN_ENDPOINT'));
web3.eth.getBlockNumber().then(console.log);
```

where FILECOIN\_ENDPOINT is your node WSS endpoint protected either with the key or password.

## web3.py

Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Filecoin nodes.

Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket).

### HTTP

Use the `HTTPProvider` to connect to your node endpoint and get the latest block number.

```python Key Protected
from web3 import Web3

web3 = Web3(Web3.HTTPProvider('FILECOIN_ENDPOINT'))
print(web3.eth.blockNumber)
```

```python Password Protected
from web3 import Web3

web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME")))
print(web3.eth.blockNumber)
```

where

* FILECOIN\_ENDPOINT — your node HTTPS endpoint
* HOSTNAME — your node HTTPS endpoint hostname

### WebSocket

Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number.

```python Key Protected
from web3 import Web3

web3 = Web3(Web3.WebsocketProvider('FILECOIN_ENDPOINT'))
print(web3.eth.blockNumber)
```

```python Password Protected
from web3 import Web3

web3 = Web3(Web3.WebsocketProvider('wss://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME")))
print(web3.eth.blockNumber)
```

where

* FILECOIN\_ENDPOINT — your node WSS endpoint
* HOSTNAME — your node WSS endpoint hostname

See also [WebSocket connection to an EVM node](https://support.chainstack.com/hc/en-us/articles/900001918763-WebSocket-connection-to-an-Ethereum-node).
## ethers.js

Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Filecoin nodes.

Install [ethers.js](https://www.npmjs.com/package/ethers). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket).

### HTTP

Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number:

```javascript Key Protected
const { ethers } = require("ethers");

var urlInfo = {
  url: 'FILECOIN_ENDPOINT'
};
var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID);
provider.getBlockNumber().then(console.log);
```

```javascript Password Protected
const { ethers } = require("ethers");

var urlInfo = {
  url: 'YOUR_CHAINSTACK_ENDPOINT',
  user: 'USERNAME',
  password: 'PASSWORD'
};
var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID);
provider.getBlockNumber().then(console.log);
```

where

* FILECOIN\_ENDPOINT — your node HTTPS endpoint
* NETWORK\_ID — Filecoin network ID:
  * Calibration Testnet: `314159`

### WebSocket

Use the `WebSocketProvider` object to connect to your node WSS endpoint and get the latest block number:

```javascript Javascript
const { ethers } = require("ethers");

const provider = new ethers.providers.WebSocketProvider('FILECOIN_ENDPOINT', NETWORK_ID);
provider.getBlockNumber().then(console.log);
```

where

* FILECOIN\_ENDPOINT — your node WSS endpoint
* NETWORK\_ID — Filecoin network ID:
  * Calibration Testnet: `314159`

# Filecoin: Deploy a deal-making contract using Hardhat

Source: https://docs.chainstack.com/docs/filecoin-tutorial-deploy-a-deal-making-contract-on-filecoin-with-hardhat

### Filecoin support is deprecated

Chainstack deprecated support for Filecoin nodes. This page is kept for legacy reasons in case you find it useful.

## Introduction

This tutorial will show you how to deploy a deal-making smart contract on the Calibration Testnet.
The Calibration Testnet supports the FEVM implementation.

### Filecoin Virtual Machine and Filecoin EVM

The **Filecoin Virtual Machine**, or FVM, serves as the backbone of the Filecoin network, providing a powerful runtime environment for the execution of smart contracts, known as **actors**. Actors can be written in Solidity and, in the future, in any language that compiles to WebAssembly, empowering developers to establish and enforce a set of rules to store and retrieve data on the Filecoin network. The FVM acts as a gatekeeper, ensuring the integrity of stored data and enforcing the terms of storage deals, such as data retention and retrieval times, making the Filecoin network a safe and reliable platform for decentralized data storage.

The **Filecoin Ethereum Virtual Machine**, or FEVM, brings the power of the Ethereum Virtual Machine (EVM) to the Filecoin network. The FEVM is virtualized as a runtime layer on top of the Filecoin Virtual Machine, allowing for the execution of EVM smart contracts on the network. With the FEVM, developers can quickly and easily start writing actors on the Filecoin blockchain, utilizing all of the familiar tools, packages, and languages they are used to while having access to Filecoin's unique storage capabilities, opening up new possibilities and opportunities for DApp development.

Follow the FEVM implementation progress on the [Filecoin docs](https://docs.filecoin.io/developers/smart-contracts/concepts/filecoin-evm/#fevm-and-native-fvm).

### Filecoin actors

In the Filecoin network, **actors** are a crucial component and play a role analogous to smart contracts in the Ethereum Virtual Machine. These actors are integral to the system's operations, as any change to the state of the Filecoin blockchain necessitates the initiation of an actor method invocation. The code that defines an actor is divided into various methods.
When messages are sent to an actor, they include information specifying which method(s) to call and the input parameters for those methods. Moreover, actor code interacts with a runtime object that contains information about the overall state of the network, such as the current epoch, cryptographic signatures, and proof validations.

Just like smart contracts on other blockchain platforms, actors in the Filecoin network must pay a gas fee. This fee, denominated in Filecoin's native cryptocurrency (FIL), is used to offset the cost of a transaction, which includes the network resources used to process the transaction. Every actor in the Filecoin network has several attributes associated with it: a Filecoin balance, a state pointer, a code that identifies its type, and a nonce, which tracks the number of messages sent by this actor.

Actors in the Filecoin network fall into two categories:

1. **Built-in actors** — these are pre-programmed actors that the Filecoin network team writes and deploys directly into the network. They come pre-installed and are designed to execute specific functions within the network. For instance, the `StorageMinerActor` is a built-in actor that handles storage mining operations and the collection of proofs.
2. **User actors** — these are akin to smart contracts that developers can create and deploy on the Filecoin network via the FVM. User actors are developed by third-party developers and offer a broad spectrum of functionalities within the network. They provide the flexibility for developers to create custom solutions and applications on the Filecoin network.

## Prerequisites

* A Filecoin node
* [Hardhat](https://hardhat.org/) to create and deploy contracts
* [yarn installed](https://classic.yarnpkg.com/lang/en/docs/install/) on your machine

## Overview

To get from zero to an emitted deal proposal on the Calibration Testnet, do the following:

Get a Filecoin node endpoint.

Clone the FEVM deal-making kit from Filecoin’s GitHub.
Fund your wallet from the Filecoin faucet.

Prepare files for storage on the Filecoin network, converting them into `.car`.

Deploy the deal-making contract on the Filecoin Calibration Testnet through a node.

Make a deal proposal for the Boost storage providers to pick up.

Add your [Filecoin endpoint to MetaMask](/docs/filecoin-tooling#metamask).

### Clone the FEVM deal-making kit from Filecoin’s GitHub

The Filecoin team has developed a comprehensive FEVM deal-making kit. This kit provides developers with a suite of tools designed to facilitate the deployment of deal-making smart contracts and the creation of deal proposals, thereby streamlining the process of engaging with the Filecoin network.

Clone the [FEVM deal-making kit](https://github.com/filecoin-project/fvm-starter-kit-deal-making) repository. Note that this kit includes a submodule to convert files into `.car`; use the following command to clone the submodule as well:

```shell Shell
git clone --recurse-submodules https://github.com/filecoin-project/fvm-starter-kit-deal-making.git
```

Move the terminal into the root folder of the project:

```shell Shell
cd fvm-starter-kit-deal-making
```

Then install the dependencies running the yarn command:

```shell Shell
yarn install
```

### Set up environment variables and network configuration

In the `.env.example` file, delete its content and paste the following:

```yaml .env.example
PRIVATE_KEY="YOUR_PRIVATE_KEY"
CALIBRATION_URL="FILECOIN_ENDPOINT"
CHAIN_ID=314159
```

Add the private key of the account you intend to use for deploying the smart contract and interacting with the network, along with the URL of the Calibration Testnet. This step is crucial for establishing a secure connection using environment variables. Finally, save the file and rename it to `.env`.

### Fund your wallet from the Filecoin faucet

Before proceeding further, ensure you have some testnet funds from the Filecoin faucet.
* [Filecoin faucet](https://faucet.calibration.fildev.network/funds.html)

Clicking the above link will direct you to a page where you can input your wallet address. This faucet accepts both Ethereum format addresses and Filecoin's f4address format.

The FEVM deal-making kit comes equipped with a Hardhat task that allows for the conversion of an Ethereum address to an f4address using a private key. If you wish to determine your f4address, execute the command below after configuring your private key in the `.env` file.

```shell Shell
yarn hardhat get-address
```

You will receive a response like the following:

```shell Shell
Ethereum address (this addresss should work for most tools): 0x8f8e7012F8F974707A8F11C7cfFC5d45EfF5c2Ae
f4address (also known as t4 address on testnets): f410fr6hhaexy7f2ha6upchd477c5ixx7lqvoim54aeq
```

You can input either the Ethereum address or the f4address into the faucet. The faucet will provide a message ID. This ID can be used in the [Filecoin Explorer](https://calibration.filscout.com/en) to view the details of the transaction.

### Prepare files for storage on the Filecoin network

Files intended for upload to the Filecoin network must be in the `.car` format. Prior to proposing a deal, these files need to be converted, and certain necessary information must be obtained. The most straightforward method to accomplish this is by utilizing the [FVM Data Depot](https://data.lighthouse.storage/) page, which we will explore in this tutorial. Alternatively, you can use the included submodule in the repository to generate `.car` files. Instructions for this process can be found within the [repository](https://github.com/filecoin-project/fvm-starter-kit-deal-making#option-b-use-the-generate-car-tool-locally).

Log in to the FVM Data Depot page and click **Upload New File**. Once the process is done, you can access the page with the information we’ll need.
To propose a deal, we will need the following:

* **Piece CID**
* **Piece Size**
* **CAR Size**
* **URL**

### Edit `hardhat.config.js`

Before proceeding with the contract deployment, it is necessary to modify the config file in order to incorporate the Calibration endpoint from the environment variables. To accomplish this, please follow the steps below:

1. Locate the `hardhat.config.js` file.
2. Remove all existing content from the file.
3. Copy and paste the following code into the `hardhat.config.js` file:

```javascript hardhat.config.js
require("@nomicfoundation/hardhat-toolbox")
require("hardhat-deploy")
require("hardhat-deploy-ethers")
require("./tasks")
require("dotenv").config()

const PRIVATE_KEY = process.env.PRIVATE_KEY

/** @type import('hardhat/config').HardhatUserConfig */
module.exports = {
    solidity: {
        version: "0.8.17",
        settings: {
            optimizer: {
                enabled: true,
                runs: 1000,
                details: { yul: false },
            },
        },
    },
    defaultNetwork: "Calibration",
    networks: {
        Calibration: {
            chainId: Number(process.env.CHAIN_ID), // env values are strings; Hardhat expects a number
            url: process.env.CALIBRATION_URL,
            accounts: [PRIVATE_KEY],
        },
        FilecoinMainnet: {
            chainId: 314,
            url: "https://api.node.glif.io",
            accounts: [PRIVATE_KEY],
        },
    },
    paths: {
        sources: "./contracts",
        tests: "./test",
        cache: "./cache",
        artifacts: "./artifacts",
    },
}
```

This setup will pick up the Calibration endpoint from the environment variables.

### Deploy the deal-making contract on the Filecoin Calibration Testnet

Now it’s time to deploy the smart contract, which you can find in the `contracts` directory. You will be deploying the `DealClient.sol` smart contract. Here is a high-level overview of the contract:

1. **Imports and structs**. The contract imports several libraries and contracts, including types and utilities, from the `@zondax/filecoin-solidity` package. It also defines several structs to model deal requests and extra parameters associated with these requests.
2. **DealClient contract**.
The main contract, `DealClient`, is designed to facilitate the creation and management of storage deals on the Filecoin network. It includes several mappings to keep track of deal requests, piece requests, piece providers, piece deals, and piece statuses.
3. **Deal proposal creation**. The `makeDealProposal` function allows the contract owner to create a new deal proposal. The proposal is stored in the `dealRequests` array, and various mappings are updated to track the new proposal.
4. **Deal proposal retrieval**. The `getDealProposal` function allows anyone to retrieve a deal proposal given its ID. It returns a CBOR-encoded representation of the proposal.
5. **Deal authentication and notification**. The `authenticateMessage` and `dealNotify` functions are designed to be called by the Filecoin market actor as part of the deal publishing process. They validate the deal proposal and update the contract's state to reflect the published deal.
6. **Deal activation status**. The `updateActivationStatus` function can be called to retrieve the activation status of a deal and update the contract's state accordingly.
7. **Balance management**. The `addBalance` and `withdrawBalance` functions allow the contract owner to add funds to the storage market actor's escrow and withdraw funds from it.
8. **DataCap reception**. The `receiveDataCap` function is designed to be called by the Filecoin datacap actor. It emits an event to signal that datacap has been received.
9. **Filecoin method handling**. The `handle_filecoin_method` function is a universal entry point for calls coming from built-in Filecoin actors. It dispatches these calls to the appropriate function based on the method number.

The contract is deployed by the `0_deploy.js` script in the `deploy` directory. This is a standard Hardhat deploying script, and it will work to deploy your contract.
If you also want to convert the new contract address to an f4 address (the Filecoin explorer does not support Ethereum addresses), delete the code from the deploy file and paste the following: ```javascript 0_deploy.js require("hardhat-deploy") require("hardhat-deploy-ethers") const fa = require("@glif/filecoin-address"); const { networkConfig } = require("../helper-hardhat-config") const private_key = network.config.accounts[0] const wallet = new ethers.Wallet(private_key, ethers.provider) module.exports = async ({ deployments }) => { console.log("Wallet Ethereum Address:", wallet.address) //deploy DealClient const DealClient = await ethers.getContractFactory('DealClient', wallet); console.log('Deploying DealClient...'); const dc = await DealClient.deploy(); await dc.deployed() //Convert Ethereum address to f4 address const f4Address = fa.newDelegatedEthAddress(dc.address).toString(); console.log('DealClient deployed to:', dc.address); console.log('f4Address:', f4Address); } ``` This code only adds a line to convert the Ethereum address to an f4 address, which you can use to find the contract in the [Filecoin explorer](https://calibration.filscout.com/en). Run the following command to deploy the `DealClient` smart contract: ```shell Shell yarn hardhat deploy ``` The console will return something similar to the following: ```bash Bash Wallet Ethereum Address: 0x8f8e7012F8F974707A8F11C7cfFC5d45EfF5c2Ae Deploying DealClient... DealClient deployed to: 0x702E0755450aFb6A72DbE3cAD1fb47BaF3AC525C f4Address: f410foaxaovkfbl5wu4w34pfnd62hxlz2yus4fdk622a ``` Now you can find the smart contract you deployed in the [Filecoin explorer](https://calibration.filscout.com/en). ### Make a deal proposal Before invoking the `makeDealProposal` function from the smart contract, it's important to understand how the payload is constructed and the various parameters involved. To this end, you'll find a struct named `DealRequest` that outlines these parameters.
The `DealRequest` struct looks like this: ```solidity DealRequest /* * The DealRequest struct represents a user's request for this contract to initiate a deal. * This structure is designed in alignment with Filecoin's Deal Proposal, with the exception * of the provider field. The provider is omitted because any provider can respond to a deal * broadcast by this contract. */ struct DealRequest { bytes piece_cid; uint64 piece_size; bool verified_deal; string label; int64 start_epoch; int64 end_epoch; uint256 storage_price_per_epoch; uint256 provider_collateral; uint256 client_collateral; uint64 extra_params_version; ExtraParamsV1 extra_params; } ``` Let’s go over the parameters: * `piece_cid` — a unique identifier of the piece obtained when converting the file to `.car`. * `piece_size` — represents the size of the piece in bytes. * `verified_deal` — indicates whether the deal has been allocated `DataCap` or not. `DataCap` boosts the "quality-adjusted power" of storage providers by a factor of 10, resulting in better block rewards. Notaries receive batches of `DataCap` that they can distribute to clients. `DataCap` is consumed when used for making storage deals. * `label` — refers to the `DataCID` obtained during the conversion of the file to `.car`. * `start_epoch` — represents the epoch at which you want the storage to begin. * `end_epoch` — signifies the epoch at which you want the storage to conclude. * `storage_price_per_epoch` — the price offered per epoch in `attoFil`, similar to wei in Ethereum. This price corresponds to the cost per gigabyte of storage every 30 seconds (one epoch). A value of `1000000000000000000` means the deal offers 1 FIL per gigabyte stored every 30 seconds. The price can be set to zero, indicating it is free. * `provider_collateral` — the collateral that the storage provider must provide for the deal. It can be set to zero. * `client_collateral` — the collateral that you, as the client, must provide for the deal.
It can also be set to zero. * `extra_params_version` — generally set to 1. * `ExtraParamsV1` — includes additional parameters: * `car_link` — the link to the `.car` file obtained during the conversion. * `car_size` — the size of the `.car` file obtained during the conversion. * `skip_ipni_announcement` — indicates whether the deal should be announced to IPNI indexers or not. * `remove_unsealed_copy` — specifies whether the storage provider should remove an unsealed copy. Now that you have a good idea about the deal-making parameters, we can send the proposal to the network. The Filecoin kit includes a Hardhat task to facilitate these steps; in the terminal, send the following command: ```shell Shell yarn hardhat make-deal-proposal --contract 0x702E0755450aFb6A72DbE3cAD1fb47BaF3AC525C --piece-cid baga6ea4seaqcgnwcifpjmcxqxpmmvcayct2msrj2wzhyrm7we6jl4c5i3gq7eda --piece-size 2097152 --verified-deal false --label "baga6ea4seaqcgnwcifpjmcxqxpmmvcayct2msrj2wzhyrm7we6jl4c5i3gq7eda" --start-epoch 587734 --end-epoch 600000 --storage-price-per-epoch 1000000000000000000 --provider-collateral 2000000000000000000 --client-collateral 0 --extra-params-version 1 --location-ref "YOUR_CAR_FILE_LINK" --car-size 1282438 --skip-ipni-announce false --remove-unsealed-copy false ``` Make sure to edit this request, including the address of the contract you just deployed and the info gathered when converting the file to `.car`. This will return a message with your proposal ID: ```bash Bash Making deal proposal on network Calibration Complete! Event Emitted. ProposalId is: 0xfd6419d07e4c269e58d0c63969756c2124155b4a8d6dd08b8cd46e3a9acbf625 ``` To track the progress of your transaction, you can use the Filecoin explorer and review the details of the [transaction made for this tutorial](https://calibration.filscout.com/en/message/bafy2bzaceccg4fmbd4rna5ohthc7t2nui3jz7t54ahlafxxe3rd7tgkhj4npa). This will provide you with insights into how your transaction was executed.
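Before submitting a proposal of your own, you can sanity-check its economics. The sketch below is not part of the Filecoin kit; it simply applies the pricing model described above (`storage_price_per_epoch` is attoFIL per GiB per 30-second epoch) to the values from the example command:

```javascript
// Estimate the total cost of a storage deal from the proposal parameters.
// Pricing model from above: storage_price_per_epoch is attoFIL per GiB per epoch.
const GIB = 1024n ** 3n;

function estimateDealCostAtto({ startEpoch, endEpoch, pricePerEpochAtto, pieceSizeBytes }) {
  const epochs = BigInt(endEpoch) - BigInt(startEpoch);
  // Scale the per-GiB price by the actual piece size; BigInt math avoids overflow.
  return (pricePerEpochAtto * epochs * BigInt(pieceSizeBytes)) / GIB;
}

// Values from the example make-deal-proposal command above:
const costAtto = estimateDealCostAtto({
  startEpoch: 587734,
  endEpoch: 600000,
  pricePerEpochAtto: 10n ** 18n, // 1 FIL per GiB per epoch
  pieceSizeBytes: 2097152,       // 2 MiB piece
});
console.log(`~${Number(costAtto) / 1e18} FIL`); // prints the estimated total cost in FIL
```

For the example values this works out to roughly 24 FIL over the deal's lifetime; a free deal simply sets `pricePerEpochAtto` to `0n`.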
Once your transaction is confirmed, the [Boost storage providers](https://boost.filecoin.io/) will be able to proceed and pick up your deal. ## Conclusion In this tutorial, we walked through the deployment of a deal-making smart contract on the Filecoin Calibration Testnet, leveraging the Filecoin Ethereum Virtual Machine (FEVM). We delved into the concepts of the Filecoin Virtual Machine (FVM) and actors in the Filecoin network, equipping you with an understanding of the underlying architecture. ### About the author Developer Advocate @ Chainstack BUIDLs on EVM, The Graph protocol, and Starknet Helping people understand Web3 and blockchain development [](https://github.com/soos3d) [](https://twitter.com/web3Dav3) [](https://www.linkedin.com/in/davide-zambiasi/) # Fuse tooling Source: https://docs.chainstack.com/docs/fuse-tooling ### No Fuse support Chainstack deprecated support for Fuse nodes. This page is kept for legacy reference in case you find it useful. ## MetaMask On [node access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**. ## Truffle Configure [Truffle Suite](https://truffleframework.com) to deploy contracts to your Fuse nodes. Install [Truffle Suite](https://truffleframework.com), [HD Wallet-enabled Web3 provider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider), and create a project. Create a new environment in `truffle-config.js`, add your mnemonic phrase generated by [a wallet](https://ethereum.org/en/wallets/find-wallet/) and the Fuse endpoint instead of YOUR\_NODE\_ENDPOINT: ```javascript Javascript const HDWalletProvider = require("@truffle/hdwallet-provider"); const mnemonic = 'pattern enroll upgrade ...'; ... module.exports = { networks: { fuse: { provider: () => new HDWalletProvider(mnemonic, "YOUR_NODE_ENDPOINT"), network_id: "*" }, } }; ``` ## Hardhat Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your Fuse nodes.
Install [Hardhat](https://hardhat.org/) and create a project. Create a new environment in `hardhat.config.js`: ```javascript Javascript require("@nomiclabs/hardhat-waffle"); ... module.exports = { solidity: "0.7.3", networks: { fuse: { url: "YOUR_NODE_ENDPOINT", accounts: ["YOUR_PRIVATE_KEY"] }, } }; ``` where * YOUR\_NODE\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). * YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract Run `npx hardhat run scripts/deploy.js --network fuse` and Hardhat will deploy using your node. See also [Forking EVM-compatible mainnet with Hardhat](https://support.chainstack.com/hc/en-us/articles/900004242406). ## Remix IDE To make Remix IDE interact with the network through a Fuse node: Get [MetaMask](https://metamask.io/) and set it to interact through a Fuse node. See [Interacting through MetaMask](#metamask). In Remix IDE, navigate to the **Deploy** tab. Select **Injected Provider - MetaMask** in **Environment**. This will engage MetaMask and make Remix IDE interact with the network through a Fuse node. ## web3.js Build DApps using [web3.js](https://github.com/ethereum/web3.js/) and Fuse nodes. Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP or WebSocket. ### HTTP Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_NODE_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_NODE\_ENDPOINT is your node HTTPS endpoint protected either with the key or password.
### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.WebsocketProvider('YOUR_NODE_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_NODE\_ENDPOINT is your node WSS endpoint protected either with the key or password. ## web3.py Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Fuse nodes. Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `HTTPProvider` to connect to your node endpoint and get the latest block number. ```python Key Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('YOUR_NODE_ENDPOINT')) print(web3.eth.blockNumber) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.blockNumber) ``` where * YOUR\_NODE\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * HOSTNAME — your node HTTPS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). ### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number.
```python Key Protected from web3 import Web3 web3 = Web3(Web3.WebsocketProvider('YOUR_NODE_ENDPOINT')) print(web3.eth.blockNumber) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.WebsocketProvider('wss://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.blockNumber) ``` where * YOUR\_NODE\_ENDPOINT — your node WSS endpoint protected either with the key or password * HOSTNAME — your node WSS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [WebSocket connection to an EVM node](https://support.chainstack.com/hc/en-us/articles/900001918763-WebSocket-connection-to-an-Ethereum-node). ## web3.php Build DApps using [web3.php](https://github.com/web3p/web3.php) and Fuse nodes. Install [web3.php](https://github.com/web3p/web3.php). Connect over HTTP: ```php Php <?php require_once "vendor/autoload.php"; use Web3\Web3; $web3 = new Web3('YOUR_NODE_ENDPOINT'); ?> ``` where YOUR\_NODE\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. Use [JSON-RPC methods](https://eth.wiki/json-rpc/API) to interact with the node. Example to get the latest block number: ```php Php <?php require_once "vendor/autoload.php"; use Web3\Web3; $web3 = new Web3('YOUR_NODE_ENDPOINT'); $eth = $web3->eth; $eth->blockNumber(function ($err, $data) { print "$data \n"; }); ?> ``` ## web3j Build DApps using [web3j](https://github.com/web3j/web3j) and Fuse nodes. Use the `HttpService` object to connect to your node endpoint.
Example to get the latest block number: ```java Java package getLatestBlock; import java.io.IOException; import java.util.logging.Level; import java.util.logging.Logger; import org.web3j.protocol.Web3j; import org.web3j.protocol.core.DefaultBlockParameterName; import org.web3j.protocol.core.methods.response.EthBlock; import org.web3j.protocol.exceptions.ClientConnectionException; import org.web3j.protocol.http.HttpService; import okhttp3.Authenticator; import okhttp3.Credentials; import okhttp3.OkHttpClient; import okhttp3.Request; import okhttp3.Response; import okhttp3.Route; public final class App { private static final String USERNAME = "USERNAME"; private static final String PASSWORD = "PASSWORD"; private static final String ENDPOINT = "ENDPOINT"; public static void main(String[] args) { try { OkHttpClient.Builder clientBuilder = new OkHttpClient.Builder(); clientBuilder.authenticator(new Authenticator() { @Override public Request authenticate(Route route, Response response) throws IOException { String credential = Credentials.basic(USERNAME, PASSWORD); return response.request().newBuilder().header("Authorization", credential).build(); } }); HttpService service = new HttpService(ENDPOINT, clientBuilder.build(), false); Web3j web3 = Web3j.build(service); EthBlock.Block latestBlock = web3.ethGetBlockByNumber(DefaultBlockParameterName.LATEST, false).send().getBlock(); System.out.println("Latest Block: #" + latestBlock.getNumber()); } catch (IOException | ClientConnectionException ex) { Logger.getLogger(App.class.getName()).log(Level.SEVERE, null, ex); } } } ``` where * ENDPOINT — your node HTTPS endpoint * USERNAME — your node access username * PASSWORD — your node access password See also [the full code on GitHub](https://github.com/chainstack/web3j-getLatestBlock). ## ethers.js Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Fuse nodes. Install [ethers.js](https://www.npmjs.com/package/ethers). Connect over HTTP or WebSocket.
See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number: ```javascript Key Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_FUSE_ENDPOINT' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` ```javascript Password Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_FUSE_ENDPOINT', user: 'USERNAME', password: 'PASSWORD' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_FUSE\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) * NETWORK\_ID — Fuse network ID: * Mainnet: `122` * Testnet: `123` See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). ### WebSocket Use the `WebSocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const { ethers } = require("ethers"); const provider = new ethers.providers.WebSocketProvider('YOUR_NODE_ENDPOINT', NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_NODE\_ENDPOINT — your node WSS endpoint protected either with the key or password * NETWORK\_ID — Fuse network ID: * Mainnet: `122` * Testnet: `123` See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). ## Brownie Install [Brownie](https://eth-brownie.readthedocs.io/en/stable/install.html).
Use the `brownie networks add` command with the node endpoint: ```shell Shell brownie networks add Fuse ID name="NETWORK_NAME" host=YOUR_NODE_ENDPOINT chainid=NETWORK_ID ``` where * ID — any name that you will use as the network tag to run a deployment. For example, `fuse-mainnet`. * NETWORK\_NAME — any name that you want to identify the network by in the list of networks. For example, **Mainnet (Fuse)**. * YOUR\_NODE\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password * NETWORK\_ID — Fuse network ID: * Mainnet: `122` * Testnet: `123` Example to run the deployment script: ```shell Shell brownie run deploy.py --network fuse-mainnet ``` ## Foundry Install [Foundry](https://github.com/gakonst/foundry/). Use `--rpc-url` to run the operation through your Fuse node. ### Forge Use [forge](https://github.com/foundry-rs/foundry/tree/master/crates/forge) to develop, test, and deploy your smart contracts. To deploy a contract: ```shell Shell forge create CONTRACT_NAME --contracts CONTRACT_PATH --private-key YOUR_PRIVATE_KEY --rpc-url YOUR_NODE_ENDPOINT ``` where * CONTRACT\_NAME — name of the contract in the Solidity source code * CONTRACT\_PATH — path to your smart contract * YOUR\_PRIVATE\_KEY — the private key to your funded account that you will use to deploy the contract * YOUR\_NODE\_ENDPOINT — your node HTTPS endpoint protected either with the key or password ### Cast Use [cast](https://github.com/foundry-rs/foundry/tree/master/crates/cast) to interact with the network and the deployed contracts. To get the latest block number: ```shell Shell cast block-number --rpc-url YOUR_NODE_ENDPOINT ``` where YOUR\_NODE\_ENDPOINT is your node HTTPS endpoint protected either with the key or password # Fuse: Simple MultiSig Contract with Hardhat Source: https://docs.chainstack.com/docs/fuse-tutorial-simple-multisig-contract-with-hardhat ### Fuse support is deprecated Chainstack deprecated support for Fuse nodes.
This page is kept for legacy reference in case you find it useful. You can create a simple multiSig contract that requires confirmations from multiple owners to withdraw funds from it. In this tutorial, you will: * Create a simple multiSig contract. * Deploy the contract on the Fuse Spark testnet through a Fuse node. * Interact with the deployed contract. ## Prerequisites * A public Fuse endpoint. * [Hardhat](https://hardhat.org/) to compile and deploy the contract. * [Hardhat ABI Exporter](https://www.npmjs.com/package/hardhat-abi-exporter) to export the ABI to interact with the contract. * [MetaMask](https://metamask.io/) to interact with the contract through your Chainstack node. ### Get your Fuse node endpoint For example, get a community public endpoint. ### Install Hardhat See [Hardhat documentation](https://hardhat.org/hardhat-runner/docs/getting-started). ### Initialize a Hardhat project In your project directory, run `npx hardhat`. Select **Create a JavaScript project**. ### Create and compile the multiSig contract In the `contracts` directory, create `multiSig.sol`.
```solidity solidity // SPDX-License-Identifier: MIT pragma solidity ^0.8.9; contract multiSigWallet { event Deposit(address indexed sender, uint amount, uint balance); event SubmitTransaction(address indexed owner, uint indexed txIndex, address indexed to, uint value); event ConfirmTransaction(address indexed owner, uint indexed txIndex); event RevokeConfirmation(address indexed owner, uint indexed txIndex); event ExecuteTransaction(address indexed owner, uint indexed txIndex); address[] public owners; mapping(address => bool) public isOwner; uint public numConfirmationsRequired; struct Transaction { address to; uint value; bool executed; uint numConfirmations; } mapping(uint => mapping(address => bool)) public isConfirmed; Transaction[] public transactions; modifier onlyOwner() { require(isOwner[msg.sender], "not owner"); _; } modifier txExists(uint _txIndex) { require(_txIndex < transactions.length, "tx does not exist"); _; } modifier notExecuted(uint _txIndex) { require(!transactions[_txIndex].executed, "tx already executed"); _; } modifier notConfirmed(uint _txIndex) { require(!isConfirmed[_txIndex][msg.sender], "tx already confirmed"); _; } constructor(address[] memory _owners, uint _numConfirmationsRequired) { require(_owners.length > 0, "At least 1 owner required"); require(_numConfirmationsRequired > 0 && _numConfirmationsRequired <= _owners.length, "invalid number of required confirmations"); for (uint i = 0; i < _owners.length; i++) { address owner = _owners[i]; require(owner != address(0), "invalid owner"); require(!isOwner[owner], "owner not unique"); isOwner[owner] = true; owners.push(owner); } numConfirmationsRequired = _numConfirmationsRequired; } receive() external payable { emit Deposit(msg.sender, msg.value, address(this).balance); } function submitTransaction( address _to, uint _value ) public onlyOwner { uint txIndex = transactions.length; transactions.push( Transaction({ to: _to, value: _value, executed: false, numConfirmations: 0 }) ); emit 
SubmitTransaction(msg.sender, txIndex, _to, _value); } function confirmTransaction(uint _txIndex) public onlyOwner txExists(_txIndex) notExecuted(_txIndex) notConfirmed(_txIndex) { Transaction storage transaction = transactions[_txIndex]; transaction.numConfirmations += 1; isConfirmed[_txIndex][msg.sender] = true; emit ConfirmTransaction(msg.sender, _txIndex); } function executeTransaction(uint _txIndex) public onlyOwner txExists(_txIndex) notExecuted(_txIndex) { Transaction storage transaction = transactions[_txIndex]; address tempaddr = transaction.to; require( transaction.numConfirmations >= numConfirmationsRequired, "cannot execute tx" ); transaction.executed = true; (bool success, ) = tempaddr.call{value: transaction.value}(""); require(success, "tx failed"); emit ExecuteTransaction(msg.sender, _txIndex); } function revokeConfirmation(uint _txIndex) public onlyOwner txExists(_txIndex) notExecuted(_txIndex) { Transaction storage transaction = transactions[_txIndex]; require(isConfirmed[_txIndex][msg.sender], "tx not confirmed"); transaction.numConfirmations -= 1; isConfirmed[_txIndex][msg.sender] = false; emit RevokeConfirmation(msg.sender, _txIndex); } function getOwners() public view returns (address[] memory) { return owners; } function getTransactionCount() public view returns (uint) { return transactions.length; } function getTransaction(uint _txIndex) public view returns ( address to, uint value, bool executed, uint numConfirmations ) { Transaction storage transaction = transactions[_txIndex]; return ( transaction.to, transaction.value, transaction.executed, transaction.numConfirmations ); } } ``` This is your multiSig contract: * The addresses that the deployer passes as parameters are set as co-owners of the multiSig Wallet. Along with that, the deployer passes an integer as the parameter. This integer will be the minimum number of confirmations a transaction will require to be approved. 
This integer must be greater than zero and must be less than or equal to the number of co-owners of the contract. * `receive` is a special function provided by Solidity that allows the contract to accept protocol native tokens without a specially defined function. It is declared without the keyword `function` and must be external and payable. * After declaring a struct to store the transaction data, there are two mappings to store the validity of the contract owners and to store all confirmations to a particular transaction. * The functions defined mostly restrict access to the owners, even though anyone can read the data. The function `executeTransaction` uses the low-level method `call()` to transfer the protocol native tokens from the contract to the required address if the transaction is approved. To compile the contract, run `npx hardhat compile`. ### Fund your account Fund the account that you will use to deploy the contract with SPARK—the native token of the Fuse Spark testnet. Use the [Fuse Spark testnet faucet](https://get.fusespark.io/). ### Set up Hardhat to work through your Fuse node In your project directory, open for editing `hardhat.config.js`. ```js JavaScript require("@nomicfoundation/hardhat-toolbox"); require('hardhat-abi-exporter'); module.exports = { solidity: "0.8.10", networks: { spark: { url: "YOUR_NODE_ENDPOINT", accounts: ['YOUR_PRIVATE_KEY'] }, }, abiExporter: [ { path: './abi/pretty', pretty: true, }, { path: './abi/ugly', pretty: false, }, ] }; ``` where * YOUR\_NODE\_ENDPOINT — your Fuse node HTTPS endpoint. See also [Fuse tooling](/docs/fuse-tooling). * YOUR\_PRIVATE\_KEY — the private key of your Fuse account that will deploy the contract. The account must have enough funds to run the deployment. See also [Fuse Spark testnet faucet](https://get.fusespark.io/). ### Deploy the multiSig contract Set up the deployment script at `scripts/deploy.js`. 
```javascript Javascript const main = async () => { const [deployer] = await hre.ethers.getSigners(); console.log("Deploying contracts with account: ", deployer.address); const Contract = await hre.ethers.getContractFactory("multiSigWallet"); const contract = await Contract.deploy(["OWNER", "OWNER"], CONFIRMATION_NUMBER); await contract.deployed(); console.log("Contract address: ", contract.address); }; const runMain = async () => { try { await main(); process.exit(0); } catch (error) { console.error(error); process.exit(1); } }; runMain(); ``` where * OWNER — the addresses that co-own the contract and whose confirmations are required to withdraw the funds. * CONFIRMATION\_NUMBER — the number of confirmations required by the contract owners to withdraw the funds. Example of a contract owned by `0x7B397Bd7042560cdaE08C674Ef554e5C3239bC10` and `0xFDa85C3404dC00fFBe2A18615ba55380cB42c8Fb` and that requires a confirmation from both owners (`2`): ```js JavaScript const main = async () => { const [deployer] = await hre.ethers.getSigners(); console.log("Deploying contracts with account: ", deployer.address); const Contract = await hre.ethers.getContractFactory("multiSigWallet"); const contract = await Contract.deploy(["0x7B397Bd7042560cdaE08C674Ef554e5C3239bC10", "0xFDa85C3404dC00fFBe2A18615ba55380cB42c8Fb"], 2); await contract.deployed(); console.log("Contract address: ", contract.address); }; const runMain = async () => { try { await main(); process.exit(0); } catch (error) { console.error(error); process.exit(1); } }; runMain(); ``` Deploy the contract by running: ```bash Shell npx hardhat run --network spark scripts/deploy.js ``` ### Interact with the contract Once your contract is deployed, you can view it online at [Fuse Spark testnet explorer](https://explorer.fusespark.io/). You are now going to verify the contract in the explorer to be able to use the explorer as a web app and easily interact with the contract online.
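Verification on the explorer asks for ABI-encoded constructor arguments. The walkthrough uses an online encoder, but it helps to see what that encoding actually is. Here is a hand-rolled, dependency-free sketch for this contract's specific constructor signature `(address[], uint256)`; it is for illustration only, as a real project would use the ABI coder bundled with ethers.js or web3.js:

```javascript
// ABI-encode (address[] _owners, uint256 _numConfirmationsRequired) by hand.
// Every value is left-padded to a 32-byte slot (64 hex characters).
const pad = (hex) => hex.replace(/^0x/, "").toLowerCase().padStart(64, "0");

function encodeConstructorArgs(owners, numConfirmations) {
  const head = [
    pad("40"),                          // offset of the dynamic address[] data: 2 head slots * 32 bytes
    pad(numConfirmations.toString(16)), // uint256 _numConfirmationsRequired
  ];
  const tail = [
    pad(owners.length.toString(16)),    // array length
    ...owners.map(pad),                 // each address, left-padded to a full slot
  ];
  return head.concat(tail).join("");
}

// The two example owners from the deployment script, requiring 2 confirmations:
const encoded = encodeConstructorArgs(
  ["0x7B397Bd7042560cdaE08C674Ef554e5C3239bC10", "0xFDa85C3404dC00fFBe2A18615ba55380cB42c8Fb"],
  2
);
console.log(encoded); // 5 slots: offset, confirmations, array length, owner 1, owner 2
```

You can compare this output against what the online encoder produces for the same constructor values.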
### Verify the deployed contract on the explorer Go to [Fuse Spark testnet explorer](https://explorer.fusespark.io/). Find your deployed contract. The address of your contract is printed in the terminal by Hardhat at the end of the deployment. On the contract page in the explorer, click **Code** > **Verify & Publish**. In **Contract Name**, provide the name of your contract. In our example, the name is `multiSigWallet`. In **Compiler**, select the same compiler version that was used in the Hardhat configuration file. In our example, it is `v0.8.10`. In **EVM Version**, select **default**. In **Optimization**, select **No**. In **Enter the Solidity Contract Code**, paste the contract code. In ABI-encoded Constructor Arguments, provide constructor values: * Copy the ugly ABI version from your project directory. For this example, it is `abi/ugly/contracts/multiSig.sol/multiSigWallet.json`. * Go to [Online ABI Encoding Service](https://abi.hashex.org/). * Enter your ABI and click **Parse**. Remove the first and last `{}` and `"abi":`. * Enter the constructor parameters that you provided in your deployment script in `deploy.js`. * Copy the resulting value and put it in the **ABI-encoded Constructor Arguments** field in the explorer. Click **Verify & publish**. The explorer will take a few seconds to compile your contract, verify, and publish it. ### Interact with the contract Set up your MetaMask instance to work through your Fuse node. See [Fuse tooling: MetaMask](/docs/fuse-tooling#metamask). Using MetaMask, send some funds to the contract. Now that your multiSig contract is verified, you can use the explorer to interact with it. 1. In the explorer, on your contract, click **Write Contract**. 2. In your MetaMask, make sure the selected address is one of the contract owners. 3. Click **Connect wallet**. This will connect your MetaMask instance with the contract owner as the active address. 4.
In **submitTransaction**, provide an address to send some funds to and the amount of funds in Wei. You can also use the [online unit converter](https://eth-converter.com/). 5. Click **Write**. 6. Once the transaction is included in a block, confirm it by providing the transaction index in **confirmTransaction** and clicking **Write**. Since this is the first transaction on the contract, the index is `0`. 7. Connect to the contract on the explorer with the other account that you provided as owner when deploying the contract. 8. Again, confirm the transaction through **confirmTransaction** and the index `0`. 9. Once the confirmation transaction is included in a block, execute it through **executeTransaction** and the index `0`. This will withdraw the funds from the multiSig contract. ## Conclusion This tutorial guided you through the basics of creating and deploying a simple multiSig contract on the Fuse Spark testnet through your Fuse node. You have also interacted with the contract, funded it, and withdrawn the funds with multiSig confirmations using the explorer as a web app and MetaMask as your interaction tool that works through your Fuse node. This tutorial uses testnet, but the exact same instructions work on the mainnet. ### About the author Director of Developer Experience @ Chainstack Talk to me all things Web3 20 years in technology | 8+ years in Web3 full time Trusted advisor helping developers navigate the complexities of blockchain infrastructure [](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake) # Geth vs Erigon: Deep dive into RPC methods on Ethereum clients Source: https://docs.chainstack.com/docs/geth-vs-erigon-deep-dive-into-rpc-methods-on-ethereum-clients **TLDR** * Geth and Erigon share most Ethereum JSON-RPC API calls but differ in some advanced functionality.
* Geth provides a few extra methods for mining and wallet-bound calls (e.g., eth\_sendTransaction), while Erigon supports others like eth\_getBlockReceipts and unique debugging/trace methods. * Identify which client your provider runs to ensure compatibility and avoid missing methods. * Explore the Chainstack API reference for more detail and use web3\_clientVersion to check your client programmatically. ## Main article The JSON-RPC API is an essential tool for blockchain development as it provides a standardized way to interact with blockchain networks through HTTP or WebSocket connections. At the time of writing, there are seven clients for Ethereum, which is a great achievement in terms of decentralization. However, this can cause a problem for developers because different clients often implement different RPC API sets, leading to confusion. For example, if your DApp was developed on Erigon and you plan to switch to a new RPC provider running Geth, it's possible that the core API used in your application won't be available on the new node. To avoid compatibility issues, it's important to identify the current client you're using and the available API methods. This article compares the available RPC APIs for two Ethereum clients, Erigon and Geth, that are available on Chainstack. It is important to note that all tests were performed using **Erigon version 2.42** and **Geth version 1.11.5**, which were the most up-to-date versions as of May 2023. It is possible that there may be changes in the future. ### Find what client your node is running With Chainstack, you can easily view this information in the console. However, if you prefer a programmatic approach, consider using the `web3_clientVersion` method. This RPC method returns the client version as a response. Discover how to use `web3_clientVersion` and explore code examples in the [Ethereum node API reference](/reference/ethereum_node_api/client_info/web3_clientVersion). 
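A minimal sketch of that programmatic check (the endpoint is a placeholder; this assumes Node.js 18+ for the built-in `fetch`):

```javascript
// Query web3_clientVersion over JSON-RPC to identify the client behind an endpoint.
const payload = { jsonrpc: "2.0", method: "web3_clientVersion", params: [], id: 1 };

async function getClientVersion(endpoint) {
  const res = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  // The result is the client version string, e.g. starting with "Geth/" or "erigon/".
  return (await res.json()).result;
}

// Simple client detection on the returned version string.
const isErigon = (version) => /erigon/i.test(version);
const isGeth = (version) => /^geth/i.test(version);
```

Run `getClientVersion("https://your-node-endpoint")` against your own endpoint and feed the result into the detection helpers.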
## Standard Ethereum JSON RPC methods available The Ethereum JSON-RPC is a standard collection of methods that all execution clients must implement. The detailed specification can be found in the [Ethereum documentation](https://ethereum.github.io/execution-apis/api-documentation/). Most of the standard methods are implemented in both Geth and Erigon. However, the Erigon team has either deprecated or not implemented a few of them. ### Methods available on both Erigon and Geth | | | | | | ------------------------- | -------------------------------------- | ---------------------------------------- | ------------------------------- | | eth\_blockNumber | eth\_syncing | eth\_gasPrice | eth\_maxPriorityFeePerGas | | eth\_feeHistory | eth\_getBlockTransactionCountByHash | eth\_getUncleCountByBlockHash | eth\_getUncleCountByBlockNumber | | eth\_getTransactionByHash | eth\_getTransactionByBlockHashAndIndex | eth\_getTransactionByBlockNumberAndIndex | eth\_getTransactionReceipt | | eth\_estimateGas | eth\_getBalance | eth\_getCode | eth\_getTransactionCount | | eth\_getStorageAt | eth\_call | eth\_createAccessList | eth\_newFilter | | eth\_newBlockFilter | eth\_newPendingTransactionFilter | eth\_getFilterLogs | eth\_getFilterChanges | | eth\_uninstallFilter | eth\_getLogs | eth\_sendRawTransaction | | Check the Chainstack [Ethereum node API reference](/reference/ethereum-getting-started) to learn how these methods work. 
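As a quick illustration of how these standard methods are called, the sketch below builds a JSON-RPC batch request in Python and decodes a hex-quantity result. The zero address and the sample result value are placeholders; a real call would POST the batch to your node endpoint:

```python
import json

# A batch of standard calls that both Geth and Erigon implement.
batch = [
    {"jsonrpc": "2.0", "method": "eth_blockNumber", "params": [], "id": 1},
    {"jsonrpc": "2.0", "method": "eth_gasPrice", "params": [], "id": 2},
    {"jsonrpc": "2.0", "method": "eth_getBalance",
     "params": ["0x0000000000000000000000000000000000000000", "latest"], "id": 3},
]
print(json.dumps(batch))

# Results come back as hex quantities; decode them with int(..., 16).
sample_block_number = "0x10d4f"  # placeholder eth_blockNumber result
print(int(sample_block_number, 16))  # 68943
```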
### Standard methods available on Geth only | | | | | | -------------------- | ------------------- | ------------------------ | --------------------- | | eth\_accounts | eth\_sendTransaction | eth\_sign | eth\_signTransaction | | eth\_mining | eth\_hashrate | eth\_getWork | eth\_submitWork | | eth\_submitHashrate | eth\_getProof | debug\_getRawBlock | debug\_getRawTransaction | | debug\_getRawReceipts | debug\_getRawHeader | debug\_getBadBlocks | | The following methods require the client to be associated with a wallet address. They are either deprecated or not implemented by Erigon. * eth\_accounts * eth\_sendTransaction * eth\_sign * eth\_signTransaction The following methods are only available in miner mode: * eth\_mining * eth\_hashrate * eth\_getWork * eth\_submitWork * eth\_submitHashrate ## Non-standard methods available on both Geth and Erigon In addition to the standard methods, Geth and Erigon implement their own RPC methods. Erigon was originally a fork of Geth, so both clients share the majority of the RPC APIs.
| | | | | ----------------------------------------- | ------------------------------------------- | -------------------------------- | | eth\_subscribe | eth\_unsubscribe | | | web3\_clientVersion | web3\_sha3 | | | txpool\_content | txpool\_status | | | net\_listening | net\_peerCount | net\_version | | eth\_getUncleByBlockHashAndIndex | eth\_getUncleByBlockNumberAndIndex | eth\_getRawTransactionByHash | | eth\_getRawTransactionByBlockHashAndIndex | eth\_getRawTransactionByBlockNumberAndIndex | eth\_createAccessList | | debug\_accountRange | debug\_getModifiedAccountsByNumber | debug\_getModifiedAccountsByHash | | debug\_traceBlockByNumber | debug\_traceBlockByHash | debug\_traceTransaction | | debug\_storageRangeAt | debug\_traceCall | | The `eth_subscribe` and `eth_unsubscribe` RPC methods, available exclusively through WebSocket Secure (WSS), allow subscribing to real-time data streams for events like new blocks and pending transactions. The `web3_clientVersion` function delivers client information, while `web3_sha3` calculates the Keccak hash for a given string. Utilize `txpool_content` and `txpool_status` RPC methods for accessing transactions in the mempool. RPC methods within the `net` namespace facilitate the monitoring of a node's P2P status. Leveraging non-standard methods in the `eth` namespace is advantageous for obtaining extra information that is not accessible via standard RPC methods but is highly valuable for developers. The `debug` methods are designed for advanced users and fulfill a range of purposes, including gathering execution traces for single or multiple transactions. 
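For instance, a `debug_traceTransaction` request, which both clients accept, can be sketched as below. The transaction hash is a placeholder, and `callTracer` is one of the built-in tracers (omitting the tracer option returns the full opcode-level struct log instead):

```python
import json

tx_hash = "0x" + "ab" * 32  # placeholder 32-byte transaction hash
payload = {
    "jsonrpc": "2.0",
    "method": "debug_traceTransaction",
    # With "callTracer", the response is the nested call tree of the
    # transaction rather than an opcode-by-opcode trace.
    "params": [tx_hash, {"tracer": "callTracer"}],
    "id": 1,
}
print(json.dumps(payload))
```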
### Non-standard methods available on Geth only | | | | | | ---------------------------------- | --------------------------- | --------------------- | ------------------------------- | | txpool\_contentFrom | txpool\_inspect | | | | debug\_backtraceAt | debug\_blockProfile | debug\_chaindbCompact | debug\_chaindbProperty | | debug\_cpuProfile | debug\_dbAncient | debug\_dbAncients | debug\_dbGet | | debug\_dumpBlock | debug\_freeOSMemory | debug\_freezeClient | debug\_gcStats | | debug\_getAccessibleState | debug\_getBadBlocks | debug\_getRawBlock | debug\_getRawHeader | | debug\_getRawTransaction | debug\_getRawReceipts | debug\_goTrace | debug\_intermediateRoots | | debug\_memStats | debug\_mutexProfile | debug\_preimage | debug\_printBlock | | debug\_seedHash | debug\_setBlockProfileRate | debug\_setGCPercent | debug\_setHead | | debug\_setMutexProfileFraction | debug\_setTrieFlushInterval | debug\_stacks | debug\_standardTraceBlockToFile | | debug\_standardTraceBadBlockToFile | debug\_startCPUProfile | debug\_startGoTrace | debug\_stopCPUProfile | | debug\_stopGoTrace | debug\_traceBadBlock | debug\_traceBlock | debug\_traceBlockFromFile | | debug\_traceChain | debug\_verbosity | debug\_vmodule | debug\_writeBlockProfile | | debug\_writeMemProfile | debug\_writeMutexProfile | | | Geth provides two additional methods in the `txpool` namespace: `txpool_contentFrom` and `txpool_inspect`. `txpool_contentFrom` retrieves the transactions in the txpool originating from a given address, and `txpool_inspect` lists a textual summary of all transactions. In addition, Geth exposes many execution-time node-tuning methods through the `debug` namespace. Some of these methods are destructive to the node itself, so they should be used with caution. For example: * `debug_freezeClient` forces a temporary client freeze. * `debug_setHead` sets the current head of the local chain by block number. * `debug_setTrieFlushInterval` configures how often in-memory state tries are persisted to disk.
If this value is set to 0, the node will essentially turn into an archive node. Geth also provides handy trace methods in the `debug` namespace. If you are interested in how they work, the guide [Deep Dive into Ethereum debug\_trace APIs](https://chainstack.com/deep-dive-into-ethereum-trace-apis/) may be useful for you. ### Non-standard methods available on Erigon only | | | | | | ------------------------ | ----------------------------------- | ------------------------- | ------------------------------ | | eth\_getBlockReceipts | eth\_protocolVersion | eth\_callMany | eth\_callBundle | | debug\_accountAt | debug\_traceCallMany | | | | trace\_call | trace\_callMany | trace\_rawTransaction | trace\_replayBlockTransactions | | trace\_replayTransaction | trace\_block | trace\_filter | trace\_get | | trace\_transaction | | | | | erigon\_getHeaderByHash | erigon\_getBlockReceiptsByBlockHash | erigon\_getHeaderByNumber | erigon\_getLogsByHash | | erigon\_forks | erigon\_getBlockByTimestamp | erigon\_BlockNumber | erigon\_getLatestLogs | Erigon inherits some debug and trace methods from Nethermind and Flashbots, which gives it more possibilities than a Geth node. For example, `debug_traceCallMany` and `trace_callMany` are handy methods that don't exist on Geth. These two methods perform the same function, allowing users to simulate multiple transactions in a batch and observe their execution. The transactions are executed in sequence, with each transaction depending on the resulting state of the previous ones. ## Last but not least If you would like to learn more about how these methods work in detail, you can visit Chainstack’s [API documentation](/reference/ethereum-getting-started). Additionally, there is a [Chainstack Postman collection](/reference/ethereum-rpc-methods-postman-collection) for you to try out these RPC methods. We hope this article is helpful for you.
If you have any questions, feel free to ping me on my social media or in Chainstack's [Telegram](https://t.me/chainstack) or [Discord](https://discord.gg/Cymtg2f7pX). Happy coding, cheers! ### See also ### About the author Developer Advocate @ Chainstack BUIDLs on Ethereum, zkEVMs, The Graph protocol, and IPFS [](https://twitter.com/wuzhongzhu) [](https://www.linkedin.com/in/wuzhong-zhu-44563589/) [](https://github.com/wuzhong-zhu) # Getting started with Foundry Source: https://docs.chainstack.com/docs/getting-started-with-foundry * Foundry is a complete toolkit that simplifies Ethereum smart contract development with modular components like Forge (build/test), Anvil (local node), Cast (CLI interactions), and Chisel (Solidity REPL). * You can compile and deploy contracts using Solidity-based scripts, specify private keys via CLI flags, and switch between local or custom RPC endpoints. * Cast offers flexible commands for sending transactions, reading contract data, or converting values, making it a powerful tool for rapid iteration and debugging. * Built-in testing (Forge) supports local, forked, and coverage tests, making it straightforward to manage dependencies, run end-to-end scenarios, and ensure code reliability. ## Main article If you're diving into smart contract development, you'll find Foundry to be an incredibly handy companion. It's more than just a tool; it's a full-fledged framework that makes smart contracts' development, testing, and deployment smoother and more efficient. Whether you're new to blockchain or an experienced developer, Foundry is designed to be a reliable and comprehensive toolkit that fits right into your workflow. Foundry is comprised of several key components, each serving a distinct purpose in the smart contract development lifecycle: 1. **Forge**: Forge is the heart of the Foundry framework, acting as a powerful compilation and testing tool. 
It's instrumental in compiling smart contracts, running a suite of tests (including fuzz and property-based tests), and ensuring the contracts are robust and secure before deployment. 2. **Anvil**: Anvil is a local node tailored for development. This component is invaluable for developers who need a quick and easy way to test their contracts in a local blockchain environment. Anvil enables rapid iteration and debugging without connecting to the main networks or testnets. 3. **Cast**: Cast is a versatile tool within Foundry designed to interact with Ethereum. It facilitates a range of actions, from sending transactions and querying blockchain data to manipulating local Ethereum states. This utility makes interacting with deployed contracts easier and performs various blockchain-related tasks. 4. **Chisel**: Chisel enriches the Foundry suite as a Solidity REPL (Read-Eval-Print Loop), enabling developers to interactively test and experiment with Solidity code snippets. It's ideal for immediate feedback and debugging in real-time, and it's compatible both within and outside Foundry projects. This guide covers installing Foundry, setting up, compiling, deploying, and interacting with smart contracts. ## Initializing a Project ### Creating a New Project * In an empty directory, initialize a new Foundry project: ```bash Shell forge init ``` * To create a new directory with the project: ```bash Shell forge init PROJECT_NAME ``` * **Note:** The `src` directory is where the smart contracts are placed. ## Compiling Contracts * To compile the contracts, run either: ```bash Shell forge build ``` or ```bash Shell forge compile ``` * The `out` directory will generate a JSON file containing compilation data, such as ABI. ## Setting Up a Local Blockchain * Use Anvil to start a local blockchain for testing: ```bash Shell anvil ``` * The local blockchain will run at `127.0.0.1:8545`, and you can add it to MetaMask for ease of testing. 
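A quick way to confirm the local node is up is to request its chain ID. The sketch below builds the `eth_chainId` payload and decodes Anvil's default chain ID (31337, returned as `"0x7a69"`); the actual POST to `http://127.0.0.1:8545` is left out so the snippet runs without a node:

```python
import json

payload = {"jsonrpc": "2.0", "method": "eth_chainId", "params": [], "id": 1}
print(json.dumps(payload))

# Anvil returns the chain ID as a hex quantity; its default is 31337.
print(int("0x7a69", 16))  # 31337
```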
## Deploying Contracts ### Deploying Locally or to a Custom RPC * To deploy smart contracts, use `forge create`. Forge defaults to the Anvil local blockchain, but other RPCs can be specified using the `--rpc-url` flag. Get a free RPC from [Chainstack](https://chainstack.com/). * Deploying locally with Anvil running: ```bash Shell forge create CONTRACT_NAME ``` * Deploying to a custom endpoint: ```bash Shell forge create CONTRACT_NAME --rpc-url YOUR_ENDPOINT ``` * This will likely not work because it needs a private key to deploy. ```bash Shell Error: Error accessing local wallet. Did you set a private key, mnemonic or keystore? Run `cast send --help` or `forge create --help` and use the corresponding CLI flag to set your key via: --private-key, --mnemonic-path, --aws, --interactive, --trezor or --ledger. Alternatively, if you're using a local node with unlocked accounts, use the --unlocked flag and either set the `ETH_FROM` environment variable to the address of the unlocked account you want to use, or provide the --from flag with the address directly. ``` ### Options for Specifying Private Key * Run the create command with the `--interactive` flag for a prompt to add a private key: ```bash Shell forge create CONTRACT_NAME --interactive ``` * Or, directly include the private key in the command: ```bash Shell forge create CONTRACT_NAME --private-key YOUR_PRIVATE_KEY ``` ## Writing Deploy Scripts * Scripts in Foundry are written in Solidity. We'll use a Solidity script to deploy a contract. By convention, script files end with `.s.sol`. * **Example:** Deploying `SimpleStorage.sol`. 
### Creating the Deploy Script * Create a file named `deploySimpleStorage.s.sol` with the following content: ```sol sol // SPDX-License-Identifier: MIT pragma solidity ^0.8.19; import {Script} from "forge-std/Script.sol"; import {SimpleStorage} from "../src/SimpleStorage.sol"; contract DeploySimpleStorage is Script { function run() external returns (SimpleStorage) { vm.startBroadcast(); SimpleStorage simpleStorage = new SimpleStorage(); vm.stopBroadcast(); return simpleStorage; } } ``` * Deploy the contract using the script: ```bash Shell forge script script/DeploySimpleStorage.s.sol --rpc-url YOUR_RPC --broadcast --private-key YOUR_PRIVATE_KEY ``` Let's break down its key components and functionalities: 1. **Pragma Directive**: * `pragma solidity ^0.8.19;`: Specifies that the script is compatible with Solidity version 0.8.19 or any newer version of the 0.8 series but not version 0.9 or above. 2. **Imports**: * `import {Script} from "forge-std/Script.sol";`: Imports the `Script` class from the `forge-std` library, which is a part of Foundry, a development environment for Ethereum smart contracts. * `import {SimpleStorage} from "../src/SimpleStorage.sol";`: Imports the `SimpleStorage` contract, presumably a custom contract located in the `src` directory. 3. **Contract Declaration**: * `contract DeploySimpleStorage is Script`: Defines a new contract named `DeploySimpleStorage` that inherits from the `Script` class. This setup is typical for deployment scripts in Foundry. 4. **Function Definition**: * `function run() external returns (SimpleStorage)`: The `run` function is the main entry point for the deployment script. It's marked `external` as it's intended to be called externally, and it returns an instance of `SimpleStorage`. 5. **Deployment Process**: * `vm.startBroadcast();`: Initiates a transaction broadcast. The `vm` object is a special component in Foundry, providing various functionalities related to the Ethereum Virtual Machine (EVM). 
* `SimpleStorage simpleStorage = new SimpleStorage();`: Instantiates the `SimpleStorage` contract. * `vm.stopBroadcast();`: Ends the transaction broadcast. 6. **Return Statement**: * `return simpleStorage;`: Returns the deployed instance of `SimpleStorage`. This script is a typical example of a deployment script used in the Foundry environment for deploying Ethereum smart contracts. It's concise and follows the pattern of starting a broadcast, deploying the contract, and stopping it. The `SimpleStorage` contract, which is not detailed here, would contain the actual business logic or data storage mechanisms. ## Interacting with Contracts Using Cast ### Sending Transactions * To send transactions, use `cast send`: ```bash Shell cast send ADDRESS FUNCTION_SIG PARAMS ``` * **Example:** ```bash Shell cast send 0x5FbDB2315678afecb367f032d93F642f64180aa3 "store(uint256)" 3333 --rpc-url $RPC_URL --private-key $PRIVATE_KEY ``` ### Reading from Contracts * Use `cast call` for reading view functions: ```bash Shell cast call ADDRESS FUNCTION_SIGNATURE ``` * **Example:** ```bash Shell cast call 0x5FbDB2315678afecb367f032d93F642f64180aa3 "retrieve()" ``` You can use cast for conversions, for example, hex to dec: ```bash Shell cast --to-base 0x0000000000000000000000000000000000000000000000000000000000000d05 dec ``` Or use the [Chainstack EVM Swiss Knife](https://web3tools.chainstacklabs.com/hexadecimal-decimal). ## Managing Dependencies ### Installing Smart Contract Dependencies * Use the following command to install dependencies from a repository: ```bash Shell forge install smartcontractkit/chainlink-brownie-contracts --no-commit ``` * Dependencies are added to the `lib` directory. 
### Remapping Dependencies * Add remappings in the `foundry.toml` file for syntax convenience: ```toml TOML remappings = ['@chainlink/contracts/=lib/chainlink-brownie-contracts/contracts/'] ``` ## Writing and Running Tests ### Example Test Contract * Tests in Foundry are also written in Solidity. Here's an example: ```sol sol // SPDX-License-Identifier: MIT pragma solidity ^0.8.18; import {Test, console} from "forge-std/Test.sol"; contract FundMeTest is Test { uint256 number = 33; function setUp() external { number = 3333; } function testDemo() public { console.log("The saved number is", number); assertEq(number, 3333); } } ``` * Run tests with: ```bash Shell forge test -vv ``` * The `-vv` flag outputs detailed logs for better insight. Let's break down its components: 1. **License and Solidity Version Declaration**: * `// SPDX-License-Identifier: MIT`: This is a comment specifying the license under which this file is released, in this case, the MIT License. * `pragma solidity ^0.8.18;`: This line specifies the compiler version. The file is compatible with Solidity version 0.8.18 and above within the 0.8.x range. 2. **Imports**: * `import {Test, console} from "forge-std/Test.sol";`: This line imports two elements from the Forge standard library (`forge-std`): * `Test`: A base contract that provides testing functionalities. * `console`: A utility to log output to the console. This is particularly useful for debugging and tracking variable values during test execution. 3. **Test Contract Declaration**: * `contract FundMeTest is Test {`: This line declares a new contract `FundMeTest` which inherits from the `Test` contract. In the context of Forge, this means `FundMeTest` is a test suite. 4. **State Variable**: * `uint256 number = 33;`: A state variable `number` of type `uint256` (unsigned integer of 256 bits) is declared and initialized to 33.
This variable is used to demonstrate state manipulation and assertion in the test. 5. **Setup Function**: * `function setUp() external { number = 3333; }`: The `setUp()` function is a special function in the Forge framework that runs before each test function. It's used for initializing or resetting the state. Here, it sets the `number` variable to 3333. 6. **Test Function**: * `function testDemo() public { ... }`: This is the actual test function. In Forge, any function with a name starting with `test` is considered a test case. * `console.log("The saved number is", number);`: This line logs the value of `number` to the console, which is useful for debugging or verifying the test state. * `assertEq(number, 3333);`: This is an assertion statement provided by the `Test` contract. It checks whether the value of `number` is equal to 3333. If the assertion fails (i.e., if `number` is not 3333), the test will fail. ## Testing on a Fork * Run tests on a forked network by adding an RPC URL: ```bash Shell forge test -vvv --fork-url $SEPOLIA_RPC ``` ## Coverage Analysis * Use `forge coverage` to analyze how much of your contracts are tested: ```bash Shell forge coverage --fork-url $SEPOLIA_RPC ``` It will display a nice table: ```bash Shell [⠢] Compiling... [⠢] Compiling 26 files with 0.8.20 [⠆] Solc 0.8.20 finished in 4.07s Compiler run successful! Analysing contracts... Running tests... 
| File | % Lines | % Statements | % Branches | % Funcs | |---------------------------|---------------|---------------|---------------|--------------| | script/DeployFundme.s.sol | 0.00% (0/3) | 0.00% (0/3) | 100.00% (0/0) | 0.00% (0/1) | | src/FundMe.sol | 16.67% (2/12) | 23.53% (4/17) | 0.00% (0/4) | 25.00% (1/4) | | src/PriceConverter.sol | 0.00% (0/6) | 0.00% (0/11) | 100.00% (0/0) | 0.00% (0/2) | | Total | 9.52% (2/21) | 12.90% (4/31) | 0.00% (0/4) | 14.29% (1/7) | ``` # TON: Deploy a smart contract Source: https://docs.chainstack.com/docs/getting-started-with-ton-deploy-a-smart-contract **TLDR** * You’ll scaffold a TON project with the Blueprint SDK and write a simple storage contract in TACT. * You’ll build the contract and test it on a local sandbox chain before deployment. * You’ll deploy to the TON testnet through a Chainstack node and interact with the contract by sending it messages. * By the end, you’ll have a complete develop-test-deploy workflow for TON smart contracts. ## Main article The Open Network ([TON](https://docs.ton.org/)) is a decentralized and open platform comprising several components, including TON Blockchain, TON DNS, TON Storage, and TON Sites. Originally developed by the Telegram team, TON aims to provide fast, secure, and user-friendly decentralized services. Its core protocol, TON Blockchain, connects the underlying infrastructure to form the greater TON Ecosystem. TON is focused on achieving widespread cross-chain interoperability within a highly scalable and secure framework. Designed as a distributed supercomputer or “superserver,” TON provides various products and services to contribute to developing the decentralized vision for the new internet. This makes it well-suited for a wide range of decentralized applications.
One of the unique features of TON smart contracts is that they can receive messages and act on them; we'll leverage this in the smart contract we'll develop. This tutorial will guide you through creating, building, testing, deploying, and interacting with a simple storage contract on the TON blockchain. We'll use the [Blueprint SDK](https://docs.ton.org/develop/smart-contracts/sdk/javascript). The Blueprint SDK is an essential tool for developers working with the TON blockchain. It provides a comprehensive suite of tools and libraries that simplify the process of writing, testing, and deploying smart contracts. ## Prerequisites * [Chainstack account](https://console.chainstack.com/) to deploy a TON testnet node * [Node.js](https://nodejs.org/en) * A TON wallet; we used Tonkeeper. You can choose one on [ton.org](https://ton.org/wallets?locale=en\&pagination%5Blimit%5D=-1). * Some testnet tokens; get some from the [TON faucet](https://t.me/testgiver_ton_bot). ### Run nodes on Chainstack [Start for free](https://console.chainstack.com/) and get your app to production levels immediately. No credit card required. You can sign up with your GitHub, X, Google, or Microsoft account. ## Project overview This will be an introductory project where we use the Blueprint SDK to develop, test, and deploy a simple storage-style smart contract where we can save a number and increment a counter. The smart contract is written in TACT, TON's smart contract language, which is comparable to Solidity for EVM-based chains. ## Getting started Let's create a new TON project and run the initialization command in a new directory. ```bash Shell npm create ton@latest ``` Select an empty project in TACT. When you create a new project using the Blueprint SDK, the directory structure is organized to help you efficiently manage your smart contract development process.
Here's a brief overview of the main directories and their purposes: * **build/**: This directory contains the compiled artifacts of your smart contracts. After you run the build command, the output files will be stored here, including the compiled contract bytecode and other related artifacts. * **contracts/**: This is where you write your smart contracts. All `.tact` files containing the contract code are placed in this directory. For instance, the `SimpleStorage` contract will reside here. * **scripts/**: This directory is used for deployment and interaction scripts. These scripts facilitate deploying your contracts to the TON blockchain and interacting with them. For example, scripts to deploy the contract or fetch data from the contract will be placed here. * **tests/**: This directory holds the test files for your smart contracts. Here, you can write tests to ensure your contracts behave as expected. The default test script verifies contract deployment, and you can extend it to test additional functionalities. * **wrappers/**: This directory contains the TypeScript wrappers generated for your contracts. These wrappers provide a convenient way to interact with your contracts in a type-safe manner from your scripts and tests. ## Contract development Now, we are ready to start developing the contract. You'll find a `.tact` contract in the `contracts` directory, paste the following code in it. ```sol simple_contract.tact import "@stdlib/deploy"; // Allows the contract to receive a custom object of type "Save" specifying the number to save in the contract. message Save { amount: Int as uint32; } // This is an example of a simple storage contract. It has a function that increments the saved number by one when the function is called. 
contract SimpleContract with Deployable { // Declare variables // Variables structure: name: type id: Int as uint32; // This is an ID for contract deployment savedNumber: Int as uint32; counter: Int as uint32; // init is similar to a constructor in Solidity init(id: Int) { // Init the id passed from the constructor self.id = id; // Initialize the variables to zero when the contract is deployed self.savedNumber = 0; self.counter = 0; } // TON contracts can receive messages // This function performs an action when a specific message is received // In this case, when the contract receives the message "add 1", it will increment the counter variable by 1 receive("add 1"){ self.counter = self.counter + 1; } // This allows the contract to receive objects, in this case of type "Save" // Save a value in the contract receive(msg: Save){ self.savedNumber = msg.amount; } // Getter function to read the variable get fun Number(): Int { // Int is the type of value returned return self.savedNumber; } // Getter function to read the counter variable get fun Counter(): Int { // Int is the type of value returned return self.counter; } // Getter function for the ID get fun Id(): Int { return self.id; } } ``` Here is a breakdown of how the code works. ### Breakdown of the Contract 1. **Imports and Message Declaration** * The contract imports necessary modules using `import "@stdlib/deploy";`. * A custom message type `Save` is declared, which contains an `amount` field of type `Int` (aliased as `uint32`). ```tact tact import "@stdlib/deploy"; message Save { amount: Int as uint32; } ``` 2. **Contract Declaration** * The `SimpleContract` is declared with the `Deployable` trait, allowing it to be deployed on the blockchain. * The contract contains three variables: `id`, `savedNumber`, and `counter`, all of type `Int` (aliased as `uint32`). ```tact tact contract SimpleContract with Deployable { id: Int as uint32; savedNumber: Int as uint32; counter: Int as uint32; ``` 3.
**Initialization** * The `init` function acts as a constructor and initializes the contract with an `id`. We need to give a custom ID because the smart contract address will be determined during deployment based on the code and initial status. * The variables `savedNumber` and `counter` are set to zero upon deployment. ```tact tact init(id: Int) { self.id = id; self.savedNumber = 0; self.counter = 0; } ``` 4. **Message Handlers** * The contract can receive messages to perform specific actions. * The `receive("add 1")` handler increments the `counter` by 1 when it receives the message "add 1". ```tact tact receive("add 1") { self.counter = self.counter + 1; } ``` * The `receive(msg: Save)` handler allows the contract to receive an object of type `Save` and store the `amount` in `savedNumber`. ```tact tact receive(msg: Save) { self.savedNumber = msg.amount; } ``` 5. **Getter Functions** * The contract includes getter functions to retrieve the values of `savedNumber`, `counter`, and `id`. ```tact tact get fun Number(): Int { return self.savedNumber; } get fun Counter(): Int { return self.counter; } get fun Id(): Int { return self.id; } ``` The `SimpleContract` is a straightforward example of a storage contract on the TON blockchain. It demonstrates basic functionalities such as initializing variables, handling messages to perform specific actions, and providing getter functions to retrieve stored values. In the next sections, we will build, test, deploy, and interact with this contract using the Blueprint SDK. ## Build the contract Once we have the contract, we can run the build command to compile it and ensure no error. ```bash Shell npx blueprint build ``` It will compile and build the contract ```bash Shell Using file: SimpleContract Build script running, compiling SimpleContract ⏳ Compiling... 
> 👀 Enabling debug > SimpleContract: tact compiler > SimpleContract: func compiler > SimpleContract: fift decompiler > Packaging > SimpleContract > Bindings > SimpleContract > Reports > SimpleContract ⚠️ Make sure to disable debug mode in contract wrappers before doing production deployments! Compiled successfully! Cell BOC result: { "hash": "8bb0916eb10debd617ebaba79be7097cc21e41597dc940d16af521dbed9dad25", "hashBase64": "i7CRbrEN69YX66unm+cJfMIeQVl9yUDRavUh2+2drSU=", "hex": "b5ee9c724102110100029f000114ff00f4a413f4bcf2c80b0102016202070298d001d0d3030171b0a301fa400120d74981010bbaf2e08820d70b0a208104ffbaf2d0898309baf2e088545053036f04f86102f862db3c59db3cf2e082c8f84301cc7f01ca000101cb1fc9ed54090301f6eda2edfb0192307fe07021d749c21f953020d70b1fde208210946a98b6ba8ea830d31f018210946a98b6baf2e081d33f0131c8018210aff90f5758cb1fcb3fc9f84201706ddb3c7fe0c0008e2af90182eb0d7299a100d9de4cf674453aeb6aa320067fc00702b62aa29243621d23217dba94a47fdb31e09130e27004013a6d6d226eb3995b206ef2d0806f22019132e2102470030480425023db3c0501cac87101ca01500701ca007001ca02500520d74981010bbaf2e08820d70b0a208104ffbaf2d0898309baf2e088cf165003fa027001ca68236eb3917f93246eb3e2973333017001ca00e30d216eb39c7f01ca0001206ef2d08001cc95317001ca00e2c901fb000600987f01ca00c87001ca007001ca00246eb39d7f01ca0004206ef2d0805004cc9634037001ca00e2246eb39d7f01ca0004206ef2d0805004cc9634037001ca00e27001ca00027f01ca0002c958cc020120080c020fbd10c6d9e6d9e18c090b013ced44d0d401f863d2000194d31f0131e030f828d70b0a8309baf2e089db3c0a0002700002200201200d0e00b9bbbd182705cec3d5d2cae7b1e84ec39d64a851b6682709dd6352d2b647cb322d3af2dfdf1623982702055c01b80676394ce583aae4725b2c382701bd49def954596f1c753d3de0559c32682709d974e5ab34ecb733a0e966d9466e8a480201480f100011b0afbb5134348000600075b26ee3435697066733a2f2f516d576165594177773744717159335651704e5136456232414146466d67416346323365383955655a7947327764820ab944fa3" } Wrote compilation artifact to build/SimpleContract.compiled.json ``` ## Test the contract Once the contract compiles, we can test it; the 
test file is in the `tests` directory. The default test script verifies that the contract deploys correctly. Let's edit it to also check that the counter and the save features work. Paste the following code. ```typescript SimpleContract.spec.ts import { Blockchain, SandboxContract, TreasuryContract } from '@ton/sandbox'; import { toNano } from '@ton/core'; import { SimpleContract } from '../wrappers/SimpleContract'; import '@ton/test-utils'; // On TON we can test by creating a virtual chain describe('SimpleContract', () => { let blockchain: Blockchain; // Init a virtual chain let deployer: SandboxContract<TreasuryContract>; let simpleContract: SandboxContract<SimpleContract>; // Init the smart contract instance const contractId = 1648n; // Id for deployment that will be passed in the constructor. Random value in this example beforeEach(async () => { blockchain = await Blockchain.create(); simpleContract = blockchain.openContract(await SimpleContract.fromInit(contractId)); // Init the deployer. It comes with 1M TON tokens deployer = await blockchain.treasury('deployer'); const deployResult = await simpleContract.send( deployer.getSender(), { value: toNano('0.05'), // Value to send to the contract }, { $$type: 'Deploy', // This is because the contract inherits the Deployable trait. queryId: 0n, }, ); // Here is the test. In this case it tests that the contract is deployed correctly.
expect(deployResult.transactions).toHaveTransaction({ from: deployer.address, to: simpleContract.address, deploy: true, success: true, }); }); it('should deploy', async () => { // the check is done inside beforeEach // blockchain and simpleContract are ready to use console.log('Deploying contract...'); const contractId = await simpleContract.getId(); console.log(`Fetched ID during deployment: ${contractId}`); }); it('should increase', async () => { console.log('Testing increase by 1 function...'); const counterBefore = await simpleContract.getCounter(); console.log('counterBefore - ', counterBefore); await simpleContract.send( deployer.getSender(), { value: toNano('0.02'), }, 'add 1', // The message the contract expects ); const counterAfter = await simpleContract.getCounter(); console.log('counterAfter - ', counterAfter); // Check it incremented the value expect(counterBefore).toBeLessThan(counterAfter); }); it('should save the amount', async () => { console.log('Testing the save function...'); const numberBefore = await simpleContract.getNumber(); const amount = 10n; console.log(`Value to save: ${amount}`); console.log(`Number saved before: ${numberBefore}`); await simpleContract.send( deployer.getSender(), { value: toNano('0.02'), }, { $$type: 'Save', // This time it's an object and not just text amount: amount, }, ); const numberAfter = await simpleContract.getNumber(); console.log(`Number saved after: ${numberAfter}`); // Check the value was saved expect(numberAfter).toEqual(amount); }); }); ``` ### Breakdown of the Test File Here is a quick breakdown; the test and interaction scripts are written in TypeScript. The idea is that the test file spins up a virtual chain to run the tests on with `let blockchain: Blockchain; // Init a virtual chain`. 1. **Imports** * Import necessary modules and utilities from the TON Sandbox, core libraries, and the `SimpleContract` wrapper.
```typescript TypeScript import { Blockchain, SandboxContract, TreasuryContract } from '@ton/sandbox'; import { toNano } from '@ton/core'; import { SimpleContract } from '../wrappers/SimpleContract'; import '@ton/test-utils'; ``` 2. **Describe Block** * Define the test suite for the `SimpleContract`. Inside the `describe` block, we initialize variables for the blockchain, deployer, and contract instances. ```typescript TypeScript describe('SimpleContract', () => { let blockchain: Blockchain; // Init a virtual chain let deployer: SandboxContract<TreasuryContract>; let simpleContract: SandboxContract<SimpleContract>; // Init the smart contract instance const contractId = 1648n; // Id for deployment that will be passed in the constructor. Random value in this example ``` 3. **beforeEach Hook** * This hook runs before each test. It sets up the blockchain environment, initializes the contract, and deploys it using a deployer with 1M TON tokens available. The deployment is then verified to ensure the contract is deployed successfully. ```typescript TypeScript beforeEach(async () => { blockchain = await Blockchain.create(); simpleContract = blockchain.openContract(await SimpleContract.fromInit(contractId)); // Init the deployer. It comes with 1M TON tokens deployer = await blockchain.treasury('deployer'); const deployResult = await simpleContract.send( deployer.getSender(), { value: toNano('0.05'), // Value to send to the contract }, { $$type: 'Deploy', // This is because the contract inherits the Deployable trait. queryId: 0n, }, ); // Here is the test. In this case it tests that the contract is deployed correctly. expect(deployResult.transactions).toHaveTransaction({ from: deployer.address, to: simpleContract.address, deploy: true, success: true, }); }); ``` 4. **Test: should deploy** * This test checks if the contract is deployed correctly by fetching the contract ID. The actual deployment check is handled in the `beforeEach` hook.
```typescript TypeScript it('should deploy', async () => { // the check is done inside beforeEach // blockchain and simpleContract are ready to use console.log('Deploying contract...'); const contractId = await simpleContract.getId(); console.log(`Fetched ID during deployment: ${contractId}`); }); ``` 5. **Test: should increase** * This test verifies the functionality of the `add 1` message. It retrieves the counter value before and after sending the message and checks if it has increased. ```typescript TypeScript it('should increase', async () => { console.log('Testing increase by 1 function...'); const counterBefore = await simpleContract.getCounter(); console.log('counterBefore - ', counterBefore); await simpleContract.send( deployer.getSender(), { value: toNano('0.02'), }, 'add 1', // The message the contract expects ); const counterAfter = await simpleContract.getCounter(); console.log('counterAfter - ', counterAfter); // Check it incremented the value expect(counterBefore).toBeLessThan(counterAfter); }); ``` 6. **Test: should save the amount** * This test checks the functionality of saving a specified amount in the contract. It sends a `Save` message and verifies if the `savedNumber` variable is updated correctly. Run the test with the test command. ```bash Shell npx blueprint test ``` ## Deploy to the TON chain The Blueprint SDK allows you to deploy contracts to the mainnet or testnet and it provides endpoints out of the box, but in this case we want to use the Chainstack endpoint we deployed since it performs better and is more reliable. We can add it in a configuration file: in the project's root, create a new file named `blueprint.config.ts` and paste the code.
```typescript blueprint.config.ts import { Config } from '@ton/blueprint'; export const config: Config = { network: { endpoint: 'YOUR_CHAINSTACK_ENDPOINT', type: 'testnet', version: 'v4', //key: 'YOUR_API_KEY', }, }; ``` Now that the custom endpoint is configured, edit the deploy script in `scripts` to include the contract ID; in this case, the ID is a random value, but you may change it; keep in mind that the ID determines the contract's resulting address. ```ts deploySimpleContract.ts import { toNano } from '@ton/core'; import { SimpleContract } from '../wrappers/SimpleContract'; import { NetworkProvider } from '@ton/blueprint'; export async function run(provider: NetworkProvider) { // Edit this ID const contractId = 1648n; const simpleContract = provider.open(await SimpleContract.fromInit(contractId)); await simpleContract.send( provider.sender(), { value: toNano('0.5'), }, { $$type: 'Deploy', queryId: 0n, }, ); // Deploy contract await provider.waitForDeploy(simpleContract.address); console.log(`Deployed at address ${simpleContract.address}`); // run methods on `simpleContract` } ``` Blueprint allows various deployment options; in this case, we'll use the CLI and the run command directly. Find more methods in the [Blueprint SDK docs](https://github.com/ton-org/blueprint?tab=readme-ov-file#deploy-one-of-the-contracts). First, if you want to deploy with your wallet's mnemonic phrase, export it as environment variables from the terminal; you can also use Tonkeeper and the app from your phone (more secure). ```bash Shell export WALLET_MNEMONIC="" export WALLET_VERSION="v4" ``` Run the `run` command and follow the instructions; we used the mnemonic deployment in this case. ```bash Shell npx blueprint run ``` Example result: ```bash Shell Using file: deploySimpleContract ? Which network do you want to use? testnet ? Which wallet are you using?
Mnemonic Connected to wallet at address: EQDrNXDLYKstXHj5xV6_md1nYvvrb6y6v4bFyTZReZ-vFYdx Sent transaction Contract deployed at address EQDVoYZ96Gtc-nQM0U4-rj0mporVOTlSpmB64Tn6HJax98VN You can view it at https://testnet.tonscan.org/address/EQDVoYZ96Gtc-nQM0U4-rj0mporVOTlSpmB64Tn6HJax98VN Deployed at address EQDVoYZ96Gtc-nQM0U4-rj0mporVOTlSpmB64Tn6HJax98VN ``` ## Interact with the contract Read data from the contract. Make a new file in the `scripts` directory named `getCounter.ts`: ```ts ts import { SimpleContract } from '../wrappers/SimpleContract'; import { NetworkProvider } from '@ton/blueprint'; export async function run(provider: NetworkProvider) { const contractId = 1648n; // Random in this case const simpleContract = provider.open(await SimpleContract.fromInit(contractId)); const id = await simpleContract.getId(); const savedNumber = await simpleContract.getNumber(); const counter = await simpleContract.getCounter(); console.log(`Fetching smart contract data...`); console.log(`Contract ID: ${id}`); console.log(`Current saved number: ${savedNumber}`); console.log(`Current counter: ${counter}`); } ``` Run it with the same `run` command and follow the instructions: ```bash Shell npx blueprint run ``` Result: ```bash Shell ? Choose file to use ? Choose file to use getCounter ? Which network do you want to use? ? Which network do you want to use? testnet ? Which wallet are you using? ? Which wallet are you using? Mnemonic Connected to wallet at address: EQDrNXDLYKstXHj5xV6_md1nYvvrb6y6v4bFyTZReZ-vFYdx Fetching smart contract data... Contract ID: 1648 Current counter: 0 ``` Now use the wallet to send a transaction, including the message "Save" and some TON tokens to save the value.
Make a new script in `scripts` named `addValue`: ```ts ts import { toNano } from '@ton/core'; import { SimpleContract } from '../wrappers/SimpleContract'; import { NetworkProvider } from '@ton/blueprint'; export async function run(provider: NetworkProvider) { const contractId = 1648n; const simpleContract = provider.open(await SimpleContract.fromInit(contractId)); const id = await simpleContract.getId(); const savedNumber = await simpleContract.getNumber(); console.log(`Sending a value to save...`); console.log(`Contract ID: ${id}`); console.log(`Current saved number: ${savedNumber}`); // Send the Save message with an amount of 7 await simpleContract.send(provider.sender(), { value: toNano('0.02') }, { $$type: 'Save', amount: 7n }); } ``` Follow the same process with the `run` command, then once the transaction is validated, you can run the get script to fetch the updated value. ### About the author Director of Developer Experience @ Chainstack Talk to me all things Web3 20 years in technology | 8+ years in Web3 full time Trusted advisor helping developers navigate the complexities of blockchain infrastructure [](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake) # Global Node Source: https://docs.chainstack.com/docs/global-elastic-node ## What is a Global Node? Global Node is a service providing a load-balanced node that routes requests to the closest available location for a specific protocol, based on the caller's location. This ensures efficient service access for users worldwide by routing requests in an optimized manner. The main advantages of Global Nodes are the following: * Enhanced load balancing — global nodes include a large load balancer that can switch nodes if one fails or lags by more than 40 blocks, thus ensuring uninterrupted service.
* Reduced latency — by distributing traffic to the nearest endpoint, global node reduces latency, leading to faster transactions and improved user experience. * Global reach — global nodes can be accessed by anyone from any location in the world. They direct users to the endpoints nearest to their location, which maximizes the availability and responsiveness of services. * High availability — global nodes are designed to have 99.95% availability. This ensures that your DApp continues to run with minimal interruptions. * Instant deployment — unlike regional trader nodes, which take 3-6 minutes to deploy, Global Node is ready in seconds. # Gnosis Chain Source: https://docs.chainstack.com/docs/gnosis-chain See also [interactive Gnosis Chain API call examples](/reference/gnosis-getting-started). ## Execution layer | Method | Availability | Comment | | ---------------------------------------- | --------------------------------------------- | ------- | | eth\_accounts | | | | eth\_blobBaseFee | | | | eth\_blockNumber | | | | eth\_call | | | | eth\_chainId | | | | eth\_estimateGas | | | | eth\_feeHistory | | | | eth\_gasPrice | | | | eth\_getAccount | | | | eth\_getBalance | | | | eth\_getBlockByHash | | | | eth\_getBlockByNumber | | | | eth\_getBlockReceipts | | | | eth\_getBlockTransactionCountByHash | | | | eth\_getBlockTransactionCountByNumber | | | | eth\_getCode | | | | eth\_getFilterChanges | | | | eth\_getFilterLogs | | | | eth\_getLogs | | | | eth\_getProof | | | | eth\_getStorageAt | | | | eth\_getTransactionByBlockHashAndIndex | | | | eth\_getTransactionByBlockNumberAndIndex | | | | eth\_getTransactionByHash | | | | eth\_getTransactionCount | | | | eth\_getTransactionReceipt | | | | eth\_getUncleCountByBlockHash | | | | eth\_getUncleCountByBlockNumber | | | | eth\_maxPriorityFeePerGas | | | | eth\_newBlockFilter | | | | eth\_newFilter | | | | eth\_newPendingTransactionFilter | | | | eth\_signTransaction | | | | eth\_subscribe | | | | eth\_syncing | | | | 
eth\_uninstallFilter | | | | eth\_unsubscribe | | | | eth\_sendRawTransaction | | | | net\_listening | | | | net\_peerCount | | | | net\_version | | | | txpool\_content | | | | txpool\_inspect | | | | txpool\_contentFrom | | | | txpool\_status | | | | web3\_clientVersion | | | | web3\_sha3 | | | | debug\_getBadBlocks | | | | debug\_storageRangeAt | | | | debug\_getTrieFlushInterval | | | | debug\_traceBlock | | | | debug\_traceBlockByHash | | | | debug\_traceBlockByNumber | | | | debug\_traceCall | | | | debug\_traceTransaction | | | | trace\_block | | | | trace\_call | | | | trace\_callMany | | | | trace\_filter | | | | trace\_rawTransaction | | | | trace\_replayBlockTransactions | | | | trace\_replayTransaction | | | | trace\_transaction | | | | parity\_clearEngineSigner | | | | parity\_enode | | | | parity\_getBlockReceipts | | | | parity\_netPeers | | | | parity\_pendingTransactions | | | | parity\_setEngineSigner | | | | parity\_setEngineSignerSecret | | | | admin\_addPeer | | | | admin\_addTrustedPeer | | | | admin\_datadir | | | | admin\_exportChain | | | | admin\_importChain | | | | admin\_nodeInfo | | | | admin\_peerEvents | | | | admin\_peers | | | | admin\_removePeer | | | | admin\_removeTrustedPeer | | | | admin\_startHTTP | | | | admin\_startWS | | | | admin\_stopHTTP | | | | admin\_stopWS | | | ## Consensus layer (Beacon Chain) | Method | Availability | Comment | | -------------------------------------------------------------- | --------------------------------------------- | ------- | | /eth/v1/beacon/blocks/\{block\_id}/attestations | | | | /eth/v1/beacon/blocks/\{block\_id}/root | | | | /eth/v1/beacon/blob\_sidecars/\{\{block\_id}} | | | | /eth/v1/beacon/genesis | | | | /eth/v1/beacon/headers | | | | /eth/v1/beacon/headers/\{block\_id} | | | | /eth/v1/beacon/states/\{state\_id}/committees | | | | /eth/v1/beacon/states/\{state\_id}/finality\_checkpoints | | | | /eth/v1/beacon/states/\{state\_id}/fork | | | | /eth/v1/beacon/states/\{state\_id}/root | 
| | | /eth/v1/beacon/states/\{state\_id}/sync\_committees | | | | /eth/v1/beacon/states/\{state\_id}/validator\_balances | | | | /eth/v1/beacon/states/\{state\_id}/validators | | | | /eth/v1/beacon/states/\{state\_id}/validators/\{validator\_id} | | | | /eth/v1/beacon/rewards/sync\_committee/\{block\_id} | | | | /eth/v1/beacon/rewards/blocks/\{block\_id} | | | | /eth/v1/beacon/rewards/attestations/\{epoch} | | | | /eth/v1/config/deposit\_contract | | | | /eth/v1/config/spec | | | | /eth/v1/events | | | | /eth/v1/node/peer\_count | | | | /eth/v1/node/peers | | | | /eth/v1/node/syncing | | | | /eth/v1/node/version | | | | /eth/v1/validator/aggregate\_attestation | | | | /eth/v1/validator/blinded\_blocks/\{slot} | | | | /eth/v1/validator/duties/attester/\{epoch} | | | | /eth/v1/validator/duties/proposer/\{epoch} | | | | /eth/v1/validator/sync\_committee\_contribution | | | | /eth/v2/beacon/blocks/\{block\_id} | | | | /eth/v2/debug/beacon/states/\{state\_id} | | | # Gnosis Chain tooling Source: https://docs.chainstack.com/docs/gnosis-tooling ## MetaMask On [node access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**. ## Truffle Configure [Truffle Suite](https://truffleframework.com) to deploy contracts to your Gnosis Chain nodes. Install [Truffle Suite](https://truffleframework.com), [HD Wallet-enabled Web3 provider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider), and create a project. Create a new environment in `truffle-config.js`, add your mnemonic phrase generated by [a wallet](https://ethereum.org/en/wallets/find-wallet/) and the Gnosis Chain endpoint instead of YOUR\_CHAINSTACK\_ENDPOINT: ```javascript Javascript const HDWalletProvider = require("@truffle/hdwallet-provider"); const mnemonic = 'pattern enroll upgrade ...'; ... 
module.exports = { networks: { chainstack: { provider: () => new HDWalletProvider(mnemonic, "YOUR_CHAINSTACK_ENDPOINT"), network_id: "*" }, }, }; ``` ## Hardhat Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your Gnosis Chain nodes. Install [Hardhat](https://hardhat.org/) and create a project. Create a new environment in `hardhat.config.js`: ```javascript Javascript require("@nomiclabs/hardhat-waffle"); ... module.exports = { solidity: "0.7.3", networks: { chainstack: { url: "YOUR_CHAINSTACK_ENDPOINT", accounts: ["YOUR_PRIVATE_KEY"] }, } }; ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). * YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract Run `npx hardhat run scripts/deploy.js --network chainstack` and Hardhat will deploy using Chainstack. See also [Forking EVM-compatible mainnet with Hardhat](https://support.chainstack.com/hc/en-us/articles/900004242406). ## Remix IDE To make Remix IDE interact with the network through a Chainstack node: Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Interacting through MetaMask](/docs/gnosis-tooling#metamask). In Remix IDE, navigate to the **Deploy** tab. Select **Injected Provider - MetaMask** in **Environment**. This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node. ## web3.js Build DApps using [web3.js](https://github.com/ethereum/web3.js/) and Gnosis Chain nodes deployed with Chainstack. Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP or WebSocket.
### HTTP Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. ### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node WSS endpoint protected either with the key or password. ## web3.py Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Gnosis Chain nodes deployed with Chainstack. Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `HTTPProvider` to connect to your node endpoint and get the latest block number. ```python Key Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.blockNumber) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.blockNumber) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * HOSTNAME — your node HTTPS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). 
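The password-protected variant above simply embeds basic-auth credentials in the endpoint URL. If your username or password contains characters such as `@` or `:`, they must be percent-encoded first, or the URL structure breaks. A small helper sketches this (the function name is illustrative, not part of web3.py):

```python
from urllib.parse import quote


def basic_auth_endpoint(scheme: str, username: str, password: str, hostname: str) -> str:
    """Build an endpoint URL with embedded basic-auth credentials.

    The username and password are percent-encoded so that characters
    like '@' or ':' in a password do not break the URL structure.
    """
    user = quote(username, safe="")
    pwd = quote(password, safe="")
    return f"{scheme}://{user}:{pwd}@{hostname}"


# A password containing '@' and ':' is encoded as '%40' and '%3A'
print(basic_auth_endpoint("https", "user", "p@ss:word", "nd-123-456-789.example.com"))
# -> https://user:p%40ss%3Aword@nd-123-456-789.example.com
```

The resulting string can then be passed to `Web3.HTTPProvider` in place of the `'https://%s:%s@%s'` formatting shown above.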
### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number. ```python Key Protected from web3 import Web3 web3 = Web3(Web3.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.blockNumber) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.WebsocketProvider('wss://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.blockNumber) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password * HOSTNAME — your node WSS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [WebSocket connection to an EVM node](https://support.chainstack.com/hc/en-us/articles/900001918763-WebSocket-connection-to-an-Ethereum-node). ## web3.php Build DApps using [web3.php](https://github.com/web3p/web3.php) and Gnosis Chain nodes deployed with Chainstack. Install [web3.php](https://github.com/web3p/web3.php). Connect over HTTP: ```php Php <?php require_once "vendor/autoload.php"; use Web3\Web3; $web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT'); ?> ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. Use [JSON-RPC methods](https://eth.wiki/json-rpc/API) to interact with the node. Example to get the latest block number: ```php Php <?php require_once "vendor/autoload.php"; use Web3\Web3; $web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT'); $eth = $web3->eth; $eth->blockNumber(function ($err, $data) { print "$data \n"; }); ?> ``` ## web3j Build DApps using [web3j](https://github.com/web3j/web3j) and Gnosis Chain nodes deployed with Chainstack. Use the `HttpService` object to connect to your node endpoint.
Example to get the latest block number: ```java Java package getLatestBlock; import java.io.IOException; import java.util.logging.Level; import java.util.logging.Logger; import org.web3j.protocol.Web3j; import org.web3j.protocol.core.DefaultBlockParameterName; import org.web3j.protocol.core.methods.response.EthBlock; import org.web3j.protocol.exceptions.ClientConnectionException; import org.web3j.protocol.http.HttpService; import okhttp3.Authenticator; import okhttp3.Credentials; import okhttp3.OkHttpClient; import okhttp3.Request; import okhttp3.Response; import okhttp3.Route; public final class App { private static final String USERNAME = "USERNAME"; private static final String PASSWORD = "PASSWORD"; private static final String ENDPOINT = "ENDPOINT"; public static void main(String[] args) { try { OkHttpClient.Builder clientBuilder = new OkHttpClient.Builder(); clientBuilder.authenticator(new Authenticator() { @Override public Request authenticate(Route route, Response response) throws IOException { String credential = Credentials.basic(USERNAME, PASSWORD); return response.request().newBuilder().header("Authorization", credential).build(); } }); HttpService service = new HttpService(ENDPOINT, clientBuilder.build(), false); Web3j web3 = Web3j.build(service); EthBlock.Block latestBlock = web3.ethGetBlockByNumber(DefaultBlockParameterName.LATEST, false).send().getBlock(); System.out.println("Latest Block: #" + latestBlock.getNumber()); } catch (IOException | ClientConnectionException ex) { Logger.getLogger(App.class.getName()).log(Level.SEVERE, null, ex); } } } ``` where * ENDPOINT — your node HTTPS endpoint * USERNAME — your node access username * PASSWORD — your node access password See also [the full code on GitHub](https://github.com/chainstack/web3j-getLatestBlock). ## ethers.js Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Gnosis Chain nodes deployed with Chainstack. Install [ethers.js](https://www.npmjs.com/package/ethers).
Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number: ```javascript Key Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` ```javascript Password Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT', user: 'USERNAME', password: 'PASSWORD' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) * NETWORK\_ID — Gnosis Chain network ID: * Mainnet: `100` * Testnet: `10200` See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). ### WebSocket Use the `WebSocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const { ethers } = require("ethers"); const provider = new ethers.providers.WebSocketProvider('YOUR_CHAINSTACK_ENDPOINT', NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password * NETWORK\_ID — Gnosis Chain network ID: * Mainnet: `100` * Testnet: `10200` See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). ## Brownie Install [Brownie](https://eth-brownie.readthedocs.io/en/stable/install.html).
Use the `brownie networks add` command with the node endpoint: ```shell Shell brownie networks add Gnosis ID name="NETWORK_NAME" host=YOUR_CHAINSTACK_ENDPOINT chainid=NETWORK_ID ``` where * ID — any name that you will use as the network tag to run a deployment. For example, `chainstack-mainnet`. * NETWORK\_NAME — any name that you want to identify the network by in the list of networks. For example, **Mainnet (Chainstack)**. * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password * NETWORK\_ID — Gnosis Chain network ID: * Mainnet: `100` * Testnet: `10200` Example to run the deployment script: ```shell Shell brownie run deploy.py --network chainstack-mainnet ``` ## Foundry Install [Foundry](https://github.com/gakonst/foundry/). Use `--rpc-url` to run the operation through your Chainstack node. ### Forge Use [forge](https://github.com/gakonst/foundry/tree/master/forge) to develop, test, and deploy your smart contracts. To deploy a contract: ```shell Shell forge create CONTRACT_NAME --contracts CONTRACT_PATH --private-key YOUR_PRIVATE_KEY --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where * CONTRACT\_NAME — name of the contract in the Solidity source code * CONTRACT\_PATH — path to your smart contract * YOUR\_PRIVATE\_KEY — the private key to your funded account that you will use to deploy the contract * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password ### Cast Use [cast](https://github.com/gakonst/foundry/tree/master/cast) to interact with the network and the deployed contracts.
To get the latest block number: ```shell Shell cast block-number --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password # Gnosis Chain: Simple Soulbound Token with Remix and OpenZeppelin Source: https://docs.chainstack.com/docs/gnosis-tutorial-simple-soulbound-token-with-remix-and-openzeppelin **TLDR** * Soulbound tokens are non-transferable ERC-721s, enforced via a transfer override, so they can only be minted or burned. * This tutorial covers creating and deploying a soulbound token contract on the Gnosis Chain Chiado testnet through a Chainstack node. * You’ll use Remix for development, MetaMask for signing, and Blockscout to verify and interact with the contract. * Minting a token assigns it to an address, and that address alone can burn it, ensuring the token never changes ownership. ## Main article Soulbound tokens, [originally proposed by Vitalik Buterin](https://vitalik.ca/general/2022/01/26/soulbound.html), at their core are simply non-transferable NFTs. In this tutorial, you will: * Create an ERC-721 contract that has a transfer override to make the token soulbound. * Deploy the contract on the Gnosis Chain Chiado testnet through a node deployed with Chainstack. * Interact with the deployed contract. ## Prerequisites * [Chainstack account](https://console.chainstack.com/) to deploy a Gnosis Chain node. * [Remix IDE](https://remix.ethereum.org/) to compile the contract and deploy through MetaMask. * [MetaMask](https://metamask.io/) to deploy the contract through your Chainstack node and interact with the contract. ## Overview To get from zero to a deployed soulbound token contract on the Gnosis Chain Chiado testnet, do the following: With Chainstack, create a public chain project. With Chainstack, join Gnosis Chain Chiado testnet. With Chainstack, access your Gnosis Chain node credentials. Set up your MetaMask to work through a Chainstack node. 
With Remix IDE, create and compile the soulbound contract. With Remix IDE, deploy the contract on the Gnosis Chain Chiado testnet. Issue a soulbound token and burn it. ## Step-by-step ### Create a public chain project See [Create a project](/docs/manage-your-project#create-a-project). ### Join the Gnosis Chiado testnet See [Join a public network](/docs/manage-your-networks#join-a-public-network). ### Get your Gnosis Chain node access and credentials See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials). ### Set up MetaMask See [Gnosis Chain tooling: MetaMask](/docs/gnosis-tooling#metamask). ### Create and compile the soulbound contract Open [Remix IDE](https://remix.ethereum.org/). On the home page, click **Environments** > **Solidity**. On the left pane, click **File explorers** > **contracts** > **New File**. In the modal, give any name to your contract. For example, `soulbound.sol`. Put in the contract code: ```solidity solidity // SPDX-License-Identifier: MIT pragma solidity ^0.8.7; import "@openzeppelin/contracts/token/ERC721/ERC721.sol"; import "@openzeppelin/contracts/utils/Counters.sol"; import "@openzeppelin/contracts/access/Ownable.sol"; contract SoulBoundToken is ERC721, Ownable { using Counters for Counters.Counter; Counters.Counter private _tokenIdCounter; constructor() ERC721("SoulBoundToken", "SBT") {} function safeMint(address to) public onlyOwner { uint256 tokenId = _tokenIdCounter.current(); _tokenIdCounter.increment(); _safeMint(to, tokenId); } function burn(uint256 tokenId) external { require(ownerOf(tokenId) == msg.sender, "Only the owner of the token can burn it."); _burn(tokenId); } function _beforeTokenTransfer(address from, address to, uint256) pure override internal { require(from == address(0) || to == address(0), "This is a Soulbound token. It cannot be transferred. 
It can only be burned by the token owner."); } function _burn(uint256 tokenId) internal override(ERC721) { super._burn(tokenId); } } ``` This is your soulbound token contract: * It uses the audited OpenZeppelin libraries to implement the ERC-721 standard, assigns contract ownership to the deployer, and increments each issued token ID by 1. * The contract has a modification to prohibit the token transfer, which makes the issued tokens soulbound. * The contract also implements a burn function to allow the owner of the issued token to burn it. Compile the contract. On the left pane, click **Solidity compiler** > **Compile**. ### Fund your account Fund the account that you will use to deploy the contract with xDAI. Use the [xDAI testnet faucet](https://gnosisfaucet.com/). ### Set up Remix IDE to work through your Chainstack node On the left pane, click **Deploy** and switch to **Injected Provider - MetaMask**. ### Deploy the soulbound contract On the left pane: Click **Deploy & run transactions**. In contract, select **contracts/soulbound.sol**. Click **Deploy**. This will engage your MetaMask to deploy the contract to the Gnosis Chain Chiado testnet through your currently selected MetaMask account. Click **Confirm** in the MetaMask modal. ### Interact with the contract Once your contract is deployed, you can view it online at [Blockscout Gnosis Chain Chiado testnet explorer](https://blockscout.com/gnosis/chiado). You are now going to verify the contract in the Blockscout explorer so that you can use the explorer as a web app and easily interact with the contract online. ### Flatten your contract code Since your soulbound contract uses imported OpenZeppelin libraries, you must put all the imports into one `.sol` file so that Blockscout can verify it. 1. In your Remix IDE, click **Plugin manager** > **Flattener** > **Activate**. 2. Click **Flattener** > **Flatten contracts/soulbound.sol**. The flattened contract is now in your clipboard.
### Verify the deployed contract on Blockscout explorer Go to [Blockscout explorer](https://blockscout.com/gnosis/chiado). Find your deployed contract. The address of your contract is on the left pane of Remix IDE under **Deployed Contracts**. On the contract page on Blockscout, click **Code** > **Verify & Publish**. Select **Via flattened source code**. In **Contract Name**, provide the name of your contract. In our example, the name is `SoulBoundToken`. In **Compiler**, select the same compiler version that was used in Remix IDE. In **Optimization**, select **No**. In **Enter the Solidity Contract Code**, paste the flattened contract code. Click **Verify & publish**. Blockscout will take a few seconds to compile your contract, verify, and publish it. ### Issue a soulbound token Now that your soulbound contract is verified, you can use Blockscout to interact with it. On Blockscout, on your contract, click **Write Contract**. In your MetaMask, make sure you have the same address selected as the one that deployed the contract. Click **Connect wallet**. This will connect your MetaMask instance with the contract owner as the active address. In **safeMint**, provide an address that you own and to which you will issue a soulbound token. Click **Write**. This will issue a soulbound token to the provided address. ### Burn the soulbound token Now that your other account has a soulbound token, you can burn it. In your MetaMask instance, switch to the account that has a soulbound token tied to it. 1. On Blockscout, on your contract, click **Write Contract**. 2. In your MetaMask, make sure you have the address selected that owns the issued soulbound token. 3. Click **Connect wallet**. This will connect your MetaMask instance with the token owner as the active address. 4. In **burn**, provide the token ID. If this is the first issued token, the ID is `0`. 5. Click **Write**. 
This will send the soulbound token from the current owner to the address `0x0000000000000000000000000000000000000000`. ## Conclusion This tutorial guided you through the basics of creating and deploying a simple soulbound contract on the Gnosis Chain Chiado testnet through your Chainstack-deployed node. You have also interacted with the contract, issued, and burned the token using Blockscout as a web app and MetaMask as your interaction tool that works through your Chainstack-deployed Gnosis Chain node. ### About the author Director of Developer Experience @ Chainstack Talk to me all things Web3 20 years in technology | 8+ years in Web3 full time Trusted advisor helping developers navigate the complexities of blockchain infrastructure [](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake) # Get the most out of Chainstack platform API Source: https://docs.chainstack.com/docs/guide-get-the-most-out-of-the-chainstack-platform-api **TLDR** * Generate and rotate API keys regularly, and safeguard them as environment variables to reduce unauthorized access risks. * Always consult the Chainstack API reference for endpoint details, request/response formats, and error codes. * Implement robust error handling (status codes, response parsing, retry logic) to deal with network or server issues gracefully. * Securely handle sensitive data (private keys, tokens) by encrypting storage, masking logs, and restricting access. * Thoroughly test API integrations and keep tabs on performance and error rates to maintain reliability in production. ## Introduction The Chainstack API offers you a comprehensive set of tools, making it easier to manage your organization, projects, networks, nodes, and identities programmatically. For a secure and efficient integration with the Chainstack platform, it's good to follow some best practices. 
This article will give you some top tips for using the Chainstack API effectively. ## Create and safeguard your API keys To use the Chainstack API, you'll need an API key to authenticate your requests. These keys are like access passes to your organization's resources, so they hold a lot of power. To keep your account and data secure, here's what you need to do: * Get into the habit of rotating your API keys. Just like changing your passwords regularly, it's important to update your API keys on a routine basis. This practice is a solid defense strategy that helps to lower the risk of unauthorized access. If your key accidentally leaks, it won't be of much use to anyone if it's already been changed. Learn how to [generate a new API key](/reference/platform-api-getting-started#create-api-key). * Treat your API keys like a secret treasure. Your API keys are precious, so handle them with care. Embedding them directly into your codebase or sharing them in plaintext is like leaving your house key under the doormat—not the best idea. Instead, consider using environment variables or a secure key management system. Read our [Guide to environment variables](/docs/how-to-store-your-web3-dapp-secrets-guide-to-environment-variables) for more details. ## Take advantage of the API reference To get the most out of the Chainstack API, it's a good idea to keep the official API reference documentation handy. This guide is like your API encyclopedia, packed with details about each API operation, including the parameters you'll need, the responses you can expect, and some handy examples. Here's what you can do with the API reference: * **Understand endpoint functionality**. The API reference is your go-to guide for understanding what each endpoint does. It's like a map, showing you which API endpoints to use for different tasks, whether you're creating projects, managing networks, or chatting with blockchain nodes. * **Learn request and response structures**. 
The API reference is your textbook for learning about the structure and formatting of API requests and responses. It tells you what parameters you need, their data types, and any other options or constraints. Knowing how to construct valid requests and handle the data you get back is key to being an API pro. * **Explore available query parameters**. Sometimes, the Chainstack API lets you use query parameters to tweak the behavior of certain operations. The API reference is like your instruction manual, explaining what query parameters are available and what they do. This way, you can fine-tune your requests to get exactly what you want. * **Decode error codes and messages**. The API reference also has a section on the error codes and messages you might run into while using the Chainstack API. It's like your troubleshooting guide, helping you understand what went wrong and how to fix it or work around it. It's essential to regularly refer to the API reference, as it's consistently updated to mirror any modifications or advancements in the Chainstack API. By leaning on the official documentation, you can stay in the loop and current, guaranteeing your integration with the Chainstack API remains precise, streamlined, and in sync with the newest features and enhancements. Utilizing the insights from the API reference empowers you to make educated decisions, craft well-structured API requests, manage responses aptly, and troubleshoot any challenges that might surface during the integration journey. Here you can find the [Chainstack platform API reference](https://api.chainstack.com/reference/). ## Understand and leverage API endpoints The Chainstack API offers a range of endpoints to manage different aspects of your blockchain infrastructure. Familiarize yourself with these endpoints and understand their purpose to leverage the full potential of the Chainstack platform. 
Key areas to explore include: * **Organization** — API endpoints for managing your organization's settings, billing, and user roles. * **Projects** — APIs to create, configure, and interact with blockchain projects. * **Networks** — endpoints to manage the networks within your projects, such as creating, connecting, and modifying networks. * **Nodes** — APIs for managing blockchain nodes, including creation, configuration, and monitoring. * **Identities** — endpoints to manage cryptographic identities, including key pair generation and certificate issuance. ### Examples of API calls Here you will find a collection of examples in JavaScript and Python to familiarize yourself with the Chainstack platform API. #### API key authentication The Chainstack API relies on API keys to authenticate requests. To provide your API key, include it in the `Authorization` header. The header's value should comprise the `Bearer` prefix followed by the secret key generated through the platform's user interface. Here's an example using curl: ```bash cURL curl -X GET 'https://api.chainstack.com/v1/organization/' \ --header 'Authorization: Bearer YOUR_API_KEY' ``` Note that all API requests must be made over HTTPS for security reasons. #### API calls using JavaScript This example demonstrates how you can engage with the Chainstack platform API using node.js and the Axios library. It's a practical illustration of how to communicate with the API using code. Note that this example employs the `dotenv` package to load sensitive data from environment variables. 
This is an effective strategy to ensure the security of your secret keys, for example: `BEARER_TOKEN="YOUR_API_KEY"` Make sure to install `axios` and `dotenv` before running the code: ```shell Shell npm i axios dotenv ``` ```js JavaScript require('dotenv').config(); const axios = require('axios'); async function getOrganization() { try { const response = await axios.get('https://api.chainstack.com/v1/organization/', { headers: { 'Authorization': `Bearer ${process.env.BEARER_TOKEN}` } }); console.log(response.data); } catch (error) { console.error(error); } } getOrganization(); ``` This particular example uses the **Get Organization name and ID** endpoint to fetch information about the organization. It's a straightforward way to incorporate API calls into your routine tasks. #### API calls using Python This example demonstrates how you can interact with the Chainstack platform API using Python and the requests library. It's a practical illustration of how to communicate with the API using Python code. Note that this example uses the `python-dotenv` package to load sensitive data from environment variables. This is an effective strategy to ensure the security of your secret keys. Make sure to install `requests` and `python-dotenv` before running the code: ```shell Shell pip install requests python-dotenv ``` ```python Python import os from dotenv import load_dotenv import requests load_dotenv() def get_nodes(): try: response = requests.get( 'https://api.chainstack.com/v1/nodes/', headers={'Authorization': f'Bearer {os.getenv("BEARER_TOKEN")}'} ) response.raise_for_status() print(response.json()) except requests.exceptions.RequestException as err: print(f"An error occurred: {err}") get_nodes() ``` This specific example calls the **List all Nodes** endpoint to fetch data about the nodes deployed by your organization. This makes it easy to introduce API calls into your Python workflow. 
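The Python snippet above simply prints any request error. For transient failures such as network timeouts or 5xx responses, a retry wrapper with exponential backoff is a common hardening step that also plays well with rate limits. Below is a minimal sketch using only the Python standard library; the backoff values and the bare `urllib` request are illustrative assumptions, not Chainstack-prescribed behavior, and the endpoint and token remain placeholders:

```python
import time
import urllib.error
import urllib.request


def backoff_delay(attempt, base=1.0, cap=30.0):
    """Exponential backoff: 1s, 2s, 4s, ... capped at `cap` seconds."""
    return min(cap, base * (2 ** attempt))


def get_with_retries(url, token, max_attempts=5):
    """GET `url` with a Bearer token, retrying on 5xx and network errors.

    `url` and `token` are placeholders; substitute your Chainstack
    endpoint and API key. 4xx responses are client mistakes and are
    raised immediately rather than retried.
    """
    for attempt in range(max_attempts):
        try:
            req = urllib.request.Request(
                url, headers={"Authorization": f"Bearer {token}"}
            )
            with urllib.request.urlopen(req, timeout=10) as resp:
                return resp.read()
        except urllib.error.HTTPError as err:
            if err.code < 500:
                raise  # client error: retrying will not help
        except urllib.error.URLError:
            pass  # network hiccup: fall through and retry
        time.sleep(backoff_delay(attempt))
    raise RuntimeError(f"Giving up after {max_attempts} attempts")
```

The same shape works with the `requests` library from the earlier example; only the exception types change.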
## Best practices for error handling in API requests When interacting with APIs, it's crucial to have robust error-handling mechanisms in place. This allows you to manage any errors or exceptions that might crop up gracefully. Here are some practices to consider: ### 1: Handle HTTP status codes Always check for and handle HTTP status codes returned by the API. These codes give you a heads-up about the type of response you're dealing with. For instance, a 200 status code means your request was successful, a 400 indicates a client error, and 500 points to a server error. ### 2: Analyze error responses Make sure to parse and scrutinize error responses returned by the API. The Chainstack API typically dishes out informative error messages that can help you get to the bottom of the issue. ### 3: Implement retry logic Consider setting up retry logic for certain types of errors, like network timeouts or temporary server issues. But remember, the API has rate limits. So, ensure your retry mechanism is designed to handle these limits appropriately. ## Securely manage sensitive information In your interactions with the Chainstack API, you'll likely handle sensitive information such as private keys, certificates, or access tokens. It's crucial to handle this data with utmost care to uphold a high level of security. Here are some practices to consider: * **Be cautious with logs and error messages**. It's easy to inadvertently expose sensitive data in logs or error messages. Always double-check what you're logging and avoid including sensitive information. This is especially important in debug logs, which can often be overlooked. * **Encrypt and safeguard stored data**. Any sensitive data stored on disk or in databases should be encrypted and protected. This adds an extra layer of security and ensures that even if someone gains unauthorized access to the storage, they won't be able to read the data without the encryption key. * **Use secure connections**. 
When interacting with the Chainstack API, always use secure connections (HTTPS). This encrypts the data in transit, preventing anyone from intercepting and reading your data as it travels over the network. * **Regularly review access control policies**. Access control policies determine who can access your sensitive data. Regularly reviewing and updating these policies ensures that only authorized individuals have access. This is particularly important in dynamic environments where roles and responsibilities can change frequently. By following these practices, you can significantly reduce the risk of a security breach and ensure that your sensitive data remains secure. ## Test and monitor API integrations Before deploying your application to production, thoroughly test your integration with the Chainstack API. Test various scenarios and edge cases to validate the correctness and robustness of your implementation. Additionally, implement monitoring mechanisms to track API usage, performance metrics, and error rates. This allows you to proactively identify and address any issues that may arise. ## Conclusion By following these best practices, you can effectively leverage the Chainstack API to manage your organization's blockchain infrastructure securely and efficiently. Remember to create and secure your API keys, familiarize yourself with the available API endpoints, refer to the API reference documentation, implement proper error handling, securely manage sensitive information, and thoroughly test and monitor your API integrations. Following these guidelines will help you build reliable and robust applications that integrate seamlessly with the Chainstack platform. 
# Handle real-time data using WebSocket with JavaScript and Python Source: https://docs.chainstack.com/docs/handle-real-time-data-using-websockets-with-javascript-and-python **TLDR** * Demonstrates how to connect to the Chainstack API over WebSockets for real-time blockchain data using web3.js, ethers.js, and Python. * Explains the advantages of WebSocket vs. HTTP (bidirectional, persistent connection). * Includes examples of implementing auto-reconnect logic to handle unexpected disconnections or timeouts. * Highlights the typical connection limits and the importance of retry mechanisms to maintain an uninterrupted data stream. ## Main article The Chainstack API simplifies access to your EVM node through either HTTPS or WebSocket connections. This tutorial illustrates how to utilize a WebSocket connection to retrieve real-time data employing web3.js and ethers.js, as well as Python. It also explains how to incorporate WebSocket reconnect logic to ensure that the connection is automatically reestablished if it happens to drop for any reason. ## Difference between HTTP and WebSocket protocols **HTTP** (Hypertext Transfer Protocol) operates on a unidirectional basis, whereas **WebSocket** offers bidirectional communication. In the traditional client-server model using HTTP, each request made by the client initiates a new connection. This connection is then closed once the client receives a response from the server. This cyclical process of establishing and terminating connections can be inefficient and resource-intensive, especially for applications requiring real-time updates or frequent data exchanges. > For example, how your browser requests data from the server to display this article. On the contrary, WebSocket enhances the client-server communication process by initiating a connection just once, which is then kept open and reused for as long as either the server or the client maintains it. 
This persistent connection supports a more efficient data exchange between the server and the client without constantly opening and closing connections. This key advantage of WebSocket, connection reusability, is a game-changer for real-time applications. Alongside this, it features a "keep-alive" mechanism, ensuring the connection stays active, reducing the chances of timeouts or connection drops. This constant open channel allows for seamless data transfer, promoting faster, more dynamic interactions. ## HTTP and WebSocket connection limitations ### HTTP * 60 seconds timeout for the idle connection, so the connection will be terminated if idle for 60 seconds. * Each connection supports a maximum of 1,000 requests. After serving these 1,000 requests over a single HTTP connection, the current connection is concluded, and a fresh connection is established for subsequent requests. ### WebSocket * 3,600 seconds (1 hour) timeout for the idle connection, so the connection will be terminated if idle for 3,600 seconds, equivalent to 1 hour. * A maximum of 500 concurrent connections can be maintained over WebSocket. ## Managing WebSocket reconnections While WebSockets maintain a persistent, two-way connection, preparing for scenarios where the connection might be interrupted is crucial. The examples in this guide all integrate a WebSocket reconnection logic designed to automatically reestablish the connection should it become unexpectedly terminated. ### Common WebSocket error codes Below are some of the common codes and keywords that indicate the WebSocket was disconnected and you need to implement a reconnect logic: * Code `1006` (abnormal closure) * `ECONNRESET` Implement WebSocket reconnect logic or implement a [dedicated gateway](/docs/trader-node#dedicated-gateways). ## Retrieve real-time data with WebSocket The Chainstack API allows you to use subscriptions to retrieve real-time data about blocks, pending transactions, and logs. 
Leveraging subscriptions, you can keep your application informed with recent developments, ensuring an accurate and timely representation of the blockchain state. Learn more about subscriptions in the [API documentation](/reference/ethereum-web3js-subscriptions-methods). ### Real-time data with WebSocket and web3.js The web3.js library integrates subscription functionalities, enabling you to get real-time data from the blockchain effortlessly. Learn how to set up the best [node.js environment](/docs/web3-nodejs-from-zero-to-a-full-fledged-project) for your JavaScript applications. First, install the web3.js library by running the following command: ```shell Shell npm i web3 ``` The following example demonstrates using the web3.js library to receive real-time block headers, including WebSocket reconnect logic. ```javascript Javascript const { Web3 } = require("web3"); const NODE_URL = "YOUR_CHAINSTACK_WSS_ENDPOINT"; // Reconnect options const reconnectOptions = { autoReconnect: true, // Automatically attempt to reconnect delay: 5000, // Reconnect after 5 seconds maxAttempts: 10, // Max number of retries }; const web3 = new Web3( new Web3.providers.WebsocketProvider(NODE_URL, undefined, reconnectOptions) ); async function subscribeToNewBlocks() { try { // Create a new subscription to the 'newBlockHeaders' event const event = "newBlockHeaders"; const subscription = await web3.eth.subscribe(event); console.log(`Connected to ${event}, Subscription ID: ${subscription.id}`); // Attach event listeners to the subscription object for 'data' and 'error' subscription.on("data", handleNewBlock); subscription.on("error", handleError); } catch (error) { console.error(`Error subscribing to new blocks: ${error}`); } } // Callback functions to react to the different events // Event listener that logs the received block header data function handleNewBlock(blockHeader) { console.log("New block header:", blockHeader); } // Event listener that logs any errors 
that occur function handleError(error) { console.error("Error when subscribing to new block header:", error); } subscribeToNewBlocks(); ``` Learn more about the [web3.js `newBlockHeaders` subscription](/reference/ethereum-subscribenewblockheaders). ### Real-time data with WebSocket and ethers.js Install the ethers.js and ws libraries: ```shell Shell npm i ethers ws ``` The following is an example of establishing a WebSocket connection using ethers.js, designed to subscribe to new block headers, which also incorporates a mechanism for automatic WebSocket reconnection. ```javascript Javascript const ethers = require("ethers"); const WebSocket = require("ws"); const NODE_URL = "YOUR_CHAINSTACK_WSS_ENDPOINT"; function createWebSocket() { const ws = new WebSocket(NODE_URL); ws.on("close", () => { console.log("Disconnected. Reconnecting..."); setTimeout(() => { provider = new ethers.WebSocketProvider(createWebSocket()); startListening(); }, 3000); }); ws.on("error", (error) => { console.log("WebSocket error: ", error); }); return ws; } let provider = new ethers.WebSocketProvider(createWebSocket()); function startListening() { provider.on("block", async (blockNumber) => { console.log("New block number:", blockNumber); const block = await provider.getBlock(blockNumber); console.log("Block details:", block); }); } startListening(); ``` ### Real-time data with WebSocket and Python Install the websockets library: ```shell Shell pip install websockets ``` The following is an example of establishing a WebSocket connection using Python, designed to subscribe to new block headers, incorporating an automatic WebSocket reconnection mechanism. 
```python Python # Import required libraries import asyncio import json import websockets # Replace with your own Ethereum node WebSocket URL eth_node_ws_url = 'YOUR_CHAINSTACK_WSS_ENDPOINT' async def subscribe_to_blocks(ws_url): # Continuously try to connect and subscribe while True: try: # Establish a WebSocket connection to the Ethereum node async with websockets.connect(ws_url) as websocket: # Send a subscription request for new block headers await websocket.send(json.dumps({ "id": 1, "method": "eth_subscribe", "params": [ "newHeads" ], "jsonrpc": "2.0" })) # Wait for the subscription response and print it subscription_response = await websocket.recv() print(f'Subscription response: {subscription_response}') # Continuously process incoming block headers while True: # Receive a message and parse it as JSON log = await websocket.recv() log_data = json.loads(log) # Print the received data print(f'New log: {log_data}') print("#"*10) # If there's an exception (e.g., connection error), attempt to reconnect except Exception as e: print(f'Error: {e}') print('Reconnecting...') await asyncio.sleep(5) # Execute the subscription function asyncio.run(subscribe_to_blocks(eth_node_ws_url)) ``` ## Conclusion This guide has walked you through leveraging the power of WebSocket over traditional HTTP, utilizing JavaScript and Python, and interacting with the Chainstack API to retrieve real-time data. We also explored how to implement WebSocket reconnect logic. 
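Stripped of library specifics, the reconnect logic in all three examples follows the same shape: open a stream, process messages, and reopen after a pause when the stream drops. A transport-agnostic sketch of that loop in Python, where `connect` and `handle` are hypothetical stand-ins for your client library's calls:

```python
import time


def run_with_reconnect(connect, handle, max_retries=10, delay=5):
    """Keep a streaming connection alive.

    `connect()` opens a stream and yields messages; `handle(message)`
    processes each one. When the stream fails, reopen it after `delay`
    seconds, giving up after `max_retries` consecutive failures.
    `connect` and `handle` are placeholders for a real client library,
    which would raise its own exception types instead of ConnectionError.
    """
    retries = 0
    while retries < max_retries:
        try:
            for message in connect():
                retries = 0  # healthy stream: reset the retry budget
                handle(message)
        except ConnectionError as err:
            retries += 1
            print(f"Disconnected ({err}); retry {retries}/{max_retries}")
            time.sleep(delay)
    raise RuntimeError("Gave up reconnecting")
```

Resetting the retry counter on every successful message distinguishes a flaky-but-working endpoint from one that is down outright.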
# Harmony methods Source: https://docs.chainstack.com/docs/harmony-methods | Method | Availability | Comment | | ----------------------------------------------- | --------------------------------------------- | ------- | | hmy\_getBalanceByBlockNumber | | | | hmy\_getTransactionCount | | | | hmy\_getBalance | | | | hmy\_getFilterLogs | | | | hmy\_newFilter | | | | hmy\_newPendingTransactionFilter | | | | hmy\_newBlockFilter | | | | hmy\_getFilterChanges | | | | hmy\_getLogs | | | | hmy\_getStakingTransactionByBlockHashAndIndex | | | | hmy\_getStakingTransactionByBlockNumberAndIndex | | | | hmy\_getStakingTransactionByHash | | | | hmy\_getCurrentTransactionErrorSink | | | | hmy\_getPendingCrossLinks | | | | hmy\_getPendingCXReceipts | | | | hmy\_getCXReceiptByHash | | | | hmy\_pendingTransactions | | | | hmy\_sendRawStakingTransaction | | | | hmy\_getTransactionsHistory | | | | hmy\_sendRawTransaction | | | | hmy\_getTransactionReceipt | | | | hmy\_getBlockTransactionCountByHash | | | | hmy\_getBlockTransactionCountByNumber | | | | hmy\_getTransactionByHash | | | | hmy\_getTransactionByBlockNumberAndIndex | | | | hmy\_getTransactionByBlockHashAndIndex | | | | hmy\_getBlockByNumber | | | | hmy\_getBlockByHash | | | | hmy\_getBlocks | | | | hmy\_estimateGas | | | | hmy\_getStorageAt | | | | hmy\_call | | | | hmy\_getCode | | | | hmy\_isLastBlock | | | | hmy\_epochLastBlock | | | | hmy\_latestHeader | | | | hmy\_getShardingStructure | | | | hmy\_blockNumber | | | | hmy\_syncing | | | | hmy\_gasPrice | | | | net\_peerCount | | | | hmy\_getEpoch | | | | hmy\_getLeader | | | | hmy\_getCirculatingSupply | | | | hmy\_getTotalSupply | | | | hmy\_getStakingNetworkInfo | | | | hmy\_getAllValidatorInformation | | | | hmy\_getAllValidatorInformationByBlockNumber | | | | hmy\_getCurrentUtilityMetrics | | | | hmy\_getDelegationsByValidator | | | | hmy\_getDelegationsByDelegatorAndValidator | | | | hmy\_getDelegationsByDelegator | | | | hmy\_getValidatorMetrics | | | | 
hmy\_getMedianRawStakeSnapshot | | | | hmy\_getElectedValidatorAddresses | | | | hmy\_getAllValidatorAddresses | | | | hmy\_getCurrentStakingErrorSink | | | | hmy\_getValidatorInformation | | | | hmy\_getValidators | | | | hmy\_getSignedBlocks | | | | hmy\_isBlockSigner | | | | hmy\_getBlockSigners | | | | trace\_block | | | | trace\_transaction | | | # Harmony tooling Source: https://docs.chainstack.com/docs/harmony-tooling ## MetaMask On [node access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**. ## Truffle Configure [Truffle Suite](https://truffleframework.com) to deploy contracts to your Harmony nodes. Install [Truffle Suite](https://truffleframework.com), [HD Wallet-enabled Web3 provider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider), and create a project. Create a new environment in `truffle-config.js`, add your mnemonic phrase generated by [a wallet](https://ethereum.org/en/wallets/find-wallet/) and the Harmony endpoint instead of YOUR\_CHAINSTACK\_ENDPOINT: ```javascript Javascript const HDWalletProvider = require("@truffle/hdwallet-provider"); const mnemonic = 'pattern enroll upgrade ...'; ... module.exports = { networks: { chainstack: { provider: () => new HDWalletProvider(mnemonic, "YOUR_CHAINSTACK_ENDPOINT"), network_id: "*" }, } }; ``` ## Hardhat Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your Harmony nodes. Install [Hardhat](https://hardhat.org/) and create a project. Create a new environment in `hardhat.config.js`: ```javascript Javascript require("@nomiclabs/hardhat-waffle"); ... module.exports = { solidity: "0.7.3", networks: { chainstack: { url: "YOUR_CHAINSTACK_ENDPOINT", accounts: ["YOUR_PRIVATE_KEY"] }, } }; ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). 
* YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract Run `npx hardhat run scripts/deploy.js --network chainstack` and Hardhat will deploy using Chainstack. See also [Forking EVM-compatible mainnet with Hardhat](https://support.chainstack.com/hc/en-us/articles/900004242406). ## Remix IDE To make Remix IDE interact with the network through a Chainstack node: Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Interacting through MetaMask](#metamask). In Remix IDE, navigate to the **Deploy** tab. Select **Injected Provider - MetaMask** in **Environment**. This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node. ## web3.js Build DApps using [web3.js](https://github.com/ethereum/web3.js/) and Harmony nodes deployed with Chainstack. Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP or WebSocket. ### HTTP Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. ### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node WSS endpoint protected either with the key or password. ## web3.py Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Harmony nodes deployed with Chainstack. Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP or WebSocket. 
See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `HTTPProvider` to connect to your node endpoint and get the latest block number. ```python Key Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.blockNumber) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.blockNumber) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * HOSTNAME — your node HTTPS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). ### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number. ```python Key Protected from web3 import Web3 web3 = Web3(Web3.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.blockNumber) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.WebsocketProvider('wss://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.blockNumber) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password * HOSTNAME — your node WSS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [WebSocket connection to an EVM node](https://support.chainstack.com/hc/en-us/articles/900001918763-WebSocket-connection-to-an-Ethereum-node). ## web3.php Build DApps using [web3.php](https://github.com/web3p/web3.php) and Harmony nodes deployed with Chainstack. 
Install [web3.php](https://github.com/web3p/web3.php). Connect over HTTP: ```php Php <?php require_once "vendor/autoload.php"; use Web3\Web3; $web3 = new Web3("YOUR_CHAINSTACK_ENDPOINT"); ?> ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. Use [JSON-RPC methods](https://eth.wiki/json-rpc/API) to interact with the node. Example to get the latest block number: ```php Php <?php require_once "vendor/autoload.php"; use Web3\Web3; $web3 = new Web3("YOUR_CHAINSTACK_ENDPOINT"); $eth = $web3->eth; $eth->blockNumber(function ($err, $data) { print "$data \n"; }); ?> ``` ## web3j Build DApps using [web3j](https://github.com/web3j/web3j) and Harmony nodes deployed with Chainstack. Use the `HttpService` object to connect to your node endpoint. Example to get the latest block number: ```java Java package getLatestBlock; import java.io.IOException; import java.util.logging.Level; import java.util.logging.Logger; import org.web3j.protocol.Web3j; import org.web3j.protocol.core.DefaultBlockParameterName; import org.web3j.protocol.core.methods.response.EthBlock; import org.web3j.protocol.exceptions.ClientConnectionException; import org.web3j.protocol.http.HttpService; import okhttp3.Authenticator; import okhttp3.Credentials; import okhttp3.OkHttpClient; import okhttp3.Request; import okhttp3.Response; import okhttp3.Route; public final class App { private static final String USERNAME = "USERNAME"; private static final String PASSWORD = "PASSWORD"; private static final String ENDPOINT = "ENDPOINT"; public static void main(String[] args) { try { OkHttpClient.Builder clientBuilder = new OkHttpClient.Builder(); clientBuilder.authenticator(new Authenticator() { @Override public Request authenticate(Route route, Response response) throws IOException { String credential = Credentials.basic(USERNAME, PASSWORD); return response.request().newBuilder().header("Authorization", credential).build(); } }); HttpService service = new HttpService(ENDPOINT, clientBuilder.build(), false); Web3j web3 = Web3j.build(service); EthBlock.Block latestBlock = web3.ethGetBlockByNumber(DefaultBlockParameterName.LATEST, false).send().getBlock(); System.out.println("Latest 
Block: #" + latestBlock.getNumber()); } catch (IOException | ClientConnectionException ex) { Logger.getLogger(App.class.getName()).log(Level.SEVERE, null, ex); } } } ``` where * ENDPOINT — your node HTTPS endpoint * USERNAME — your node access username * PASSWORD — your node access password See also [the full code on GitHub](https://github.com/chainstack/web3j-getLatestBlock). ## ethers.js Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Harmony nodes deployed with Chainstack. Install [ethers.js](https://www.npmjs.com/package/ethers). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number: ```javascript Key Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` ```javascript Password Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT', user: 'USERNAME', password: 'PASSWORD' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) * NETWORK\_ID — Harmony network ID: * Mainnet Shard 0: `1666600000` * Devnet Shard 0: `1666900000` See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). 
### WebSocket Use the `WebSocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const { ethers } = require("ethers"); const provider = new ethers.providers.WebSocketProvider('YOUR_CHAINSTACK_ENDPOINT', NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password * NETWORK\_ID — Harmony network ID: * Mainnet Shard 0: `1666600000` * Devnet Shard 0: `1666900000` See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). ## Brownie Install [Brownie](https://eth-brownie.readthedocs.io/en/stable/install.html). Use the `brownie networks add` command with the node endpoint: ```shell Shell brownie networks add Harmony ID name="NETWORK_NAME" host=YOUR_CHAINSTACK_ENDPOINT chainid=NETWORK_ID ``` where * ID — any name that you will use as the network tag to run a deployment. For example, `chainstack-mainnet`. * NETWORK\_NAME — any name that you want to identify the network by in the list of networks. For example, **Mainnet (Chainstack)**. * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password * NETWORK\_ID — Harmony network ID: * Mainnet Shard 0: `1666600000` * Devnet Shard 0: `1666900000` Example to run the deployment script: ```shell Shell brownie run deploy.py --network chainstack-mainnet ``` ## Foundry Install [Foundry](https://github.com/gakonst/foundry/). Use `--rpc-url` to run the operation through your Chainstack node. ### Forge Use `forge` to develop, test, and deploy your smart contracts.
To deploy a contract: ```shell Shell forge create CONTRACT_NAME --contracts CONTRACT_PATH --private-key YOUR_PRIVATE_KEY --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where * CONTRACT\_NAME — name of the contract in the Solidity source code * CONTRACT\_PATH — path to your smart contract * YOUR\_PRIVATE\_KEY — the private key to your funded account that you will use to deploy the contract * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password ### Cast Use `cast` to interact with the network and the deployed contracts. To get the latest block number: ```shell Shell cast block-number --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password # Harmony: A simple metaverse contract with Foundry Source: https://docs.chainstack.com/docs/harmony-tutorial-a-simple-metaverse-contract-with-foundry **TLDR** * This tutorial showcases how to create an ERC-721 “Polyland” contract on Harmony devnet that treats each land patch as an NFT. * Polyland can hold a finite set of triangular patches, and the owner (contract deployer) can mint them to different accounts until the supply is capped. * Foundry is used to compile and deploy the contract, then the Harmony explorer is used to verify it and distribute patches. * Once deployed, each patch’s owner and properties (triangle edges) can be queried on-chain, illustrating the fundamentals of a metaverse asset contract. ## Main article The blockchain part of any metaverse that involves decentralization is object ownership. On Harmony, object ownership can be realized with the HRC-721 token standard, commonly known as NFT. A very basic example of realizing a metaverse on blockchain is creating a plot of land and distributing the patches of it to different owners—all through an NFT contract. In this tutorial, you will: * Create a simple contract called Polyland. The Polyland contract represents a plot of land. 
* Program Polyland to consist of patches of land. * Deploy Polyland on the Harmony devnet through a node deployed with Chainstack. * Distribute patches of Polyland to different accounts on the Harmony devnet. ## Prerequisites * [Chainstack account](https://console.chainstack.com/) to deploy a Harmony node. * [Foundry](https://github.com/foundry-rs/foundry) to create and deploy contracts. ## Overview To get from zero to a deployed metaverse contract and patches of land distributed on the Harmony devnet, do the following: 1. With Chainstack, create a public chain project. 2. With Chainstack, join the Harmony devnet. 3. With Chainstack, access your Harmony node credentials. 4. With OpenZeppelin, create an HRC-721 contract. 5. With Foundry, flatten, compile, and deploy the contract through your Harmony node. 6. Verify the contract on the Harmony explorer. 7. Using the Harmony explorer as a web app, distribute the patches of land to accounts. ## Step-by-step ### Create a public chain project See [Create a project](/docs/manage-your-project#create-a-project). ### Join the Harmony devnet See [Join a public network](/docs/manage-your-networks#join-a-public-network). ### Get your Harmony node access and credentials See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials). ### Install Foundry See [Foundry](https://github.com/gakonst/foundry). ### Create the contract Initialize your project with Foundry: ```bash Shell forge init polyland ``` This will create the project directory `polyland` and initialize it. Go to the `polyland/src/` directory. In the directory, create your metaverse contract: `polyland.sol`.
```solidity solidity // SPDX-License-Identifier: MIT pragma solidity ^0.8; import "@openzeppelin/contracts/token/ERC721/ERC721.sol"; import "@openzeppelin/contracts/utils/Counters.sol"; import "@openzeppelin/contracts/access/Ownable.sol"; contract Polyland is ERC721, Ownable { using Counters for Counters.Counter; Counters.Counter private supply; uint256 public maxSupply = 4; struct Triangle { string name; int8 edge1; int8 edge2; int8 edge3; } Triangle[] public triangles; constructor() ERC721("Polyland", "PLLND") { triangles.push(Triangle("Triangle0", 0,0,0)); triangles.push(Triangle("Triangle1", 1,1,1)); triangles.push(Triangle("Triangle2", 2,2,2)); triangles.push(Triangle("Triangle3", 3,3,3)); triangles.push(Triangle("Triangle4", 4,4,4)); } modifier supplyCap { require(supply.current() < maxSupply, "All patches minted."); _; } function totalSupply() public view returns (uint256) { return supply.current(); } function getTriangles() public view returns (Triangle[] memory) { return triangles; } function mintTriangle(address account) public onlyOwner supplyCap returns (uint256) { supply.increment(); uint256 newPatchId = supply.current(); _mint(account, newPatchId); return newPatchId; } } ``` The contract implementation is the following: * The contract uses the OpenZeppelin audited [ERC-721 contract templates](https://docs.openzeppelin.com/contracts/4.x/erc721). * The contract consists of a `Triangle` object with three `edge` properties. The triangle is a patch of land that has three edges. * The `constructor` function sets the contract up with four triangles. Since this is an array and starts with 0, while the ID of the minted patch starts with 1, the first element is set to `Triangle0`. `Triangle0` is the default first element that will not represent a patch of land in the Polyland metaverse. * Through the `maxSupply` variable and the `supplyCap` modifier (note the strict `<` check, so minting stops once four patches exist), the number of patches available to mint is capped at `4`.
* Only the address that deploys the contract can mint the patches of land. Thus, the contract represents a plot of land called Polyland that consists of four triangular patches of land. ### Set up OpenZeppelin with Foundry Install OpenZeppelin with Foundry: ```bash Shell forge install openzeppelin/openzeppelin-contracts ``` In the project directory, create a `remappings.txt` file with the following contents: ```js JavaScript @openzeppelin/=lib/openzeppelin-contracts/ ``` ### Flatten the contract Flatten the contract to make it easier to verify on the [Harmony Testnet explorer](https://explorer.testnet.harmony.one). Run: ```bash Shell forge flatten polyland.sol > polylandFlat.sol ``` ### Deploy the contract Run: ```bash Shell forge create Polyland --contracts /root/polyland/src/polylandFlat.sol --private-key YOUR_PRIVATE_KEY --rpc-url YOUR_CHAINSTACK_ENDPOINT --legacy ``` where * `Polyland` — the name of the contract as provided in the contract code `contract Polyland is ERC721, Ownable` * `/root/polyland/src/polylandFlat.sol` — full path to the flattened contract * YOUR\_PRIVATE\_KEY — the private key to the account that deploys the contract. Must be used without the `0x` prefix. Fund the account with devnet ONE using the [testnet faucet](https://faucet.pops.one/). * YOUR\_CHAINSTACK\_ENDPOINT — your Harmony node HTTPS endpoint deployed with Chainstack. See also [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials) and [Harmony tooling](/docs/harmony-tooling). * `--legacy` — the Foundry flag to work with the EVM-based networks that are not [EIP-1559](https://github.com/ethereum/EIPs/blob/master/EIPS/eip-1559.md) activated. Once the contract deploys, note the `solc` version and the `Deployed to` value in the output. ### Verify the contract Open the [Harmony Testnet explorer](https://explorer.testnet.harmony.one). Put in the contract address. Click the **Contract** tab. Click **Verify and Publish**. In the `Contract Name` field, put `Polyland`.
Set `Chain Type` to `devnet`. In `Compiler`, provide the `solc` version that the contract compiled with. In `Optimizer`, set `Yes`, `200`. Contract bytecode optimization with 200 runs is the default Foundry setting. Paste the entirety of the flattened contract in the contract field and hit **Submit**. This will verify the contract. You can now use the explorer as a web app to interact with the contract. ### Distribute the patches of land 1. On the contract page in the [Harmony Testnet explorer](https://explorer.testnet.harmony.one), click **Write Contract**. 2. In `mintTriangle`, provide an address to distribute a patch of land to. Distribute the patches to different addresses until you hit the cap with the `All patches minted` message. 3. On the **Read Contract** tab, query the `ownerOf` field by putting in the `tokenId` values representing each of the patches of land: `1`, `2`, `3`, `4`. 4. In the `triangles` field, put in the same `tokenId` values to get the data on each of the patches: name and the size of each of the three edges. ## Conclusion This tutorial guided you through the basics of creating and deploying a metaverse contract with object ownership representation. You created your own plot of land, distributed the finite number of land patches to different owners, and retrieved the data for each of the patches: patch size and patch owner. You did all of it using [Foundry](https://chainstack.com/foundry-a-fast-solidity-contract-development-toolkit/). This tutorial uses the devnet; however, the exact same instructions and sequence work on the mainnet.
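As a bonus, the ownership reads above can also be done from code rather than the explorer UI. The sketch below is hypothetical and not part of the original tutorial: it hand-encodes the standard ERC-721 `ownerOf(uint256)` call (which the OpenZeppelin template exposes) and sends it over raw JSON-RPC with Node 18+ `fetch`; the endpoint and contract address are placeholders.

```javascript
// Build eth_call data for ownerOf(uint256): 4-byte selector + 32-byte token ID
function ownerOfCalldata(tokenId) {
  const selector = '6352211e'; // first 4 bytes of keccak256("ownerOf(uint256)")
  const arg = tokenId.toString(16).padStart(64, '0');
  return '0x' + selector + arg;
}

// Query the owner of one patch through any JSON-RPC endpoint (Node 18+ fetch)
async function ownerOf(endpoint, contract, tokenId) {
  const res = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      jsonrpc: '2.0',
      id: 1,
      method: 'eth_call',
      params: [{ to: contract, data: ownerOfCalldata(tokenId) }, 'latest'],
    }),
  });
  const { result } = await res.json();
  // The address is the last 20 bytes of the 32-byte return word
  return '0x' + result.slice(-40);
}

// Usage: patch IDs 1 through 4, per the maxSupply cap
// for (const id of [1, 2, 3, 4]) {
//   console.log(id, await ownerOf('YOUR_CHAINSTACK_ENDPOINT', 'CONTRACT_ADDRESS', id));
// }
```

The same pattern works for any view function once you know its 4-byte selector; for non-standard functions such as `mintTriangle`, you would compute the selector with a keccak-256 library instead.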
### About the author Developer Advocate @ Chainstack Talk to me about all things Web3 20 years in technology | 8+ years in Web3 full time Trusted advisor helping developers navigate the complexities of blockchain infrastructure [](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake) # Harnessing Chainlink Oracles with Chainstack: Fetching real-time crypto prices from Ethereum Source: https://docs.chainstack.com/docs/harnessing-chainlink-oracles-with-chainstack-fetching-real-time-crypto-prices-from-ethereum **TLDR** * Set up a node.js script using web3.js to connect to Ethereum via Chainstack endpoints. * Fetch real-time crypto prices through Chainlink’s AggregatorV3Interface contracts, removing reliance on external APIs. * Conversion involves retrieving integer values from the contract and formatting them into human-readable outputs. * Call prices at intervals to keep data updated without single-point-of-failure risks. ## Main article In today's digital age, accessing real-time and reliable cryptocurrency data is crucial for many applications. While many resort to third-party APIs, these can sometimes introduce risks, such as downtime, potential manipulation, or sudden discontinuation of services. Chainlink, a decentralized oracle network, solves this dilemma by serving as a dependable bridge between on-chain and off-chain data. Paired with the robust infrastructure of an Ethereum node hosted by Chainstack, we can further enhance the reliability of our data source. One advantage of this approach is using decentralized oracle networks, which largely eliminates the risk of relying on single-point-of-failure third-party APIs. This ensures uninterrupted access to critical crypto pricing data and fosters a decentralized ethos in our applications.
In this guide, we'll demonstrate how to harness the power of Chainlink oracles by using a Chainstack Ethereum node to fetch real-time crypto prices from the network, giving you a resilient and dependable data source. ## Setting the scene For our purpose, we'll use node.js alongside the web3.js library, allowing us to communicate with the Ethereum blockchain. We aim to fetch prices for five cryptocurrencies: BTC, ETH, LINK, BNB, and LTC against USD. Check out [Web3 node.js: From zero to a full-fledged project](/docs/web3-nodejs-from-zero-to-a-full-fledged-project) to learn how to set up a node.js project. ### Prerequisites * [node.js](https://nodejs.org/en/download) runtime * web3.js library * Contract addresses of pairs you’re interested in * A Chainstack Ethereum node ### Get an Ethereum node Follow these steps to deploy an Ethereum node: 1. [Sign up with Chainstack](https://console.chainstack.com/user/account/create). 2. [Deploy a node](/docs/manage-your-networks#join-a-public-network). 3. [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials).
### Install web3.js Create a new project in a directory and open a terminal, then run: ```shell Shell npm i web3 ``` ### The code In the same directory, create a new file named `index.js` and paste the following code: ```javascript index.js const { Web3 } = require("web3"); const web3 = new Web3( "YOUR_CHAINSTACK_ENDPOINT" ); // Object with the smart contracts for the pairs const pairs = { "BTC / USD": "0xF4030086522a5bEEa4988F8cA5B36dbC97BeE88c", "ETH / USD": "0x5f4eC3Df9cbd43714FE2740f5E3616155c5b8419", "LINK / USD": "0x2c1d072e956AFFC0D435Cb7AC38EF18d24d9127c", "BNB / USD": "0x14e613AC84a31f709eadbdF89C6CC390fDc9540A", "LTC / USD": "0x6AF09DF7563C363B5763b9102712EbeD3b9e859B", }; // aggregatorV3Interface ABI const aggregatorV3InterfaceABI = [ { inputs: [], name: "decimals", outputs: [{ internalType: "uint8", name: "", type: "uint8" }], stateMutability: "view", type: "function", }, { inputs: [], name: "description", outputs: [{ internalType: "string", name: "", type: "string" }], stateMutability: "view", type: "function", }, { inputs: [{ internalType: "uint80", name: "_roundId", type: "uint80" }], name: "getRoundData", outputs: [ { internalType: "uint80", name: "roundId", type: "uint80" }, { internalType: "int256", name: "answer", type: "int256" }, { internalType: "uint256", name: "startedAt", type: "uint256" }, { internalType: "uint256", name: "updatedAt", type: "uint256" }, { internalType: "uint80", name: "answeredInRound", type: "uint80" }, ], stateMutability: "view", type: "function", }, { inputs: [], name: "latestRoundData", outputs: [ { internalType: "uint80", name: "roundId", type: "uint80" }, { internalType: "int256", name: "answer", type: "int256" }, { internalType: "uint256", name: "startedAt", type: "uint256" }, { internalType: "uint256", name: "updatedAt", type: "uint256" }, { internalType: "uint80", name: "answeredInRound", type: "uint80" }, ], stateMutability: "view", type: "function", }, { inputs: [], name: "version", outputs: [{ internalType: 
"uint256", name: "", type: "uint256" }], stateMutability: "view", type: "function", }, ]; let conversionRate = {}; async function fetchPrices() { try { for (let pair in pairs) { // Smart contract instance const priceFeed = new web3.eth.Contract( aggregatorV3InterfaceABI, pairs[pair] ); // use eth_call const roundData = await priceFeed.methods.latestRoundData().call(); // Chainlink returns price data with 8 decimal places for accuracy. // We divide by 1e8 to convert it to a human-readable format. const price = Number(roundData.answer) / 1e8; conversionRate[pair] = price.toFixed(2); } console.log("Prices fetched:", conversionRate); } catch (error) { console.error("Error fetching prices:", error); } } // Fetch prices initially and then at regular intervals fetchPrices(); setInterval(fetchPrices, 60 * 1000); // Fetch every minute ``` Add your Chainstack Ethereum endpoint to `web3`, save and run `node index` in the terminal; the console will display an object with the most up-to-date prices. ```shell Shell Prices fetched: { 'BTC / USD': '29385.26', 'ETH / USD': '1848.97', 'LINK / USD': '7.51', 'BNB / USD': '240.95', 'LTC / USD': '83.65' } ``` ### Understanding the code The script starts by importing the `web3` library and creating an instance connected to an Ethereum node; the bulk of the logic is as follows: 1. **Specify the Chainlink price feed contract addresses**. Chainlink's decentralized price feeds are represented through smart contracts. Each price feed for a specific cryptocurrency pair is associated with a unique contract address. In our script, these addresses are stored in a dictionary named `pairs`, where the key is the name of the cryptocurrency pair. Refer to Chainlink's [Price Feed Contract Addresses](https://docs.chain.link/data-feeds/price-feeds/addresses) page for additional price feed contract addresses. 2. **The aggregatorV3Interface ABI**. We add the `aggregatorV3Interface` ABI required to interact with the smart contract. The ABI outlines the functions in the smart contract so the library knows how to use them. 3. **Interact with Chainlink price feeds**. Chainlink price feeds are implemented using the AggregatorV3Interface, which provides several functions for data interaction. Among these functions, `latestRoundData` is the one we use to fetch the latest price information. For a complete reference to the functions and capabilities of the AggregatorV3Interface, you can consult Chainlink's [API Reference](https://docs.chain.link/data-feeds/api-reference) page. 4. **Fetching the prices**. In our `fetchPrices` function, we iterate through the specified pairs, and for each pair: * Create an instance of the Chainlink price feed contract using the web3.js library. * Call the `latestRoundData` function to retrieve the most recent price data. * Convert the obtained price to a readable format (divide by `1e8` and round to two decimal places) and store it in our `conversionRate` dictionary. 5. **Continual data refresh**. To ensure we continually have up-to-date price information, we call our `fetchPrices` function every minute using JavaScript's `setInterval` function. ## Conclusion Following the above steps, you've successfully built a robust and lightweight script to fetch real-time cryptocurrency prices directly from the Ethereum blockchain, eliminating the need for third-party APIs or services.
With this approach, you ensure data accuracy and gain unparalleled control over how you fetch and manage this data. Happy coding and trading! ### About the author Technical Support Engineer @ Chainstack JUST BUIDL IT! [](https://github.com/0x6564) [](https://twitter.com/edeenn22) [](https://www.linkedin.com/in/edindr/) # How to mint a generative music NFT with Chainstack IPFS storage and Soundraw Source: https://docs.chainstack.com/docs/how-to-mint-generative-music-nft-with-chainstack-ipfs-storage-soundraw **TLDR** * Leverage a multi-layer generation script to create, merge, and store music, icons, text, and shapes as NFTs. * Use Soundraw to generate unique audio clips, then merge them with dynamic images and pinned metadata via Chainstack IPFS. * Finally, mint your NFTs by loading metadata from IPFS, estimating gas, and signing transactions to deploy them on-chain. * This approach showcases a full end-to-end pipeline for producing generative music NFTs, from creation to minting. ## Main article This tutorial builds upon the foundations outlined in the [How to mint a music NFT: Dropping fire tunes with Chainstack IPFS Storage](/docs/how-to-mint-music-nft-with-chainstack-ipfs-storage) and the [How to create generative NFTs: Making abstract art with code](https://chainstack.com/procedurally-generated-nfts/) tutorials. It is strongly recommended that you have completed the tutorials before you proceed, or at the very least have the [music NFT tutorial repo](https://github.com/chainstacklabs/music-nft-minter-tutorial-repo) cloned, so you can use the code base as a starting point. ## Step 1: Process dependencies and initialize parameters Much like the other scripts from the [How to mint a music NFT: Dropping fire tunes with Chainstack IPFS Storage](/docs/how-to-mint-music-nft-with-chainstack-ipfs-storage) tutorial, you will need Hardhat with the web3.js plugin installed, Axios to submit HTTP requests, as well as Dotenv to store and access your DApp secrets. 
```shell Shell npm i @nomiclabs/hardhat-web3 axios dotenv ``` Apart from these, however, you will also need an array of different libraries that you will use to make various transformations, generations, and requests. These are: * **Random words** — generate a set of random words based on the desired number of words. * **Text-to-image** — render a given string to an image and export it as an image file. * **Jdenticon** — generate a random icon based on size and seed, then export it as an image file. * **Canvas** — generate 2D graphics via the HTML `<canvas>` element. * **Image data URI** — decode and encode data URI images. * **Merge images** — merge several images into a single image as its layers. ```shell Shell npm i random-words@1.3.0 text-to-image jdenticon canvas image-data-uri merge-images ``` Make sure to install the `1.3.0` version of the `random-words` library, as `2.0.0` introduces an array of changes that make it less accessible in a node.js workflow. Once you’re done installing the packages, create a new `generate.js` file in your `./scripts` directory, and process all dependencies like so: ```javascript generate.js // Process dependencies require('dotenv').config(); require("@nomiclabs/hardhat-web3"); const fs = require('fs'); const fsPromises = fs.promises; const path = require('path'); const https = require('https'); const randomWords = require('random-words'); // 1.3.0 const textToImage = require('text-to-image'); const jdenticon = require('jdenticon'); const { Canvas, Image } = require('canvas'); const ImageDataURI = require('image-data-uri'); const mergeImages = require('merge-images'); const FormData = require('form-data'); const axios = require('axios'); ``` ### Setting up your environment You will also need access to the [Soundraw API](https://soundraw.io/) in order to generate audio files. Soundraw is an AI-powered platform that allows you to create unique, royalty-free music tailored to your needs.
You can generate unlimited songs by simply selecting the mood, genre, and length of the track, and the AI will create the music. The platform enables you to customize the audio files you generate to fit specific requirements, for instance, adjusting the length of an intro or the position of a chorus. You can use the generated music freely without worrying about copyright strikes, making it perfect for YouTube videos, social media, commercials, podcasts, games, apps, and even NFTs. Once you have access to the Soundraw API and an API token, hop on to your `.env` file and add it to the list like so: ```jsx .env # development SEPOLIA="https://your-sepolia-node-endpoint-here.com" MAINNET="https://your-ethereum-mainnet-node-endpoint-here.com" CHAINSTACK="Bearer your.ChainstackAPIkeyHere" PRIVATE_KEY="1337y0urWalletPrivateKeyHere1337" WALLET="0xY0urWalletAddressHere1337" SEPOLIA_CONTRACT="0xY0urSepoliaNFTSmartContractAddressHere" MAINNET_CONTRACT="Y0urEthereumMainnetNFTSmartContractAddressHere" BUCKET_ID="BUCK-1337-8008-1337" FOLDER_ID="FOLD-1337-8008-1337" ETHERSCAN="Y0URETHERSCANAPIKEYHERE" SOUNDRAW="Y0urSoundrawAPIKeyHEre==" ``` Then, return to your `generate.js` file and load the required environment variables after the dependency list: ```jsx generate.js // Load environment variables const address = process.env.WALLET; const soundraw = process.env.SOUNDRAW; ``` ### Initializing generation parameters Next, create a list of variables that will cover all generation parameters. You will need variables for the random hex, the generation ID, the combined random hex output with your wallet address converted to a number string, the digital root of that number string, the generated random word output, the font and background color hex values, and the generated shape parameters: the number of sides, size, center X and Y positions, and the stroke and fill hex color values.
Here’s how: ```javascript generate.js // Initialize generation parameters let randomHex; let randomStr; let wordNrs; let digiRoot; let wordsOut = ''; let colorHex = '#'; let bgColorHex = '#'; let shapeSides = ''; let shapeSize = ''; let shapeCtrX = ''; let shapeCtrY = ''; let shapeStroke = '#'; let shapeFill = '#'; let idHex = ''; ``` Furthermore, you will also need to do the same to initialize the Soundraw API parameters (mood, genre, theme, length, file format, tempo, and energy levels), but this time as constants, since you won’t be changing these at any point during the tutorial. For this tutorial, you can exclude the `Muted` energy level, as it creates an empty audio segment that won’t be necessary for the use case. Use a sample length of 77 seconds and a default file format of `mp3`. ```javascript generate.js // Soundraw parameters const moods = ["Angry", "Busy & Frantic", "Dark", "Dreamy", "Elegant", "Epic", "Euphoric", "Fear", "Funny & Weird", "Glamorous", "Happy", "Heavy & Ponderous", "Hopeful", "Laid back", "Mysterious", "Peaceful", "Restless", "Romantic", "Running", "Sad", "Scary", "Sentimental", "Sexy", "Smooth", "Suspense"]; const genres = ["Acoustic", "Hip Hop", "Beats", "Funk", "Pop", "Drum n Bass", "Trap", "Tokyo night pop", "Rock", "Latin", "House", "Tropical House", "Ambient", "Orchestra", "Electro & Dance", "Electronica", "Techno & Trance"]; const themes = ["Ads & Trailers", "Broadcasting", "Cinematic", "Corporate", "Comedy", "Cooking", "Documentary", "Drama", "Fashion & Beauty", "Gaming", "Holiday season", "Horror & Thriller", "Motivational & Inspiring", "Nature", "Photography", "Sports & Action", "Technology", "Travel", "Tutorials", "Vlogs", "Wedding & Romance", "Workout & Wellness"]; const length = 77 const fileFormat = "mp3"; const tempo = ["low", "normal", "high"]; const energyLevels = ["Low", "Medium", "High", "Very High"]; ``` With that taken care of, your `generate.js` file in the `/scripts/` directory should look like this: ```jsx
generate.js // Process dependencies require('dotenv').config(); require("@nomiclabs/hardhat-web3"); const fs = require('fs'); const fsPromises = fs.promises; const path = require('path'); const https = require('https'); const randomWords = require('random-words'); // 1.3.0 const textToImage = require('text-to-image'); const jdenticon = require('jdenticon'); const { Canvas, Image } = require('canvas'); const ImageDataURI = require('image-data-uri'); const mergeImages = require('merge-images'); const axios = require('axios'); // Load environment variables const address = process.env.WALLET; const soundraw = process.env.SOUNDRAW; // Initialize generation parameters let randomHex; let randomStr; let wordNrs; let digiRoot; let wordsOut = ''; let colorHex = '#'; let bgColorHex = '#'; let shapeSides = ''; let shapeSize = ''; let shapeCtrX = ''; let shapeCtrY = ''; let shapeStroke = '#'; let shapeFill = '#'; let idHex = ''; // Soundraw parameters const moods = ["Angry", "Busy & Frantic", "Dark", "Dreamy", "Elegant", "Epic", "Euphoric", "Fear", "Funny & Weird", "Glamorous", "Happy", "Heavy & Ponderous", "Hopeful", "Laid back", "Mysterious", "Peaceful", "Restless", "Romantic", "Running", "Sad", "Scary", "Sentimental", "Sexy", "Smooth", "Suspense"]; const genres = ["Acoustic", "Hip Hop", "Beats", "Funk", "Pop", "Drum n Bass", "Trap", "Tokyo night pop", "Rock", "Latin", "House", "Tropical House", "Ambient", "Orchestra", "Electro & Dance", "Electronica", "Techno & Trance"]; const themes = ["Ads & Trailers", "Broadcasting", "Cinematic", "Corporate", "Comedy", "Cooking", "Documentary", "Drama", "Fashion & Beauty", "Gaming", "Holiday season", "Horror & Thriller", "Motivational & Inspiring", "Nature", "Photography", "Sports & Action", "Technology", "Travel", "Tutorials", "Vlogs", "Wedding & Romance", "Workout & Wellness"]; const length = 77 const fileFormat = "mp3"; const tempo = ["low", "normal", "high"]; const energyLevels = ["Low", "Medium", "High", "Very High"]; ``` ## Step 2: 
Set up the generative process Once you've prepared the dependencies and set up the generation parameters, the next step is to create a seed. A seed is typically a random set of characters in a string that is used to feed various generation algorithms or libraries, but it can technically take any form. You can create a seed to kickstart your generative process by using the `randomHex()` method in the `utils` section of web3.js. So, go ahead and create a new `generator` asynchronous function as a constant and set the `randomHex` variable’s value to the `randomHex()` method with `20` as its first and only parameter. This will generate a random hex string 20 bytes in length, starting with `0x`. Then, to mix your wallet address into the seed, add `.concat(address.slice(2))` at the end of the method call; `address.slice(2)` appends the address with its own `0x` prefix trimmed: ```jsx generate.js const generator = async() => { // Random generator layer 0: Seed preparations console.log('\nSeed generation started...\n'); // Generate random hex with 20 bytes for symbols (same as wallet addresses) randomHex = web3.utils.randomHex(20).concat(address.slice(2)); console.log('Random hex generated: ' + randomHex + '\n'); } ``` Next, it’s time to create and store a basic ID for each of your generations. You can do this by taking the `randomHex` seed value you just generated, then using the first and last three characters of the hex string to form the IDs: ```jsx generate.js // Generate ids for filenames to organize easier idHex = randomHex.slice(2, 5).concat(randomHex.slice(79, 82)) console.log('Used hex to generate ID: ' + idHex + '\n'); ``` In similar fashion, you can generate hex color values to feed the image generators further in your code. Let’s use a higher degree of randomization here by creating a loop that will run six times to form each of the six characters in a hex color value.
You can use the same loop for both font and background colors: ```jsx generate.js // Generate random hex color value by picking random characters from the generated hex string for (var i = 0; i < 6; i++) { colorHex = colorHex.concat(randomHex.slice(2).charAt(Math.floor(Math.random() * randomHex.slice(2).length))); bgColorHex = bgColorHex.concat(randomHex.slice(2).charAt(Math.floor(Math.random() * randomHex.slice(2).length))); } console.log('Used hex to generate text color: ' + colorHex + ' & background color: ' + bgColorHex + '\n'); ``` Once you’re done with the loop, you can go ahead and convert the random hex seed to a number string, so you can use it in another form to feed some of the generator libraries. To do this, use the web3.js `hexToNumberString()` method like so: ```jsx generate.js // Generate new string by combining the random hex output with wallet address and converting it to number string wordNrs = web3.utils.hexToNumberString(randomHex); console.log('Transformed hex into number string: ' + wordNrs + '\n'); ``` You can then use this number string for the Soundraw API audio generation, for which you will need to create a loop. But before you start with the loop, make sure you have defined a set of temporary variables that will be needed for it. The first of these is the `categories` array, which is simply a collection of all Soundraw parameter categories, namely `moods`, `genres`, `themes`, and `tempo`. The same applies to `categoryNames`. Then, there is also the `numberOfTimeframes`, which indicates the number of segments you want to have in the generated audio, for which you can set individual `energy_levels`. Last, initialize a `requestPayload` variable as an empty object; you will use it to store the contents of your Soundraw API requests.
```jsx generate.js // Select Soundraw parameters based on the wordNrs number string let categories = [moods, genres, themes, tempo]; let categoryNames = ['Mood', 'Genre', 'Theme', 'Tempo']; let numberOfTimeframes = 3; let requestPayload = {}; ``` ### Creating the audio generation layer Once ready, proceed by creating the loop that will run four consecutive times. Inside it, create a new temporary array `randomIndices` and initialize it as empty. Then, create another `for` loop that will iterate three times. Within it, set the `randomIndices[index]` value to the integer (`parseInt()`) concatenation of `0` and a random character (digit) from the `wordNrs` string using the `charAt()` method. As the parameter of `charAt()`, use `Math.floor()` to round the output down to the nearest whole number, since array indices are integers. Inside `Math.floor()`, add the `Math.random()` method, which generates a random floating-point number between 0 and 1, and multiply this by the length of the `wordNrs` string. Next, sum the values of the `randomIndices` array with `reduce((a, b) => a + b, 0)` and check if the resulting `randomIndex` value is greater than or equal to the length of the given category, or less than `0`. If it is, draw a fresh single-digit index and check again; if that one is out of range too, fall back to `0`. This will give you a valid index, based on the `wordNrs` seed, that can match the entire possible range of each of the `categories`, in turn picking a value for it. Apply this random index to a new temporary `categorySelected` variable by setting its value to `categories[i][randomIndex]`. After that, create an `if-else` statement: if `categoryNames[i]` is different from `'Tempo'`, set the `requestPayload[categoryNames[i].toLowerCase()]` value to the `categorySelected` variable you just set.
Then, for the `else` segment of your statement, set the value of the `requestPayload.tempo` once again to `categorySelected`. By doing this, you will select a value for each of the parameter `categories` from its valid array values, while minding the formatting of each without breaking anything. ```jsx generate.js // Create a loop to generate a random index for the current category for (let i = 0; i < 4; i++) { // Create an array that will hold the randomIndex value for each iteration of the following loop let randomIndices = [] // Iterate loop three times to reach all possible options with double-digit values for (let index = 0; index < 3; index++) { randomIndices[index] = parseInt(0 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))) } // Sum the results from each iteration and make sure they match the category length let randomIndex = randomIndices.reduce((a, b) => a + b, 0); if (randomIndex >= categories[i].length) { randomIndex = parseInt(0 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))) if (randomIndex >= categories[i].length || randomIndex < 0) { randomIndex = 0 } } else if (randomIndex < 0) { randomIndex = parseInt(0 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))) if (randomIndex >= categories[i].length || randomIndex < 0) { randomIndex = 0 } } let categorySelected = categories[i][randomIndex]; if (categoryNames[i] !== 'Tempo') { requestPayload[categoryNames[i].toLowerCase()] = categorySelected; } else { requestPayload.tempo = categorySelected; } } ``` Next comes a rather tricky part—how do you set a different energy level for the three timeframes that will define your audio generation? To do that, create an empty `energyLevelsArray` array to store the objects that define the timestamps for each energy level and another one called `lengths`, which will do so for the length of each timeframe.
Then, create a `for` loop that will run `numberOfTimeframes - 1` times and, inside it, use the `Math.random()` method to generate a relative length of time, based on the total length for each energy level. You only need to create a number of divisions that is one less than the number of timeframes, as this will split the audio into segments equal to the number of timeframes. Once you’re ready with the loop, sort the `lengths` array in ascending order, passing `sort()` a numeric comparator since the default sort compares values as strings, to make sure each energy level starts after the previous one ends. ```jsx generate.js // Create arrays for holding the energy level objects and their lengths let energyLevelsArray = []; let lengths = []; for (let j = 0; j < numberOfTimeframes - 1; j++) { lengths.push(Math.random()); } lengths.sort((a, b) => a - b); ``` Next, create a temporary variable named `previous` and set its value to 0. After that, start a new `for` loop. This time, the loop will adjust the random `lengths` you generated earlier to make sure they add up to the total length of the audio. You will use the `previous` variable to keep track of the cumulative sum of the `lengths` processed in the previous loop. Run the loop as many times as the `numberOfTimeframes` and in each iteration adjust the `j`th element of the `lengths` array by multiplying it with the total `length` of the generated audio, as well as subtracting `previous` from it. By doing this, you will get a segment length that takes into account the total length of your audio and the sum of those before it. Then, update the `previous` to be the `j`th element of `lengths` as the newly computed length for this iteration. In doing so, you will subtract it from the next length in the following iteration of the loop, so they add up accordingly. Last, round the `j`th element of the `lengths` array to a single decimal place as a floating-point number to match the Soundraw formatting requirements.
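To make the arithmetic concrete, here is a standalone sketch of the same adjustment with fixed sample split points in place of the `Math.random()` output, so the result is reproducible; the loop here only visits the two real split points, since the third segment's length is implied by the total:

```javascript
// Standalone sketch of the length adjustment with fixed sample values
const length = 77;              // total audio length in seconds
const numberOfTimeframes = 3;
let lengths = [0.3, 0.7];       // two pre-sorted split points in (0, 1)
let previous = 0;
for (let j = 0; j < numberOfTimeframes - 1; j++) {
  lengths[j] = lengths[j] * length - previous; // scale, then subtract the running sum
  previous = lengths[j];
  lengths[j] = parseFloat(lengths[j].toFixed(1)); // round to one decimal place
}
console.log(lengths); // [ 23.1, 30.8 ]; the third segment spans the remaining 23.1s
```

Scaling the first split point gives the first segment length, and subtracting the running sum from each subsequent scaled split point gives the next one.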
```jsx generate.js // Adjust the lengths so they are proportional and add up to the audio length accordingly let previous = 0; for (let j = 0; j < numberOfTimeframes; j++) { lengths[j] = lengths[j] * length - previous; previous = lengths[j]; lengths[j] = parseFloat(lengths[j].toFixed(1)); } ``` Once ready, create another temporary variable `currentTime` and initialize it once again as `0` to keep track of the current timestamp in the audio. Apart from that, create one final `for` loop, which will also run as many times as the `numberOfTimeframes`. Inside the loop, create a temporary variable `energyStart` and store the starting timestamp for each timeframe by setting it to the `currentTime` as a floating point rounded to one decimal place. Next, do the same for `energyEnd`, while setting it to the sum of the `currentTime` and the length of the timeframe, once again rounded to one decimal place, if this is not the last timeframe. If it is, set it to the total length of the piece with `j < numberOfTimeframes - 1 ? parseFloat((currentTime + lengths[j]).toFixed(1)) : length;`. After that, go ahead and update the `currentTime` to be the end of the current energy level, so you can start the next with the following iteration of the loop. Then, pick a random energy level from the `energyLevels` array by using the same logic you applied to `randomIndex` previously, just skip the extra triple loop you used there. Finish the loop off by pushing the `start`, `end`, and `energy` values accordingly. ```jsx generate.js let currentTime = 0; // Generate different energy levels for different timeframes for (let j = 0; j < numberOfTimeframes; j++) { let energyStart = parseFloat(currentTime.toFixed(1)); let energyEnd = j < numberOfTimeframes - 1 ?
parseFloat((currentTime + lengths[j]).toFixed(1)) : length; currentTime = energyEnd; // Apply the same logic as for randomIndex previously without the triple iteration let randomEnergyIndex; randomEnergyIndex = parseInt(0 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))) if (randomEnergyIndex >= energyLevels.length) { randomEnergyIndex = parseInt(0 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))) if (randomEnergyIndex >= energyLevels.length || randomEnergyIndex < 0) { randomEnergyIndex = 0 } } else if (randomEnergyIndex < 0) { randomEnergyIndex = parseInt(0 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))) if (randomEnergyIndex >= energyLevels.length || randomEnergyIndex < 0) { randomEnergyIndex = 0 } } let selectedEnergy = energyLevels[randomEnergyIndex]; energyLevelsArray.push({ start: energyStart, end: energyEnd, energy: selectedEnergy }); } ``` Lastly, finalize the audio generation process by updating the `requestPayload` with the new values and printing the selected parameters in the console, also making them part of the generated audio’s filename. You will then use this payload to submit an HTTP request to the Soundraw API to kickstart the audio generation on the server side and get the file in return. ```jsx generate.js // Update the request payload requestPayload.energy_levels = energyLevelsArray; requestPayload.length = length; requestPayload.file_format = fileFormat; // Print selected parameters and make them the audio filename let filename = `${idHex} ${requestPayload.mood} ${requestPayload.genre} ${requestPayload.theme} ${requestPayload.tempo} [${length}s].mp3`; ``` As the final part of the audio generation process, you will need to send a `POST` request to the Soundraw API using `axios`. To do this, you must first prepare the request by targeting the `compose` endpoint of the API as the `url` and passing `requestPayload` as the `data`.
Make sure you’ve set the `Content-Type` header to `application/json` and the `Authorization` one to your Soundraw API key, stored in the `soundraw` variable in this case. ```jsx generate.js // Submit an axios request to the Soundraw API and fetch the audio file console.log(`Attempting to submit request to Soundraw API with parameters ${JSON.stringify(requestPayload, null, 2)}\n`); axios({ method: 'post', url: 'https://soundraw.io/api/v2/musics/compose', data: requestPayload, headers: { "Content-Type": "application/json", "Authorization": soundraw } }) ``` Now, let's move on to how to handle the response from the Soundraw API. Use the `then` method to specify what to do when the HTTP request is successful. In this case, the code first logs the response data, then uses the `path` library to create a path for storing the audio file. The audio file is stored in a directory named `audio`. This directory is relative to the `src` directory, which resides one level above the current script’s directory (`../src`). ```jsx generate.js .then(async function (response) { const audioFilePath = path.join('audio', filename); console.log(`Soundraw request successful. Response: ${JSON.stringify(response.data)}`); const formattedAudioFilePath = './src/' + audioFilePath.replace(/\\/g, "/"); // replace backslashes with forward slashes ``` Next, create a writable stream with `fs.createWriteStream` and download the MP3 file from the URL provided in the response. Pipe the response to the file stream, effectively downloading the file. Writing files to directories that do not exist yet causes errors, since Node.js does not create missing directories for you. Create them manually if you encounter such an issue. After the file finishes downloading, attempt to update the local metadata with a custom function named `updateLocalMetadata`. If an error occurs during this process, log it to the console.
```jsx generate.js const file = fs.createWriteStream(path.join(__dirname, '../src', audioFilePath)); const request = https.get(response.data.mp3_url, function (response) { response.pipe(file).on('finish', async function () { // Call the function to update the JSON file try { console.log(`\nSoundraw audio saved to: ${formattedAudioFilePath}`); await updateLocalMetadata(idHex, mergePath, formattedAudioFilePath, wordsOut, colorHex, digiRoot, requestPayload, length); } catch (err) { console.error(err); } }); }); ``` Lastly, handle errors that may occur during the file write or HTTP request process. Both `request` and `file` objects are event emitters, emitting `error` events when something goes wrong. Listen for these events and, when they happen, log the error to the console and close the file stream. In case the axios request throws any error, catch it and log it to the console using the `catch` method. ```jsx generate.js request.on('error', (err) => { console.error(`Request error: ${err}`); file.end(); }); file.on('error', (err) => { console.error(`File error: ${err}`); file.end(); }); }) .catch(function (error) { console.log(error); }); ``` This is a robust method of making an API request, handling the response, and properly taking care of any errors that might occur during the workflow. 
Here’s how the entire audio generation process looks, with the `axios` set up included: ```jsx generate.js // Soundraw parameters const moods = ["Angry", "Busy & Frantic", "Dark", "Dreamy", "Elegant", "Epic", "Euphoric", "Fear", "Funny & Weird", "Glamorous", "Happy", "Heavy & Ponderous", "Hopeful", "Laid back", "Mysterious", "Peaceful", "Restless", "Romantic", "Running", "Sad", "Scary", "Sentimental", "Sexy", "Smooth", "Suspense"]; const genres = ["Acoustic", "Hip Hop", "Beats", "Funk", "Pop", "Drum n Bass", "Trap", "Tokyo night pop", "Rock", "Latin", "House", "Tropical House", "Ambient", "Orchestra", "Electro & Dance", "Electronica", "Techno & Trance"]; const themes = ["Ads & Trailers", "Broadcasting", "Cinematic", "Corporate", "Comedy", "Cooking", "Documentary", "Drama", "Fashion & Beauty", "Gaming", "Holiday season", "Horror & Thriller", "Motivational & Inspiring", "Nature", "Photography", "Sports & Action", "Technology", "Travel", "Tutorials", "Vlogs", "Wedding & Romance", "Workout & Wellness"]; const length = 77 const fileFormat = "mp3"; const tempo = ["low", "normal", "high"]; const energyLevels = ["Low", "Medium", "High", "Very High"]; const generator = async() => { // Random generator layer 0: Seed preparations console.log('\nSeed generation started...\n'); // Generate random hex with 20 bytes for symbols (same as wallet addresses) randomHex = web3.utils.randomHex(20).concat(address.slice(2)); console.log('Random hex generated: ' + randomHex + '\n'); // Generate ids for filenames to organize easier idHex = randomHex.slice(2, 5).concat(randomHex.slice(79, 82)) console.log('Used hex to generate ID: ' + idHex + '\n'); // Generate random hex color value by picking random characters from the generated hex string for (var i = 0; i < 6; i++) { colorHex = colorHex.concat(randomHex.slice(2).charAt(Math.floor(Math.random() * randomHex.slice(2).length))); bgColorHex = bgColorHex.concat(randomHex.slice(2).charAt(Math.floor(Math.random() * randomHex.slice(2).length))); 
} console.log('Used hex to generate text color: ' + colorHex + ' & background color: ' + bgColorHex + '\n'); // Generate new string by combining the random hex output with wallet address and converting it to number string wordNrs = web3.utils.hexToNumberString(randomHex); console.log('Transformed hex into number string: ' + wordNrs + '\n'); // Select Soundraw parameters based on the wordNrs number string let categories = [moods, genres, themes, tempo]; let categoryNames = ['Mood', 'Genre', 'Theme', 'Tempo']; let numberOfTimeframes = 3; let requestPayload = {}; // Create a loop to generate a random index for the current category for (let i = 0; i < 4; i++) { // Create an array that will hold the randomIndex value for each iteration of the following loop let randomIndices = [] // Iterate loop three times to reach all possible options with double-digit values for (let index = 0; index < 3; index++) { randomIndices[index] = parseInt(0 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))) } // Sum the results from each iteration and make sure they match the category length let randomIndex = randomIndices.reduce((a, b) => a + b, 0); if (randomIndex >= categories[i].length) { randomIndex = parseInt(0 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))) if (randomIndex >= categories[i].length || randomIndex < 0) { randomIndex = 0 } } else if (randomIndex < 0) { randomIndex = parseInt(0 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))) if (randomIndex >= categories[i].length || randomIndex < 0) { randomIndex = 0 } } let categorySelected = categories[i][randomIndex]; if (categoryNames[i] !== 'Tempo') { requestPayload[categoryNames[i].toLowerCase()] = categorySelected; } else { requestPayload.tempo = categorySelected; } } // Create arrays for holding the energy level objects and their lengths let energyLevelsArray = []; let lengths = []; for (let j = 0; j < numberOfTimeframes - 1; j++) { lengths.push(Math.random()); } lengths.sort((a, b) => a - b); // Adjust the
lengths so they are proportional and add up to the audio length accordingly let previous = 0; for (let j = 0; j < numberOfTimeframes; j++) { lengths[j] = lengths[j] * length - previous; previous = lengths[j]; lengths[j] = parseFloat(lengths[j].toFixed(1)); } let currentTime = 0; // Generate different energy levels for different timeframes for (let j = 0; j < numberOfTimeframes; j++) { let energyStart = parseFloat(currentTime.toFixed(1)); let energyEnd = j < numberOfTimeframes - 1 ? parseFloat((currentTime + lengths[j]).toFixed(1)) : length; currentTime = energyEnd; // Apply the same logic as for randomIndex previously without the triple iteration let randomEnergyIndex; randomEnergyIndex = parseInt(0 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))) if (randomEnergyIndex >= energyLevels.length) { randomEnergyIndex = parseInt(0 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))) if (randomEnergyIndex >= energyLevels.length || randomEnergyIndex < 0) { randomEnergyIndex = 0 } } else if (randomEnergyIndex < 0) { randomEnergyIndex = parseInt(0 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))) if (randomEnergyIndex >= energyLevels.length || randomEnergyIndex < 0) { randomEnergyIndex = 0 } } let selectedEnergy = energyLevels[randomEnergyIndex]; energyLevelsArray.push({ start: energyStart, end: energyEnd, energy: selectedEnergy }); } // Update the request payload requestPayload.energy_levels = energyLevelsArray; requestPayload.length = length; requestPayload.file_format = fileFormat; // Print selected parameters and make them the audio filename let filename = `${idHex} ${requestPayload.mood} ${requestPayload.genre} ${requestPayload.theme} ${requestPayload.tempo} [${length}s].mp3`; // Submit an axios request to the Soundraw API and fetch the audio file console.log(`Attempting to submit request to Soundraw API with parameters ${JSON.stringify(requestPayload, null, 2)}\n`); axios({ method: 'post', url:
'https://soundraw.io/api/v2/musics/compose', data: requestPayload, headers: { "Content-Type": "application/json", "Authorization": soundraw } }) .then(async function (response) { const audioFilePath = path.join('audio', filename); console.log(`Soundraw request successful. Response: ${JSON.stringify(response.data)}`); const formattedAudioFilePath = './src/' + audioFilePath.replace(/\\/g, "/"); // replace backslashes with forward slashes const file = fs.createWriteStream(path.join(__dirname, '../src', audioFilePath)); const request = https.get(response.data.mp3_url, function (response) { response.pipe(file).on('finish', async function () { // Call the function to update the JSON file try { console.log(`\nSoundraw audio saved to: ${formattedAudioFilePath}`); await updateLocalMetadata(idHex, mergePath, formattedAudioFilePath, wordsOut, colorHex, digiRoot, requestPayload, length); } catch (err) { console.error(err); } }); }); request.on('error', (err) => { console.error(`Request error: ${err}`); file.end(); }); file.on('error', (err) => { console.error(`File error: ${err}`); file.end(); }); }) .catch(function (error) { console.log(error); }); ``` ### Setting up the image generation layers Once you’re done setting up the audio generation layer, it’s time to move forward with the image one. To do that, you will first need to start doing some calculations for the shape layer. First, set the number of sides for the polygon shape. Extract a random character from the `wordNrs` string and prefix it with `1` before parsing it as an integer, so the sides count is never zero. Note that `Math.random()` generates a random float from 0 to 1, and multiplying it by the length of the string helps in picking a random index from the string. `Math.floor()` ensures that the value is a whole number, to use as an index. Next, generate the stroke and fill colors for the shape. Combine different parts of the `colorHex` and `bgColorHex` to ensure a wide variation. Use the `slice` function to extract parts of the strings.
```jsx generate.js // Begin calculations for random shape layer generation parameters // Randomize shape parameters but ensure they are never zero // Find out the number of sides the shape has by picking a random number from the number string shapeSides = parseInt(1 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))); console.log('Used number string to determine polygon shape sides count: ' + shapeSides + '\n'); // Combine the first three digits of one of the two hex color values picked earlier with the last three of the other for greater variation shapeStroke = shapeStroke.concat(colorHex.slice(4, 7).concat(bgColorHex.slice(1, 4))); shapeFill = shapeFill.concat(bgColorHex.slice(4, 7).concat(colorHex.slice(1, 4))); console.log('Used text & background colors to generate new border: ' + shapeStroke + ' & fill: ' + shapeFill + '\n'); ``` Then, run a loop twice to generate the size and center coordinates of the shape. For each property, randomly pick a digit from `wordNrs` and add it to the current property value. Multiply the result by Pi and add or subtract a constant. Use `Math.abs()` to avoid negative results. ```jsx generate.js // Loop following calculations twice to generate double or higher digit values for the shape for (var i = 0; i < 2; i++) { // Avoid negative results by converting result to absolute value // Pick a random digit from the number string earlier, add the current shapeSize value, serve as float, multiply by Pi and add 10 for sizes between ~50 and ~150 for greater balance shapeSize = Math.abs(10 + Math.PI * parseFloat(shapeSize + parseInt(wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))))); // Same as above except you subtract 100 instead of adding 10.
This will make the shape roll around the middle shapeCtrX = Math.abs(Math.PI * parseFloat(shapeCtrX + parseInt(wordNrs.charAt(Math.floor(Math.random() * wordNrs.length)))) - 100); shapeCtrY = Math.abs(Math.PI * parseFloat(shapeCtrY + parseInt(wordNrs.charAt(Math.floor(Math.random() * wordNrs.length)))) - 100); } console.log('Used number string to determine polygon shape size: ' + shapeSize + ' X-axis center value: ' + shapeCtrX + ' & Y-axis center value: ' + shapeCtrY + '\n'); ``` After these calculations, define a `digitalRoot` function to reduce `wordNrs` to a single digit. This function works by converting `input` to a string and summing all its digits. If the result is a single digit, return it; otherwise, call the function recursively. ```jsx generate.js // Reduce number string to single digit with the digital root formula function digitalRoot(input) { var nrStr = input.toString(), i, result = 0; if (nrStr.length === 1) { return +nrStr; } for (i = 0; i < nrStr.length; i++) { result += +nrStr[i]; } return digitalRoot(result); } ``` Next, call your `digitalRoot()` function with `wordNrs` as its only parameter to print the digital root result. Then, go ahead and check if the result is odd or even by returning the remainder of that number divided by two. If the number is even, the function will return `0`, because the remainder of an even number divided by two is always zero, and if it is odd, it will return `1`.
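To see the formula in action, here is a self-contained copy of the function run against a fixed sample input: `12345` sums to 15, which in turn sums to 6:

```javascript
// Self-contained copy of the digital root function, applied to a fixed input
function digitalRoot(input) {
  var nrStr = input.toString(), i, result = 0;
  if (nrStr.length === 1) { return +nrStr; }
  for (i = 0; i < nrStr.length; i++) { result += +nrStr[i]; }
  return digitalRoot(result);
}

console.log(digitalRoot(12345));     // 6, since 1+2+3+4+5 = 15 and 1+5 = 6
console.log(digitalRoot(12345) % 2); // 0, i.e. even, so two random words
```

The recursion always terminates because each pass shrinks any multi-digit number, so the result eventually becomes a single digit.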
```jsx generate.js // Print digital root result digiRoot = digitalRoot(wordNrs); console.log('Calculated digital root of number string: ' + digiRoot + '\n'); // Check if result is odd or even function NrChk(nr) { return nr % 2; } console.log('Checking if digital root is odd or even: ' + NrChk(digiRoot) + '\n'); if (NrChk(digiRoot) > 0) { console.log('Generating 3 random words - digital root is odd\n'); } else { console.log('Generating 2 random words - digital root is even\n'); } ``` Once you are ready, it is time to move forward with the first generation process within the image layer—the text. Set the value of the `randomStr` variable you defined earlier to the `randomWords()` generator with the result of the odd/even check function as its solo parameter. Then, add `2` to the result, as this is the easiest way to generate `2` words for even and `3` words for odd (even 0 + 2 base = 2; odd 1 + 2 base = 3). Don’t forget to split the `randomStr` by `','`, as you will otherwise be left with an unformatted comma-separated word list. Follow it up by creating a `for` loop that will capitalize the word set and join them as a single string to obtain the `wordsOut` variable’s value. ```jsx generate.js // Random generator layer 1: Text // Generate set of random words - 2 for even 3 for odd. Since result will always be 0 or 1 easiest and fastest way is to just add 2.
Replace "," with space for natural appeal randomStr = (randomWords(NrChk(digiRoot) + 2).toString()).split(','); console.log('Random words generated are: ' + randomStr + '\n'); // Capitalize word set and join them as single string for (var i = 0; i < randomStr.length; i++) { randomStr[i] = (randomStr[i].charAt(0)).toUpperCase() + randomStr[i].slice(1); } wordsOut = randomStr.join(' '); console.log('Capitalizing random words string: ' + wordsOut + '\n'); ``` After that, proceed by generating an image from the `wordsOut` word list, using `textToImage`'s `generate()` method with the `idHex` string prepended to `wordsOut` as its first parameter. As its second, create an object adding a wide range of options, including `debug: true`, as this is the only way you can render it to a file. Make sure you’re also creating a new temporary `textPath` variable to store the file name and location information. You can play around as much as you like with the other options here; just don’t forget to set the values for the `bgColor` and `textColor` keys to the `bgColorHex` and `colorHex` variables you defined and calculated earlier. For this tutorial, use `maxWidth` set to `330`, `textAlign` to `center`, and `verticalAlign` to `top`, as this will put the text at the top of the image, centered, creating a bar-like element, which is a typical place for such information.
```jsx generate.js // Generate image from the random words, while using the library's debug mode to render to file var textPath = './src/texts/' + idHex + ' ' + wordsOut + ' ' + colorHex + ' [Text Layer].png'; console.log('Exporting random words string as image to: ' + textPath + '\n'); const dataUri = await textToImage.generate(idHex + ' ' + wordsOut, { debug: true, debugFilename: textPath, maxWidth: 330, customHeight: 33, fontSize: 18, fontFamily: 'Arial', lineHeight: 22, margin: 5, bgColor: bgColorHex, textColor: colorHex, textAlign: 'center', verticalAlign: 'top', }); ``` Now, let's proceed to the second random generator layer—the Icon. First, set the icon parameters: `iconSize` to `350` and `iconSeed` to `wordsOut`, so the generated icon is uniquely based on the two-to-three random words output. Then, call the `toPng()` method of the `jdenticon` package with those two parameters and assign the result to `iconExport`. Remember to set the `iconPath` with the location you want it exported to and the appropriate file name convention. Lastly, write the generated file to the selected location by calling the `fs.writeFileSync` method. ```jsx generate.js // Random generator layer 2: Icon // Set icon parameters var iconSize = 350; var iconSeed = wordsOut; // Export icon to png const iconExport = jdenticon.toPng(iconSeed, iconSize); var iconPath = './src/icons/' + idHex + ' ' + wordsOut + ' ' + colorHex + ' [Icon Layer].png'; console.log('Using random words string as seed to generate icon at: ' + iconPath + '\n'); fs.writeFileSync(iconPath, iconExport); ``` With that behind you, move on by creating the final third layer of the image—the shape.
To do that, create a new `shapeCanvas` constant, setting its value to a `new Canvas()` with `350, 350` as its parameters. Then, create another one for `shapeContext` and place `shapeCanvas.getContext('2d');` as its value. This will allow you to start drawing a path on the canvas, meaning you can call the `beginPath()` method from the `shapeContext`. ```jsx generate.js // Random generator Layer 3: Shape // Create new canvas object and set the context to 2d const shapeCanvas = new Canvas(350, 350); const shapeContext = shapeCanvas.getContext('2d'); // Start drawing path on canvas console.log('Using polygon settings to draw path points & paint shape...\n'); shapeContext.beginPath(); ``` Afterwards, start moving the `shapeContext` path using four randomly generated coordinate pairs, of which `moveTo()` only consumes the first pair as its x and y. And considering the example formula used to pick these points for the tutorial was defined without much in-depth evaluation, feel free to play around with it to achieve more interesting points for the polygon shape. Just remember to connect the path points, based on the number of sides you picked for the shape earlier, using a `for` loop, iterating as many times as the number of sides. To do that, use the `lineTo()` method of the `shapeContext` with the sum of the `shapeCtrX` value and the `shapeSize` one multiplied by the cosine of the loop `index` multiplied by `2` and `Pi`, divided by the sides, for the X coordinate, and the same with the sine and `shapeCtrY` for the Y coordinate. ```jsx generate.js // Pick four arbitrarily generated points for the drawing path.
Feel free to play around with the formula until you get desirable results shapeContext.moveTo(shapeCtrX + shapeSize * (Math.floor(Math.random() * 100 * Math.cos(shapeSides))), shapeCtrY + shapeSize * (Math.floor(Math.random() * 10 * Math.sin(shapeSides * shapeSize))), shapeCtrX + shapeSize * (Math.floor(Math.random() * 1000 * Math.tan(shapeCtrY * shapeSides))), shapeCtrY + shapeSize * (Math.floor(Math.random() * (1 / Math.tan(shapeCtrX * shapeSides))))); // Connect the path points according to randomly picked number of sides for the polygon for (var i = 1; i <= shapeSides; i++) { shapeContext.lineTo(shapeCtrX + shapeSize * Math.cos(i * 2 * Math.PI / shapeSides), shapeCtrY + shapeSize * Math.sin(i * 2 * Math.PI / shapeSides)); } ``` Then, proceed by closing the drawing path, thus completing the polygon. Follow up by applying the `shapeStroke` and `shapeFill` values you defined earlier as the `strokeStyle` and `fillStyle`. Last, record the shape’s data to an image buffer, so you can export it to an image file. Make sure you’re also setting the `shapePath` with the appropriate location and file name, before you write it to file using the `fs.writeFileSync()` method.
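To get a feel for what the `lineTo()` vertex formula produces, here is a standalone sketch with fixed sample values in place of the random ones: a 4-sided shape of size 50 centered at (100, 100), which yields the corners of a square:

```javascript
// Same vertex formula as the lineTo() loop, with fixed sample values
const shapeCtrX = 100, shapeCtrY = 100, shapeSize = 50, shapeSides = 4;
const points = [];
for (let i = 1; i <= shapeSides; i++) {
  const x = shapeCtrX + shapeSize * Math.cos(i * 2 * Math.PI / shapeSides);
  const y = shapeCtrY + shapeSize * Math.sin(i * 2 * Math.PI / shapeSides);
  points.push([Math.round(x), Math.round(y)]); // round away float noise
}
console.log(points); // [ [ 100, 150 ], [ 50, 100 ], [ 100, 50 ], [ 150, 100 ] ]
```

The formula simply walks the circle of radius `shapeSize` around the center in equal angular steps, which is why a side count of 4 lands exactly on a square.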
```jsx generate.js // Close drawing path to complete the drawn object then proceed with applying border width and color, as well as fill color shapeContext.closePath(); shapeContext.strokeStyle = shapeStroke; shapeContext.fillStyle = shapeFill; shapeContext.fill(); shapeContext.lineWidth = shapeSides; shapeContext.stroke(); // Record shape data URI to image buffer then render to preferred path const shapeBuffer = shapeCanvas.toBuffer("image/png"); var shapePath = './src/shapes/' + shapeSides + ' ' + shapeStroke + '.png'; console.log('Exporting polygon shape as image to: ' + shapePath + '\n'); fs.writeFileSync(shapePath, shapeBuffer); ``` Here’s what the image generation layer part of your script should look like if you’ve done everything accordingly: ```jsx generate.js // Begin calculations for random shape layer generation parameters // Randomize shape parameters but ensure they are never zero // Find out the number of sides the shape has by picking a random number from the number string shapeSides = parseInt(1 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))); console.log('Used number string to determine polygon shape sides count: ' + shapeSides + '\n'); // Combine the first three digits of one of the two hex color values picked earlier with the last three of the other for greater variation shapeStroke = shapeStroke.concat(colorHex.slice(4, 7).concat(bgColorHex.slice(1, 4))); shapeFill = shapeFill.concat(bgColorHex.slice(4, 7).concat(colorHex.slice(1, 4))); console.log('Used text & background colors to generate new border: ' + shapeStroke + ' & fill: ' + shapeFill + '\n'); // Loop following calculations twice to generate double or higher digit values for the shape for (var i = 0; i < 2; i++) { // Avoid negative results by converting result to absolute value // Pick a random digit from the number string earlier, add the current shapeSize value, serve as float, multiply by Pi and add 10 for sizes between ~50 and ~150 for greater balance shapeSize = Math.abs(10 +
Math.PI * parseFloat(shapeSize + parseInt(wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))))); // Same as above except you subtract 100 instead of adding 10. This will make the shape roll around the middle shapeCtrX = Math.abs(Math.PI * parseFloat(shapeCtrX + parseInt(wordNrs.charAt(Math.floor(Math.random() * wordNrs.length)))) - 100); shapeCtrY = Math.abs(Math.PI * parseFloat(shapeCtrY + parseInt(wordNrs.charAt(Math.floor(Math.random() * wordNrs.length)))) - 100); } console.log('Used number string to determine polygon shape size: ' + shapeSize + ' X-axis center value: ' + shapeCtrX + ' & Y-axis center value: ' + shapeCtrY + '\n'); // Reduce number string to single digit with the digital root formula function digitalRoot(input) { var nrStr = input.toString(), i, result = 0; if (nrStr.length === 1) { return +nrStr; } for (i = 0; i < nrStr.length; i++) { result += +nrStr[i]; } return digitalRoot(result); } // Print digital root result digiRoot = digitalRoot(wordNrs); console.log('Calculated digital root of number string: ' + digiRoot + '\n'); // Check if result is odd or even function NrChk(nr) { return nr % 2; } console.log('Checking if digital root is odd or even: ' + NrChk(digiRoot) + '\n'); if (NrChk(digiRoot) > 0) { console.log('Generating 3 random words - digital root is odd\n'); } else { console.log('Generating 2 random words - digital root is even\n'); } // Random generator layer 1: Text // Generate set of random words - 2 for even 3 for odd. Since result will always be 0 or 1 easiest and fastest way is to just add 2.
Replace "," with space for natural appeal randomStr = (randomWords(NrChk(digiRoot) + 2).toString()).split(','); console.log('Random words generated are: ' + randomStr + '\n'); // Capitalize word set and join them as single string for (var i = 0; i < randomStr.length; i++) { randomStr[i] = (randomStr[i].charAt(0)).toUpperCase() + randomStr[i].slice(1); } wordsOut = randomStr.join(' '); console.log('Capitalizing random words string: ' + wordsOut + '\n'); // Generate image from the random words, while using the library's debug mode to render to file // Exporting images to folders that do not exist yet may cause errors because of FS/OS permissions. Try creating them manually if you encounter such an issue. var textPath = './src/texts/' + idHex + ' ' + wordsOut + ' ' + colorHex + ' [Text Layer].png'; console.log('Exporting random words string as image to: ' + textPath + '\n'); const dataUri = await textToImage.generate(idHex + ' ' + wordsOut, { debug: true, debugFilename: textPath, maxWidth: 330, customHeight: 33, fontSize: 18, fontFamily: 'Arial', lineHeight: 22, margin: 5, bgColor: bgColorHex, textColor: colorHex, textAlign: 'center', verticalAlign: 'top', }); // Random generator layer 2: Icon // Set icon parameters var iconSize = 350; var iconSeed = wordsOut; // Export icon to png const iconExport = jdenticon.toPng(iconSeed, iconSize); var iconPath = './src/icons/' + idHex + ' ' + wordsOut + ' ' + colorHex + ' [Icon Layer].png'; console.log('Using random words string as seed to generate icon at: ' + iconPath + '\n'); fs.writeFileSync(iconPath, iconExport); // Random generator Layer 3: Shape // Create new canvas object and set the context to 2d const shapeCanvas = new Canvas(350, 350); const shapeContext = shapeCanvas.getContext('2d'); // Start drawing path on canvas console.log('Using polygon settings to draw path points & paint shape...\n'); shapeContext.beginPath(); // Pick four arbitrarily generated points for the drawing path.
Feel free to play around with the formula until you get desirable results shapeContext.moveTo(shapeCtrX + shapeSize * (Math.floor(Math.random() * 100 * Math.cos(shapeSides))), shapeCtrY + shapeSize * (Math.floor(Math.random() * 10 * Math.sin(shapeSides * shapeSize))), shapeCtrX + shapeSize * (Math.floor(Math.random() * 1000 * Math.tan(shapeCtrY * shapeSides))), shapeCtrY + shapeSize * (Math.floor(Math.random() * (1 / Math.tan(shapeCtrX * shapeSides))))); // Connect the path points according to randomly picked number of sides for the polygon for (var i = 1; i <= shapeSides; i++) { shapeContext.lineTo(shapeCtrX + shapeSize * Math.cos(i * 2 * Math.PI / shapeSides), shapeCtrY + shapeSize * Math.sin(i * 2 * Math.PI / shapeSides)); } // Close drawing path to complete the drawn object then proceed with applying border width and color, as well as fill color shapeContext.closePath(); shapeContext.strokeStyle = shapeStroke; shapeContext.fillStyle = shapeFill; shapeContext.fill(); shapeContext.lineWidth = shapeSides; shapeContext.stroke(); // Record shape data URI to image buffer then render to preferred path const shapeBuffer = shapeCanvas.toBuffer("image/png"); var shapePath = './src/shapes/' + shapeSides + ' ' + shapeStroke + '.png'; console.log('Exporting polygon shape as image to: ' + shapePath + '\n'); fs.writeFileSync(shapePath, shapeBuffer); ``` ### Merge layers and record local metadata to JSON With all three parts of the image layer ready, you can finalize its generation by merging the text, icon, and shape renders into one. To accomplish this, you can use the `mergeImages` library with the `shapePath`, `iconPath`, and `textPath` you defined earlier as parameters in this order. Don’t forget to create a `mergePath` variable with the appropriate location and file name for the output. Then, add an additional object parameter to `mergeImages`, where the `Canvas` and `Image` keys have the same values.
Finish off the merge by including a `.then` function with the `response` as its parameter, where you use the `ImageDataURI.outputFile()` method with the `response` parameter once again, as well as the `mergePath` variable you just defined. ```jsx generate.js // Merge existing layers by combining them in image buffer as data URI then output to file var mergePath = './src/merged/' + idHex + ' ' + wordsOut + ' ' + colorHex + ' [Merged].png'; console.log('Merging all layers & exporting image to: ' + mergePath + '\n'); mergeImages([shapePath, iconPath, textPath], { Canvas: Canvas, Image: Image }).then(function(response) { ImageDataURI.outputFile(response, mergePath) }); ``` Once ready, go ahead and create a new asynchronous `updateLocalMetadata()` function, which you referenced during the audio layer generation process. Its aim is to write to JSON the metadata information you will then use to pin your media files to Chainstack IPFS and set your NFT metadata with. The `updateLocalMetadata()` function takes a long list of parameters, namely `idHex`, `coverPath`, `audioPath`, `wordsOut`, `colorHex`, `digiRoot`, `requestPayload`, and `length`. As its first order of business, create a new `filePath` constant, where you set the appropriate location and file name for the JSON file you will be writing. Then, inside a `try` block, read the `filePath` and check if the file exists. If it does, you will be parsing its contents as an object, so you can update it accordingly. But if it doesn’t, you get to write everything directly. Set the `idHex` value as the key of the object property and its value to another object with `name`, `description`, `cover`, and `audio` as its keys. For the `name` key, go ahead and use the `idHex` and `wordsOut` values, so you get a title like `idHex: wordsOut`, ultimately resulting in something like `133337: Test Value West`. In turn, as the `description`, you can use something along the lines of `A generative music NFT created with metadata seeds.
Words: wordsOut, Color: colorHex, Digital Root: digiRoot, Mood: mood, Genre: genre, Theme: theme, Tempo: tempo, Length: length`. By doing this, you will be able to fully highlight as many generation parameters as possible to paint a good picture of your NFT’s generative process and define its possible traits. Lastly, for `cover` and `audio`, just apply the correct path for each. Once you’ve set up the object, stringify it and write it back to the file located at `filePath`. ```jsx generate.js // Create a JSON with the locations of each generated set of media metadata const updateLocalMetadata = async (idHex, coverPath, audioPath, wordsOut, colorHex, digiRoot, requestPayload, length) => { console.log(`\nAttempting to create JSON with local metadata details...`); const filePath = path.join(__dirname, '../src/output/local-metadata.json'); try { const data = await fsPromises.readFile(filePath, 'utf8'); // If the file exists, parse its content, add the new object, and write it back to the file const json = data ? JSON.parse(data) : {}; json[idHex] = { name: `${idHex}: ${wordsOut}`, description: `A generative music NFT created with metadata seeds. Words: ${wordsOut}, Color: ${colorHex}, Digital Root: ${digiRoot}, Mood: ${requestPayload.mood}, Genre: ${requestPayload.genre}, Theme: ${requestPayload.theme}, Tempo: ${requestPayload.tempo}, Length: [${length}s]`, cover: coverPath, audio: audioPath }; await fsPromises.writeFile(filePath, JSON.stringify(json, null, 2), 'utf8'); console.log(`\nLocal metadata JSON created at ${filePath}...\n`); } catch (err) { if (err.code === 'ENOENT') { // If the file doesn't exist, initialize it as an empty object await fsPromises.writeFile(filePath, JSON.stringify({ [idHex]: { name: `${idHex}: ${wordsOut}`, description: `A generative music NFT created with metadata seeds.
Words: ${wordsOut}, Color: ${colorHex}, Digital Root: ${digiRoot}, Mood: ${requestPayload.mood}, Genre: ${requestPayload.genre}, Theme: ${requestPayload.theme}, Tempo: ${requestPayload.tempo}, Length: [${length}s]`, cover: coverPath, audio: audioPath } }, null, 2), 'utf8'); console.log(`\nLocal metadata JSON created at ${filePath}...\n`); } else { throw err; } } }; ``` With that, the generation process is now complete, which means you can put a definitive end to your `generate.js` script. Let’s recap with the entirety of the script to avoid something going AWOL somewhere along the way: ```jsx generate.js // Process dependencies require('dotenv').config(); require("@nomiclabs/hardhat-web3"); const fs = require('fs'); const fsPromises = fs.promises; const path = require('path'); const https = require('https'); const randomWords = require('random-words'); // 1.3.0 const textToImage = require('text-to-image'); const jdenticon = require('jdenticon'); const { Canvas, Image } = require('canvas'); const ImageDataURI = require('image-data-uri'); const mergeImages = require('merge-images'); const axios = require('axios'); // Load environment variables const address = process.env.WALLET; const soundraw = process.env.SOUNDRAW; // Initialize generation parameters let randomHex; let randomStr; let wordNrs; let digiRoot; let wordsOut = ''; let colorHex = '#'; let bgColorHex = '#'; let shapeSides = ''; let shapeSize = ''; let shapeCtrX = ''; let shapeCtrY = ''; let shapeStroke = '#'; let shapeFill = '#'; let idHex = ''; // Soundraw parameters const moods = ["Angry", "Busy & Frantic", "Dark", "Dreamy", "Elegant", "Epic", "Euphoric", "Fear", "Funny & Weird", "Glamorous", "Happy", "Heavy & Ponderous", "Hopeful", "Laid back", "Mysterious", "Peaceful", "Restless", "Romantic", "Running", "Sad", "Scary", "Sentimental", "Sexy", "Smooth", "Suspense"]; const genres = ["Acoustic", "Hip Hop", "Beats", "Funk", "Pop", "Drum n Bass", "Trap", "Tokyo night pop", "Rock", "Latin",
"House", "Tropical House", "Ambient", "Orchestra", "Electro & Dance", "Electronica", "Techno & Trance"]; const themes = ["Ads & Trailers", "Broadcasting", "Cinematic", "Corporate", "Comedy", "Cooking", "Documentary", "Drama", "Fashion & Beauty", "Gaming", "Holiday season", "Horror & Thriller", "Motivational & Inspiring", "Nature", "Photography", "Sports & Action", "Technology", "Travel", "Tutorials", "Vlogs", "Wedding & Romance", "Workout & Wellness"]; const length = 77 const fileFormat = "mp3"; const tempo = ["low", "normal", "high"]; const energyLevels = ["Low", "Medium", "High", "Very High"]; const generator = async() => { // Random generator layer 0: Seed preparations console.log('\nSeed generation started...\n'); // Generate random hex with 20 bytes for symbols (same as wallet addresses) randomHex = web3.utils.randomHex(20).concat(address.slice(2)); console.log('Random hex generated: ' + randomHex + '\n'); // Generate ids for filenames to organize easier idHex = randomHex.slice(2, 5).concat(randomHex.slice(79, 82)) console.log('Used hex to generate ID: ' + idHex + '\n'); // Generate random hex color value by picking random characters from the generated hex string for (var i = 0; i < 6; i++) { colorHex = colorHex.concat(randomHex.slice(2).charAt(Math.floor(Math.random() * randomHex.slice(2).length))); bgColorHex = bgColorHex.concat(randomHex.slice(2).charAt(Math.floor(Math.random() * randomHex.slice(2).length))); } console.log('Used hex to generate text color: ' + colorHex + ' & background color: ' + bgColorHex + '\n'); // Generate new string by combining the random hex output with wallet address and converting it to number string wordNrs = web3.utils.hexToNumberString(randomHex); console.log('Transformed hex into number string: ' + wordNrs + '\n'); // Select Soundraw parameters based on the wordNrs number string let categories = [moods, genres, themes, tempo]; let categoryNames = ['Mood', 'Genre', 'Theme', 'Tempo']; let numberOfTimeframes = 3; let 
requestPayload = {}; // Create a loop to generate a random index for the current category for (let i = 0; i < 4; i++) { // Create an array that will hold the randomIndex value for each iteration of the following loop let randomIndices = [] // Iterate loop three times to reach all possible options with double-digit values for (let index = 0; index < 3; index++) { randomIndices[index] = parseInt(0 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))) } // Sum the results from each iteration and make sure they match the category length let randomIndex = randomIndices.reduce((a, b) => a + b, 0); if (randomIndex >= categories[i].length) { randomIndex = parseInt(0 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))) if (randomIndex >= categories[i].length || randomIndex < 0) { randomIndex = 0 } } else if (randomIndex < 0) { randomIndex = parseInt(0 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))) if (randomIndex >= categories[i].length || randomIndex < 0) { randomIndex = 0 } } let categorySelected = categories[i][randomIndex]; if (categoryNames[i] !== 'Tempo') { requestPayload[categoryNames[i].toLowerCase()] = categorySelected; } else { requestPayload.tempo = categorySelected; } } // Create arrays for holding the energy level objects and their lengths let energyLevelsArray = []; let lengths = []; for (let j = 0; j < numberOfTimeframes - 1; j++) { lengths.push(Math.random()); } lengths.sort(); // Adjust the lengths so they are proportional and add up to the audio length accordingly let previous = 0; for (let j = 0; j < numberOfTimeframes; j++) { lengths[j] = lengths[j] * length - previous; previous = lengths[j]; lengths[j] = parseFloat(lengths[j].toFixed(1)); } let currentTime = 0; // Generate different energy levels for different timeframes for (let j = 0; j < numberOfTimeframes; j++) { let energyStart = parseFloat(currentTime.toFixed(1)); let energyEnd = j < numberOfTimeframes - 1 ?
parseFloat((currentTime + lengths[j]).toFixed(1)) : length; currentTime = energyEnd; // Apply the same logic as for randomIndex previously without the triple iteration let randomEnergyIndex; randomEnergyIndex = parseInt(0 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))) if (randomEnergyIndex >= energyLevels.length) { randomEnergyIndex = parseInt(0 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))) if (randomEnergyIndex >= energyLevels.length || randomEnergyIndex < 0) { randomEnergyIndex = 0 } } else if (randomEnergyIndex < 0) { randomEnergyIndex = parseInt(0 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))) if (randomEnergyIndex >= energyLevels.length || randomEnergyIndex < 0) { randomEnergyIndex = 0 } } let selectedEnergy = energyLevels[randomEnergyIndex]; energyLevelsArray.push({ start: energyStart, end: energyEnd, energy: selectedEnergy }); } // Update the request payload requestPayload.energy_levels = energyLevelsArray; requestPayload.length = length; requestPayload.file_format = fileFormat; // Print selected parameters and make them the audio filename let filename = `${idHex} ${requestPayload.mood} ${requestPayload.genre} ${requestPayload.theme} ${requestPayload.tempo} [${length}s].mp3`; // Submit an axios request to the Soundraw API and fetch the audio file console.log(`Attempting to submit request to Soundraw API with parameters ${JSON.stringify(requestPayload, null, 2)}\n`); axios({ method: 'post', url: 'https://soundraw.io/api/v2/musics/compose', data: requestPayload, headers: { "Content-Type": "application/json", "Authorization": soundraw } }) .then(async function (response) { const audioFilePath = path.join('audio', filename); console.log(`Soundraw request successful.
Response: ${JSON.stringify(response.data)}`); const formattedAudioFilePath = './src/' + audioFilePath.replace(/\\/g, "/"); // replace backslashes with forward slashes const file = fs.createWriteStream(path.join(__dirname, '../src', audioFilePath)); const request = https.get(response.data.mp3_url, function (response) { response.pipe(file).on('finish', async function () { // Call the function to update the JSON file try { console.log(`\nSoundraw audio saved to: ${formattedAudioFilePath}`); await updateLocalMetadata(idHex, mergePath, formattedAudioFilePath, wordsOut, colorHex, digiRoot, requestPayload, length); } catch (err) { console.error(err); } }); }); request.on('error', (err) => { console.error(`Request error: ${err}`); file.end(); }); file.on('error', (err) => { console.error(`File error: ${err}`); file.end(); }); }) .catch(function (error) { console.log(error); }); // Begin calculations for random shape layer generation parameters // Randomize shape parameters but ensure they are never zero // Find out the number of sides the shape has by picking a random number from the number string shapeSides = parseInt(1 + wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))); console.log('Used number string to determine polygon shape sides count: ' + shapeSides + '\n'); // Combine the first three digits of one of the two hex color values picked earlier with the last three of the other for greater variation shapeStroke = shapeStroke.concat(colorHex.slice(4, 7).concat(bgColorHex.slice(1, 4))); shapeFill = shapeFill.concat(bgColorHex.slice(4, 7).concat(colorHex.slice(1, 4))); console.log('Used text & background colors to generate new border: ' + shapeStroke + ' & fill: ' + shapeFill + '\n'); // Loop following calculations twice to generate double or higher digit values for the shape for (var i = 0; i < 2; i++) { // Avoid negative results by converting result to absolute value // Pick a random digit from the number string earlier, add the current shapeSize value, serve 
as float, multiply by Pi and add 10 for sizes between ~50 and ~150 for greater balance shapeSize = Math.abs(10 + Math.PI * parseFloat(shapeSize + parseInt(wordNrs.charAt(Math.floor(Math.random() * wordNrs.length))))); // Same as above except you subtract 100 instead of adding 10. This will make the shape roll around the middle shapeCtrX = Math.abs(Math.PI * parseFloat(shapeCtrX + parseInt(wordNrs.charAt(Math.floor(Math.random() * wordNrs.length)))) - 100); shapeCtrY = Math.abs(Math.PI * parseFloat(shapeCtrY + parseInt(wordNrs.charAt(Math.floor(Math.random() * wordNrs.length)))) - 100); } console.log('Used number string to determine polygon shape size: ' + shapeSize + ' X-axis center value: ' + shapeCtrX + ' & Y-axis center value: ' + shapeCtrY + '\n'); // Reduce number string to single digit with the digital root formula function digitalRoot(input) { var nrStr = input.toString(), i, result = 0; if (nrStr.length === 1) { return +nrStr; } for (i = 0; i < nrStr.length; i++) { result += +nrStr[i]; } return digitalRoot(result); } // Print digital root result digiRoot = digitalRoot(wordNrs); console.log('Calculated digital root of number string: ' + digiRoot + '\n'); // Check if result is odd or even function NrChk(nr) { return nr % 2; } console.log('Checking if digital root is odd or even: ' + NrChk(digiRoot) + '\n'); if (NrChk(digiRoot) > 0) { console.log('Generating 3 random words - digital root is odd\n'); } else { console.log('Generating 2 random words - digital root is even\n'); } // Random generator layer 1: Text // Generate set of random words - 2 for even 3 for odd. Since result will always be 0 or 1 easiest and fastest way is to just add 2.
Replace "," with space for natural appeal randomStr = (randomWords(NrChk(digiRoot) + 2).toString()).split(','); console.log('Random words generated are: ' + randomStr + '\n'); // Capitalize word set and join them as single string for (var i = 0; i < randomStr.length; i++) { randomStr[i] = (randomStr[i].charAt(0)).toUpperCase() + randomStr[i].slice(1); } wordsOut = randomStr.join(' '); console.log('Capitalizing random words string: ' + wordsOut + '\n'); // Generate image from the random words, while using the library's debug mode to render to file // Exporting images to folders that do not exist yet may cause errors because of FS/OS permissions. Try creating them manually if you encounter such an issue. var textPath = './src/texts/' + idHex + ' ' + wordsOut + ' ' + colorHex + ' [Text Layer].png'; console.log('Exporting random words string as image to: ' + textPath + '\n'); const dataUri = await textToImage.generate(idHex + ' ' + wordsOut, { debug: true, debugFilename: textPath, maxWidth: 330, customHeight: 33, fontSize: 18, fontFamily: 'Arial', lineHeight: 22, margin: 5, bgColor: bgColorHex, textColor: colorHex, textAlign: 'center', verticalAlign: 'top', }); // Random generator layer 2: Icon // Set icon parameters var iconSize = 350; var iconSeed = wordsOut; // Export icon to png const iconExport = jdenticon.toPng(iconSeed, iconSize); var iconPath = './src/icons/' + idHex + ' ' + wordsOut + ' ' + colorHex + ' [Icon Layer].png'; console.log('Using random words string as seed to generate icon at: ' + iconPath + '\n'); fs.writeFileSync(iconPath, iconExport); // Random generator Layer 3: Shape // Create new canvas object and set the context to 2d const shapeCanvas = new Canvas(350, 350); const shapeContext = shapeCanvas.getContext('2d'); // Start drawing path on canvas console.log('Using polygon settings to draw path points & paint shape...\n'); shapeContext.beginPath(); // Pick four arbitrarily generated points for the drawing path.
Feel free to play around with the formula until you get desirable results shapeContext.moveTo(shapeCtrX + shapeSize * (Math.floor(Math.random() * 100 * Math.cos(shapeSides))), shapeCtrY + shapeSize * (Math.floor(Math.random() * 10 * Math.sin(shapeSides * shapeSize))), shapeCtrX + shapeSize * (Math.floor(Math.random() * 1000 * Math.tan(shapeCtrY * shapeSides))), shapeCtrY + shapeSize * (Math.floor(Math.random() * (1 / Math.tan(shapeCtrX * shapeSides))))); // Connect the path points according to randomly picked number of sides for the polygon for (var i = 1; i <= shapeSides; i++) { shapeContext.lineTo(shapeCtrX + shapeSize * Math.cos(i * 2 * Math.PI / shapeSides), shapeCtrY + shapeSize * Math.sin(i * 2 * Math.PI / shapeSides)); } // Close drawing path to complete the drawn object then proceed with applying border width and color, as well as fill color shapeContext.closePath(); shapeContext.strokeStyle = shapeStroke; shapeContext.fillStyle = shapeFill; shapeContext.fill(); shapeContext.lineWidth = shapeSides; shapeContext.stroke(); // Record shape data URI to image buffer then render to preferred path const shapeBuffer = shapeCanvas.toBuffer("image/png"); var shapePath = './src/shapes/' + shapeSides + ' ' + shapeStroke + '.png'; console.log('Exporting polygon shape as image to: ' + shapePath + '\n'); fs.writeFileSync(shapePath, shapeBuffer); // Merge existing layers by combining them in image buffer as data URI then output to file var mergePath = './src/merged/' + idHex + ' ' + wordsOut + ' ' + colorHex + ' [Merged].png'; console.log('Merging all layers & exporting image to: ' + mergePath + '\n'); mergeImages([shapePath, iconPath, textPath], { Canvas: Canvas, Image: Image }).then(function(response) { ImageDataURI.outputFile(response, mergePath) }); // Create a JSON with the locations of each generated set of media metadata const updateLocalMetadata = async (idHex, coverPath, audioPath, wordsOut, colorHex, digiRoot, requestPayload, length) => {
console.log(`\nAttempting to create JSON with local metadata details...`); const filePath = path.join(__dirname, '../src/output/local-metadata.json'); try { const data = await fsPromises.readFile(filePath, 'utf8'); // If the file exists, parse its content, add the new object, and write it back to the file const json = data ? JSON.parse(data) : {}; json[idHex] = { name: `${idHex}: ${wordsOut}`, description: `A generative music NFT created with metadata seeds. Words: ${wordsOut}, Color: ${colorHex}, Digital Root: ${digiRoot}, Mood: ${requestPayload.mood}, Genre: ${requestPayload.genre}, Theme: ${requestPayload.theme}, Tempo: ${requestPayload.tempo}, Length: [${length}s]`, cover: coverPath, audio: audioPath }; await fsPromises.writeFile(filePath, JSON.stringify(json, null, 2), 'utf8'); console.log(`\nLocal metadata JSON created at ${filePath}`); } catch (err) { if (err.code === 'ENOENT') { // If the file doesn't exist, initialize it as an empty object await fsPromises.writeFile(filePath, JSON.stringify({ [idHex]: { name: `${idHex}: ${wordsOut}`, description: `A generative music NFT created with metadata seeds. Words: ${wordsOut}, Color: ${colorHex}, Digital Root: ${digiRoot}, Mood: ${requestPayload.mood}, Genre: ${requestPayload.genre}, Theme: ${requestPayload.theme}, Tempo: ${requestPayload.tempo}, Length: [${length}s]`, cover: coverPath, audio: audioPath } }, null, 2), 'utf8'); console.log(`\nLocal metadata JSON created at ${filePath}`); } else { throw err; } } }; }; // Don't forget to run the entire process! generator(); ``` ## Step 3: Pinning to Chainstack IPFS and minting First things first, create a new `pin.js` script inside your `scripts/` directory, if you don’t have one already. If you do, you will mostly be updating your existing script throughout this step; if you don’t, rest easy, as you will find detailed explanations of the process here too. So, go ahead and start by processing the necessary dependencies.
```jsx pin.js // Process dependencies require('dotenv').config(); const fs = require('fs'); const fsPromises = require('fs').promises; const path = require('path'); const axios = require('axios'); const FormData = require('form-data'); ``` Next, create an asynchronous function, `generateContent`, to read and parse the `local-metadata.json` you generated in the previous script to a JSON object that contains the metadata of each media file to be pinned. Now, iterate over the `json` object. For each property in `json`, create an `element` object to store its value, as well as a new `content` array to hold the media file details. Create a set of four new constants `coverTitle`, `audioTitle`, `tokenTitle`, and `tokenDescr` to set the titles for the image and audio files, as well as to retrieve the name and description of the NFT from the stored `element` values. Push the image and audio files to the `content` array as streamable files with their respective titles. Next, push each `content` array set, along with the `tokenTitle` and `tokenDescr`, to the `allContent` array. Then, return the `allContent` array, containing all the details of the media files waiting to be pinned.
```jsx pin.js // Define the media files to be pinned async function generateContent() { const data = await fsPromises.readFile(path.join(__dirname, '../src/output/local-metadata.json'), 'utf8'); const json = JSON.parse(data); let allContent = []; for (const key in json) { if (json.hasOwnProperty(key)) { const element = json[key]; const content = []; const coverTitle = path.basename(element.cover); const audioTitle = path.basename(element.audio); const tokenTitle = element.name; const tokenDescr = element.description; content.push({ file: fs.createReadStream(path.join(__dirname, '..', element.cover)), title: coverTitle }); content.push({ file: fs.createReadStream(path.join(__dirname, '..', element.audio)), title: audioTitle }); allContent.push({ content: content, tokenTitle: tokenTitle, tokenDescr: tokenDescr }); } } return allContent; } ``` Having established the base data in `allContent`, move on to create another new function—`addFiles`, which you will be using to pin files using the Chainstack IPFS Storage API. This function loops through each file in the `source` array, sending an HTTP POST request to the Chainstack IPFS pinning service. The `source` parameter is the array of media files to pin, and the `single` parameter determines the endpoint to which the request is sent. If `single` is `true`, the request is sent to the endpoint for pinning single files, otherwise, it's sent to the endpoint for pinning multiple files. The function uses a `while` loop to attempt the pinning process multiple times if it fails. On each attempt, it logs the attempt number, error messages, and ID of the successfully pinned file. ```jsx pin.js // Define a function to pin files with Chainstack IPFS Storage const addFiles = async (source, single = false) => { const url = single ? 
"https://api.chainstack.com/v1/ipfs/pins/pinfile" : "https://api.chainstack.com/v1/ipfs/pins/pinfiles"; const pubIDs = []; const maxRetries = 7; const retryTimeout = 22222; for (let file of source) { let retries = 0; while (retries < maxRetries) { try { console.log(`Attempting to pin ${file.title} with Chainstack IPFS Storage... Attempt number: ${retries + 1}\n`); const data = new FormData(); data.append('bucket_id', process.env.BUCKET_ID); data.append('folder_id', process.env.FOLDER_ID); data.append('file', file.file); data.append('title', file.title); ``` Create a configuration object for the HTTP request. It includes the request method, URL, headers, and data. The headers contain the content type, authorization, and other headers required by the `FormData` instance. Execute the HTTP request using `axios` and store the response in the `response` variable. If the pinning is successful, get the public ID of the pinned file from `response.data.id` or `response.data[0].id`, depending on whether a single or multiple files were pinned. Add the ID to the `pubIDs` array. If the pinning process fails, catch the error and retry the pinning process after waiting for the timeout period. If the process still fails after the maximum number of retries, throw an error. Return the `pubIDs` array, which contains the IDs of the successfully pinned files. ```jsx pin.js const config = { method: 'POST', url: url, headers: { "Content-Type": 'multipart/form-data;', "Authorization": process.env.CHAINSTACK, ...data.getHeaders() }, data: data }; const response = await axios(config); let id; if (single) { console.log(`Successfully pinned ${file.title} with Chainstack IPFS Storage using public ID: ${JSON.stringify(response.data.id)}\n`); id = response.data.id; id = Array.isArray(id) ? 
id : [id]; } else { console.log(`Successfully pinned ${file.title} with Chainstack IPFS Storage using public ID: ${JSON.stringify(response.data[0].id)}\n`); id = response.data[0].id; } pubIDs.push(id); // If successful, break the loop break; } catch (error) { console.error(`Error in addFiles: ${error.message}.. Attempting to retry...\n`); // Retry after the timeout if unsuccessful retries++; console.log(`Retrying after error. Current retry count is: ${retries}`); await new Promise((resolve) => setTimeout(resolve, retryTimeout)); // If max retries is reached and still failing, throw the error if (retries === maxRetries) { throw new Error(`Failed after ${maxRetries} attempts. ${error.message}`); } } } } return pubIDs; }; ``` Continue by defining an asynchronous `findCIDs` function to retrieve the CID for each pinned file, which accepts two parameters: `fileID` and `single` flag, indicating if the `fileID` provided is single or not. If the `single` flag is true, remove any double quotes from the `fileID` and make sure it is an array. If it's not, convert it into one. Next, define constants `maxRetries` and `retryTimeout` to control how many times the function should attempt to find the CID before giving up, and how long it should wait between each attempt. If the `single` flag is false, make the function create two empty arrays, `cid` and `name`, and then run a loop over the `fileID` array. For each `fileID`, recursively call the `findCIDs` function with the `single` flag set to true, pushing the results into the `cid` and `name` arrays respectively. On the other hand, if the `single` flag is true, initiate a loop controlled by the `retries` counter and `maxRetries` constant. In each iteration, send a `GET` request to the Chainstack IPFS API and log the response. If a valid CID and filename are found, break out of the loop and return them. 
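Stripped of the HTTP specifics, the retry-with-timeout pattern shared by `addFiles` and `findCIDs` boils down to a small loop. Here is a minimal sketch of that pattern only; `withRetries` and `task` are hypothetical names not used in the tutorial's code:

```javascript
// Sketch of the retry pattern used by addFiles and findCIDs:
// attempt the task, wait retryTimeout between failures, and give up
// with an error after maxRetries attempts.
const withRetries = async (task, maxRetries = 7, retryTimeout = 22222) => {
  let retries = 0;
  while (retries < maxRetries) {
    try {
      return await task(); // success: hand back the result immediately
    } catch (error) {
      retries++;
      if (retries === maxRetries) {
        throw new Error(`Failed after ${maxRetries} attempts. ${error.message}`);
      }
      // wait before the next attempt
      await new Promise((resolve) => setTimeout(resolve, retryTimeout));
    }
  }
};

// Example: a task that fails twice before succeeding on the third attempt
let attempts = 0;
withRetries(async () => {
  attempts++;
  if (attempts < 3) throw new Error('transient failure');
  return 'pinned';
}, 7, 10).then((result) => console.log(result)); // logs "pinned"
```

The key design choice, mirrored in the tutorial's functions, is that a success `break`s (here, `return`s) immediately, while only the final failed attempt surfaces the error to the caller.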
In case of any error during the HTTP request, log the error message, increment the `retries` counter, and wait for the `retryTimeout` duration before continuing the loop. Finally, return an array containing the found CIDs and filenames when the `single` flag is set to false, and a single pair of CID and filename when the `single` flag is set to true. ```jsx pin.js // Define a function to find CIDs for files pinned with Chainstack IPFS Storage const findCIDs = async (fileID, single = false) => { if (single) { fileID = fileID.replace(/"/g, ''); fileID = Array.isArray(fileID) ? fileID : [fileID]; } // Define the maximum retries and the timeout between retries const maxRetries = 7; const retryTimeout = 22222; if (!single) { let cid = []; let name = []; // Loop through all the pinned files for (var i = 0; i < fileID.length; i++) { // Get the CID and filename for the file const result = await findCIDs(fileID[i], true); cid.push(result[0]); name.push(result[1]); } // Print the CIDs found and return the cid and name values console.log(`All CIDs found: ${cid.join(', ')}\n`); return [cid, name]; } else { let cid; let name; let retries = 0; // Set up the retry loop while (retries < maxRetries) { try { console.log(`Attempting to find CID using public ID: ${fileID} with Chainstack IPFS Storage...\n`); // Define the Axios configuration const url = "https://api.chainstack.com/v1/ipfs/pins/" + fileID; var config = { method: 'GET', url: url, headers: { "Content-Type": 'text/plain', "Authorization": process.env.CHAINSTACK, "Accept-Encoding": 'identity', }, decompress: false }; // Store the Axios response const response = await axios(config); console.log(`CID found: ${response.data.cid} Filename: ${response.data.title}\n`); cid = response.data.cid; name = response.data.title; // Break out of the loop if the cid and name values are valid if (cid != null && cid !== 'error' && name != null && name !== 'error') { break; } else { // Throw an error if the CID and filename are not valid throw
new Error('CID or name values are not valid.'); } } catch (error) { console.error(`Error in findCIDs: ${error.message}.. Attempting to retry...\n`); // Retry after the timeout if unsuccessful retries++; await new Promise((resolve) => setTimeout(resolve, retryTimeout)); } } return [cid, name]; } }; ``` After the CIDs have been found, proceed to create an asynchronous `writeJSON` function, which will handle the creation of metadata for each NFT. As parameters, make sure it accepts the `pinCID`, `pinName`, `tokenTitle`, and `tokenDescr` you have defined earlier. As a first order of business in your `writeJSON` function, create two temporary variables—`audioIPFS` and `coverIPFS`. These will be used to store the full IPFS gateway URLs, so you can display the image and audio file in the metadata. Don’t forget to check if there is a valid `pinCID` and `pinName` to avoid erroneous data. Next, create a `for` loop with `pinName.length` being the determinant. Inside the loop, create an `if-else` statement, checking if a given `pinName` contains the `.mp3` file extension. Set the `audioIPFS` value to the gateway base URL, paired with the corresponding `pinCID` if it’s an MP3, or `coverIPFS` if it isn’t. Considering the audio files generated are MP3s, you can use this to differentiate between audio and image files in a simple manner. Then, write the properly formatted metadata you have collected using `fs.writeFileSync()`, and after checking everything was written correctly, return the temporary `jsonMeta` object.
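For reference, the object `writeJSON` assembles follows the widely used ERC-721 metadata convention, with `animation_url` carrying the audio. A sketch of its shape, where the CIDs, title, and description are placeholders rather than real values:

```javascript
// Sketch of the metadata object writeJSON writes to disk.
// All values below are placeholders for illustration only.
const tokenTitle = 'My Track';
const tokenDescr = 'A generative tune';
const coverIPFS = 'https://ipfsgw.com/ipfs/<COVER_CID>'; // cover image URL
const audioIPFS = 'https://ipfsgw.com/ipfs/<AUDIO_CID>'; // audio file URL

const metadata = {
  description: tokenDescr,
  external_url: 'https://chainstack.com/nfts/',
  image: coverIPFS,          // shown as the NFT image by marketplaces
  animation_url: audioIPFS,  // embedded as playable media, e.g. on OpenSea
  name: tokenTitle,
};

console.log(JSON.stringify(metadata, null, 2));
```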
```jsx pin.js // Define a function to write the metadata to a .json file const writeJSON = async (pinCID, pinName, tokenTitle, tokenDescr) => { let audioIPFS; let coverIPFS; if (pinCID && pinName) { for (var i = 0; i < pinName.length; i++) { if (pinName[i].includes('mp3')) { audioIPFS = "https://ipfsgw.com/ipfs/" + pinCID[i]; } else { coverIPFS = "https://ipfsgw.com/ipfs/" + pinCID[i]; } } // Write the metadata to the file ./src/NFTmetadata.json fs.writeFileSync(`./src/jsons/${tokenTitle.replace(/:/g, '')}.json`, JSON.stringify({ "description": tokenDescr, "external_url": "https://chainstack.com/nfts/", "image": coverIPFS, "animation_url": audioIPFS, "name": tokenTitle })); let jsonMeta; if (fs.existsSync(`./src/jsons/${tokenTitle.replace(/:/g, '')}.json`)) { jsonMeta = { file: fs.createReadStream(`./src/jsons/${tokenTitle.replace(/:/g, '')}.json`), title: `${tokenTitle.replace(/:/g, '')}.json` }; } return jsonMeta; } }; ``` Lastly, define the main asynchronous function `pinNFT` to run the entire process of pinning the NFT metadata. Within this function, wrap everything in a `try` block to handle any errors that might occur during execution. Call the `generateContent` function which should return an array containing all NFT data from a local metadata file and initialize an array `nftURLs` to store the URLs of the pinned metadata. Then, loop through each NFT in the `allNFTs` array. For each NFT, extract the `content`, `tokenTitle`, and `tokenDescr` fields. For each file in the `content`, call the `addFiles` function and wait for the retry time out. Afterwards, call the `findCIDs` function and add the returned CID and filename to the `pinCIDs` and `pinNames` arrays respectively. After all files for a particular NFT have been processed, call the `writeJSON` function passing `pinCIDs`, `pinNames`, `tokenTitle`, and `tokenDescr` as arguments. This function should return a JSON metadata file. 
Then, call the `addFiles` function again, this time for the JSON metadata file, and wait for the timeout before getting its CID using the `findCIDs` function. Finally, add the IPFS URL of the JSON metadata file to the `nftURLs` array. After looping through all of them, write the `nftURLs` array to a `metadataURLs.json` file in the `./src/output/` directory. Don't forget to call the `pinNFT` function to start the pinning process. ```jsx pin.js // Define the main function that executes all necessary functions to pin the NFT metadata const pinNFT = async () => { try { // Generate the content from local metadata file const allNFTs = await generateContent(); // Initialize array to store the pinned metadata urls let nftURLs = []; for (let nft of allNFTs) { const { content, tokenTitle, tokenDescr } = nft; let pinCIDs = []; let pinNames = []; // Ensure all files for this entry are pinned before moving on to the next for (let file of content) { const ids = await addFiles([file]); await new Promise((resolve) => setTimeout(resolve, 22222)); const [pinCID, pinName] = await findCIDs(ids); pinCIDs.push(pinCID[0]); pinNames.push(pinName[0]); await new Promise((resolve) => setTimeout(resolve, 22222)); } const jsonMeta = await writeJSON(pinCIDs, pinNames, tokenTitle, tokenDescr); await new Promise((resolve) => setTimeout(resolve, 22222)); const id = await addFiles([jsonMeta]); await new Promise((resolve) => setTimeout(resolve, 22222)); const jsonCID = await findCIDs(id); console.log(`NFT metadata for ${tokenTitle} successfully pinned with Chainstack IPFS Storage!\n`); // Add the metadata URL to the nftURLs array nftURLs.push(`https://ipfsgw.com/ipfs/${jsonCID[0]}`); } // Write the metadata URLs to JSON console.log(`Writing metadata URL to ./src/output/metadataURLs.json...\n`); fs.writeFileSync('./src/output/metadataURLs.json', JSON.stringify(nftURLs, null, 2)); } catch (error) { console.error(`Error during NFT pinning: ${error.message}`); } }; // Don't forget to run the main
function! pinNFT(); ``` Once the script finishes running, the metadata for all NFTs will have been successfully pinned to Chainstack IPFS Storage, and their URLs will be saved in `metadataURLs.json`. Here’s the entire script to recap: ```jsx pin.js // Process dependencies require('dotenv').config(); const fs = require('fs'); const fsPromises = require('fs').promises; const path = require('path'); const axios = require('axios'); const FormData = require('form-data'); // Define the media files to be pinned async function generateContent() { const data = await fsPromises.readFile(path.join(__dirname, '../src/output/local-metadata.json'), 'utf8'); const json = JSON.parse(data); let allContent = []; for (const key in json) { if (json.hasOwnProperty(key)) { const element = json[key]; const content = []; const coverTitle = path.basename(element.cover); const audioTitle = path.basename(element.audio); const tokenTitle = element.name; const tokenDescr = element.description; content.push({ file: fs.createReadStream(path.join(__dirname, '..', element.cover)), title: coverTitle }); content.push({ file: fs.createReadStream(path.join(__dirname, '..', element.audio)), title: audioTitle }); allContent.push({ content: content, tokenTitle: tokenTitle, tokenDescr: tokenDescr }); } } return allContent; } // Define a function to pin files with Chainstack IPFS Storage const addFiles = async (source, single = false) => { const url = single ? "https://api.chainstack.com/v1/ipfs/pins/pinfile" : "https://api.chainstack.com/v1/ipfs/pins/pinfiles"; const pubIDs = []; const maxRetries = 7; const retryTimeout = 22222; for (let file of source) { let retries = 0; while (retries < maxRetries) { try { console.log(`Attempting to pin ${file.title} with Chainstack IPFS Storage... 
Attempt number: ${retries + 1}\n`); const data = new FormData(); data.append('bucket_id', process.env.BUCKET_ID); data.append('folder_id', process.env.FOLDER_ID); data.append('file', file.file); data.append('title', file.title); const config = { method: 'POST', url: url, headers: { "Content-Type": 'multipart/form-data;', "Authorization": process.env.CHAINSTACK, ...data.getHeaders() }, data: data }; const response = await axios(config); let id; if (single) { console.log(`Successfully pinned ${file.title} with Chainstack IPFS Storage using public ID: ${JSON.stringify(response.data.id)}\n`); id = response.data.id; id = Array.isArray(id) ? id : [id]; } else { console.log(`Successfully pinned ${file.title} with Chainstack IPFS Storage using public ID: ${JSON.stringify(response.data[0].id)}\n`); id = response.data[0].id; } pubIDs.push(id); // If successful, break the loop break; } catch (error) { console.error(`Error in addFiles: ${error.message}.. Attempting to retry...\n`); // Retry after the timeout if unsuccessful retries++; console.log(`Retrying after error. Current retry count is: ${retries}`); await new Promise((resolve) => setTimeout(resolve, retryTimeout)); // If max retries is reached and still failing, throw the error if (retries === maxRetries) { throw new Error(`Failed after ${maxRetries} attempts. ${error.message}`); } } } } return pubIDs; }; // Define a function to find CIDs for files pinned with Chainstack IPFS Storage const findCIDs = async (fileID, single = false) => { if (single) { fileID = fileID.replace(/"/g, ''); fileID = Array.isArray(fileID) ? 
fileID : [fileID]; } // Define the maximum retries and the timeout between retries const maxRetries = 7; const retryTimeout = 22222; if (!single) { let cid = []; let name = []; // Loop through all the pinned files for (var i = 0; i < fileID.length; i++) { // Get the CID and filename for the file const result = await findCIDs(fileID[i], true); cid.push(result[0]); name.push(result[1]); } // Print the CIDs found and return the cid and name values console.log(`All CIDs found: ${cid.join(', ')}\n`); return [cid, name]; } else { let cid; let name; let retries = 0; // Set up the retry loop while (retries < maxRetries) { try { console.log(`Attempting to find CID using public ID: ${fileID} with Chainstack IPFS Storage...\n`); // Define the Axios configuration const url = "https://api.chainstack.com/v1/ipfs/pins/" + fileID; var config = { method: 'GET', url: url, headers: { "Content-Type": 'text/plain', "Authorization": process.env.CHAINSTACK, "Accept-Encoding": 'identity', }, decompress: false }; // Store the Axios response const response = await axios(config); console.log(`CID found: ${response.data.cid} Filename: ${response.data.title}\n`); cid = response.data.cid; name = response.data.title; // Break out of the loop if the cid and name values are valid if (cid != null && cid !== 'error' && name != null && name !== 'error') { break; } else { // Throw an error if the CID and filename are not valid throw new Error('CID or name values are not valid.'); } } catch (error) { console.error(`Error in findCIDs: ${error.message}..
Attempting to retry...\n`); // Retry after the timeout if unsuccessful retries++; await new Promise((resolve) => setTimeout(resolve, retryTimeout)); } } return [cid, name]; } }; // Define a function to write the metadata to a .json file const writeJSON = async (pinCID, pinName, tokenTitle, tokenDescr) => { let audioIPFS; let coverIPFS; if (pinCID && pinName) { for (var i = 0; i < pinName.length; i++) { if (pinName[i].includes('mp3')) { audioIPFS = "https://ipfsgw.com/ipfs/" + pinCID[i]; } else { coverIPFS = "https://ipfsgw.com/ipfs/" + pinCID[i]; } } // Write the metadata to the file ./src/NFTmetadata.json fs.writeFileSync(`./src/jsons/${tokenTitle.replace(/:/g, '')}.json`, JSON.stringify({ "description": tokenDescr, "external_url": "https://chainstack.com/nfts/", "image": coverIPFS, "animation_url": audioIPFS, "name": tokenTitle })); let jsonMeta; if (fs.existsSync(`./src/jsons/${tokenTitle.replace(/:/g, '')}.json`)) { jsonMeta = { file: fs.createReadStream(`./src/jsons/${tokenTitle.replace(/:/g, '')}.json`), title: `${tokenTitle.replace(/:/g, '')}.json` }; } return jsonMeta; } }; // Define the main function that executes all necessary functions to pin the NFT metadata const pinNFT = async () => { try { // Generate the content from local metadata file const allNFTs = await generateContent(); // Initialize array to store the pinned metadata urls let nftURLs = []; for (let nft of allNFTs) { const { content, tokenTitle, tokenDescr } = nft; let pinCIDs = []; let pinNames = []; // Ensure all files for this entry are pinned before moving on to the next for (let file of content) { const ids = await addFiles([file]); await new Promise((resolve) => setTimeout(resolve, 22222)); const [pinCID, pinName] = await findCIDs(ids); pinCIDs.push(pinCID[0]); pinNames.push(pinName[0]); await new Promise((resolve) => setTimeout(resolve, 22222)); } const jsonMeta = await writeJSON(pinCIDs, pinNames, tokenTitle, tokenDescr); await new Promise((resolve) => setTimeout(resolve, 22222)); 
const id = await addFiles([jsonMeta]); await new Promise((resolve) => setTimeout(resolve, 22222)); const jsonCID = await findCIDs(id); console.log(`NFT metadata for ${tokenTitle} successfully pinned with Chainstack IPFS Storage!\n`); // Add the metadata URL to the nftURLs array nftURLs.push(`https://ipfsgw.com/ipfs/${jsonCID[0]}`); } // Write the metadata URLs to JSON console.log(`Writing metadata URL to ./src/output/metadataURLs.json...\n`); fs.writeFileSync('./src/output/metadataURLs.json', JSON.stringify(nftURLs, null, 2)); } catch (error) { console.error(`Error during NFT pinning: ${error.message}`); } }; // Don't forget to run the main function! pinNFT(); ``` ### Preparing your mints Now it’s time to move forward with the final script—`mint.js`, so go ahead and create it if you don’t have it already. Start by processing your dependencies, in this case, `dotenv`, `hardhat-web3`, `fs`, and `path`, then initialize your wallet address and private key by loading them from your `.env` file. ```jsx mint.js // Process dependencies require('dotenv').config(); require("@nomiclabs/hardhat-web3"); const fs = require('fs'); const path = require('path'); // Initialize your wallet address and private key const address = process.env.WALLET; const privKey = process.env.PRIVATE_KEY; ``` Next, create a new global `contractAdrs` variable to set the smart contract address for the selected network. If you're using the `sepolia` network, set it to `SEPOLIA_CONTRACT`, and for the Ethereum Mainnet, set it to `MAINNET_CONTRACT`.
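This selection logic amounts to a lookup by network name. A compact sketch of the idea, assuming the same `SEPOLIA_CONTRACT` and `MAINNET_CONTRACT` environment variables; `selectContract` is a hypothetical helper name, not part of the tutorial's code:

```javascript
// Sketch: pick the deployed contract address by Hardhat network name,
// falling back to the mainnet address for any unlisted network.
const selectContract = (networkName, env = process.env) =>
  networkName === 'sepolia' ? env.SEPOLIA_CONTRACT : env.MAINNET_CONTRACT;

// Example with stand-in addresses instead of real deployments
const exampleEnv = { SEPOLIA_CONTRACT: '0xSepoliaAddress', MAINNET_CONTRACT: '0xMainnetAddress' };
console.log(selectContract('sepolia', exampleEnv)); // 0xSepoliaAddress
console.log(selectContract('mainnet', exampleEnv)); // 0xMainnetAddress
```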
```jsx mint.js // Initialize your deployed smart contract address for the selected network let contractAdrs; if (network.name == 'sepolia') { contractAdrs = process.env.SEPOLIA_CONTRACT; } else { contractAdrs = process.env.MAINNET_CONTRACT; } ``` Then, create a new global `contractName` constant to store the name of your NFT contract, so it is easier to look up the appropriate Hardhat artifact. Read the artifact file and parse it as a JSON object to load the `ABI` you will need further down the script. Remember to load the `metadataURLs.json` file as the value of a new global `metadataUrls` constant and use the `web3.eth.Contract` method to create a new contract object. Set the interactions origin to your Ethereum wallet address to be able to call the methods in your smart contract. ```jsx mint.js // Replace 'MyFirstMusicNFT' with your contract's name. const contractName = 'MyFirstMusicNFT'; // Find the compiled smart contract to get the ABI const artifactPath = path.resolve(__dirname, `../artifacts/contracts/${contractName}.sol/${contractName}.json`); const contractArtifact = JSON.parse(fs.readFileSync(artifactPath, 'utf-8')); const contractABI = contractArtifact.abi; // Load metadata URLs from file const metadataUrls = require('../src/output/metadataURLs.json'); // Create a new contract object and set interactions origin to the owner address const contractObj = new web3.eth.Contract(contractABI, contractAdrs, { from: address, }); ``` Afterwards, define an asynchronous `startMint` function where all the magic of minting happens.
Proceed by creating an empty `txUrls` array which will be used to store the URLs of all minting transactions, as well as a temporary `nonce` variable, whose value should be set to `await` the `getTransactionCount` method with your `address` as a parameter. This will prevent any errors caused by overlapping transactions. Following that, iterate over every URL in the `metadataUrls` array. For each of them, you're going to mint an NFT. Within the loop, estimate the gas required to mint each NFT first. Then, create a transaction for the minting process, which you then sign using your private key. Make sure you ask for your receipt too, once your transaction has been sent to the network successfully. Use the information from the receipt to generate a valid Etherscan URL for your transaction. Then, add the Etherscan URL to the `txUrls` array, and increment the nonce for the next transaction. Lastly, set a timeout after each NFT mint to allow for network propagation and write all transaction URLs to a `mintTXs.json` file in the `./src/output/` directory. Naturally, don’t forget to call the `startMint` function at the end too! 
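The nonce bookkeeping just described can be isolated into a small sketch: fetch the transaction count once, then increment a local counter for each signed transaction. `fakeGetTransactionCount` and `planNonces` are hypothetical stand-ins for `web3.eth.getTransactionCount` and the loop in `startMint`:

```javascript
// Sketch of manual nonce tracking: query the count once, then bump a
// local counter per transaction so consecutive mints never collide.
const fakeGetTransactionCount = async (address) => 5; // stand-in: 5 txs sent so far

const planNonces = async (address, txCount) => {
  let nonce = await fakeGetTransactionCount(address);
  const nonces = [];
  for (let i = 0; i < txCount; i++) {
    nonces.push(nonce); // nonce used for this transaction
    nonce++;            // incremented locally, no extra node round-trip
  }
  return nonces;
};

planNonces('0xYourWallet', 3).then((n) => console.log(n)); // [ 5, 6, 7 ]
```

Without this, firing several transactions in quick succession would reuse the same nonce reported by the node, and all but the first would be rejected.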
```jsx mint.js // Define the minting function const startMint = async () => { console.log(`\nAttempting to mint on ${network.name} to: ${address}...\n`); // Create an array to store all transaction URLs let txUrls = []; // Get the current transaction count, which will serve as the initial nonce let nonce = await web3.eth.getTransactionCount(address); // Iterate over each metadata URL to mint NFT for (const metadata of metadataUrls) { // Estimate the gas costs needed to process the transaction const gasCost = await contractObj.methods.safeMint(address, metadata).estimateGas((err, gas) => { if (!err) console.log(`Estimated gas: ${gas} for metadata: ${metadata}\n`); else console.error(`Error estimating gas: ${err} for metadata: ${metadata}\n`); }); // Define the transaction details and sign it const mintTX = await web3.eth.accounts.signTransaction( { from: address, to: contractAdrs, data: contractObj.methods.safeMint(address, metadata).encodeABI(), gas: gasCost, nonce: nonce, }, privKey, ); // Get transaction receipt const createReceipt = await web3.eth.sendSignedTransaction(mintTX.rawTransaction); // Provide appropriate network for Etherscan link let etherscanUrl; if (network.name !== 'mainnet') { etherscanUrl = `https://${network.name}.etherscan.io/tx/${createReceipt.transactionHash}`; console.log(`NFT successfully minted on ${network.name} with hash: ${createReceipt.transactionHash}\n\nView the transaction on Etherscan: ${etherscanUrl}\n`); } else { etherscanUrl = `https://etherscan.io/tx/${createReceipt.transactionHash}`; console.log(`NFT successfully minted on ${network.name} with hash: ${createReceipt.transactionHash}\n\nView the transaction on Etherscan: ${etherscanUrl}\n`); } // Push the transaction URL to the array txUrls.push(etherscanUrl); // Increment the nonce for the next transaction nonce++; // Wait before the next mint console.log(`Allowing time for network propagation...`); await new Promise((resolve) => setTimeout(resolve, 22222)); } // Write all the 
transaction URLs to the JSON file console.log(`Writing transaction URLs to ./src/output/mintTXs.json...\n`); fs.writeFileSync('./src/output/mintTXs.json', JSON.stringify(txUrls, null, 2)); }; // Don't forget to run the main function! startMint(); ``` And to put a definitive curtain call on our tutorial, let’s do one final recap with the full `mint.js` script. You can find all files involved in the making of this tutorial in the [full tutorial repo](https://github.com/chainstacklabs/generative-ai-music-nft-minter-tutorial-repo). ```jsx mint.js // Process dependencies require('dotenv').config(); require("@nomiclabs/hardhat-web3"); const fs = require('fs'); const path = require('path'); // Initialize your wallet address and private key const address = process.env.WALLET; const privKey = process.env.PRIVATE_KEY; // Initialize your deployed smart contract address for the selected network let contractAdrs; if (network.name == 'sepolia') { contractAdrs = process.env.SEPOLIA_CONTRACT; } else { contractAdrs = process.env.MAINNET_CONTRACT; } // Replace 'MyFirstMusicNFT' with your contract's name.
const contractName = 'MyFirstMusicNFT'; // Find the compiled smart contract to get the ABI const artifactPath = path.resolve(__dirname, `../artifacts/contracts/${contractName}.sol/${contractName}.json`); const contractArtifact = JSON.parse(fs.readFileSync(artifactPath, 'utf-8')); const contractABI = contractArtifact.abi; // Load metadata URLs from file const metadataUrls = require('../src/output/metadataURLs.json'); // Create a new contract object and set interactions origin to the owner address const contractObj = new web3.eth.Contract(contractABI, contractAdrs, { from: address, }); // Define the minting function const startMint = async () => { console.log(`\nAttempting to mint on ${network.name} to: ${address}...\n`); // Create an array to store all transaction URLs let txUrls = []; // Get the current transaction count, which will serve as the initial nonce let nonce = await web3.eth.getTransactionCount(address); // Iterate over each metadata URL to mint NFT for (const metadata of metadataUrls) { // Estimate the gas costs needed to process the transaction const gasCost = await contractObj.methods.safeMint(address, metadata).estimateGas((err, gas) => { if (!err) console.log(`Estimated gas: ${gas} for metadata: ${metadata}\n`); else console.error(`Error estimating gas: ${err} for metadata: ${metadata}\n`); }); // Define the transaction details and sign it const mintTX = await web3.eth.accounts.signTransaction( { from: address, to: contractAdrs, data: contractObj.methods.safeMint(address, metadata).encodeABI(), gas: gasCost, nonce: nonce, }, privKey, ); // Get transaction receipt const createReceipt = await web3.eth.sendSignedTransaction(mintTX.rawTransaction); // Provide appropriate network for Etherscan link let etherscanUrl; if (network.name !== 'mainnet') { etherscanUrl = `https://${network.name}.etherscan.io/tx/${createReceipt.transactionHash}`; console.log(`NFT successfully minted on ${network.name} with hash: ${createReceipt.transactionHash}\n\nView the 
transaction on Etherscan: ${etherscanUrl}\n`); } else { etherscanUrl = `https://etherscan.io/tx/${createReceipt.transactionHash}`; console.log(`NFT successfully minted on ${network.name} with hash: ${createReceipt.transactionHash}\n\nView the transaction on Etherscan: ${etherscanUrl}\n`); } // Push the transaction URL to the array txUrls.push(etherscanUrl); // Increment the nonce for the next transaction nonce++; // Wait before the next mint console.log(`Allowing time for network propagation...`); await new Promise((resolve) => setTimeout(resolve, 22222)); } // Write all the transaction URLs to the JSON file console.log(`Writing transaction URLs to ./src/output/mintTXs.json...\n`); fs.writeFileSync('./src/output/mintTXs.json', JSON.stringify(txUrls, null, 2)); }; // Don't forget to run the main function! startMint(); ``` ## Bringing it all together Congratulations on successfully traversing through the captivating world of minting generative music NFTs! You have now carved out a thorough process for creating, deploying, and managing your personal cache of digital music tokens. By harnessing the potency of blockchain technology, individuals like you—whether creators or collectors—can make the most of this novel channel for creative display, secure ownership, and economic gain. As you step forward on your journey with music NFTs, bear in mind that the scope for innovation is limitless. Fiddle with a variety of parameters and smart contract features to craft NFTs that not only encapsulate your distinct artistic flair but also provide valuable content for your patrons. Whether you are an established music maker or an up-and-coming artist, music NFTs provide the chance to unlock fresh opportunities and revamp the way you interact with and value the auditory arts. So don't hold back, embrace the adventure, and kickstart the process of minting your exclusive music NFTs. 
Broadcast your creations far and wide, and experience the revolutionizing impact of this frontier technology on the dynamic world of music. Ready to see the generation in action? You can find a collection of generative music NFTs minted during this tutorial on Sepolia OpenSea. Onwards to a symphony of success, happy minting! ### About the author Senior Copywriter @ Chainstack Writes on Ethereum, NFTs, and underlying technology I BUIDL tutorials insightful so your dev experience can be delightful. [](https://github.com/petarsrepo) [](https://twitter.com/petarcopyrock) [](https://www.linkedin.com/in/pstoykov/) # How to mint a music NFT: Dropping fire tunes with Chainstack IPFS storage Source: https://docs.chainstack.com/docs/how-to-mint-music-nft-with-chainstack-ipfs-storage * Creates a Music NFT using a custom ERC721 contract (via OpenZeppelin Wizard) with metadata pointing to audio and cover files on IPFS * Deploys contract with Hardhat to testnet; verifies it on Etherscan * Pins the audio and cover image plus NFT metadata using Chainstack IPFS Storage * Mints the NFT from your wallet, allowing you to see your music NFT on MetaMask/OpenSea with embedded audio ## Main article In the digital age, artists are constantly seeking innovative ways to share, monetize, and protect their work. Enter the world of non-fungible tokens (NFTs)—a revolutionary way for musicians to create, sell, and own their masterpieces. As the music industry evolves, NFTs are becoming the driving force behind a paradigm shift in how artists generate income and maintain control over their creations. The music industry has always been a hotbed of innovation and change, from vinyl records to streaming services. Now, the advent of blockchain technology and the rise of NFTs are opening up an entirely new frontier for musicians and music enthusiasts alike.
With the potential to revolutionize how we create, own, and trade music, NFTs are rapidly becoming an essential tool for artists looking to embrace the digital era. But how does one go about minting a music NFT? If you've been wondering about this very question, you've come to the right place. In this comprehensive guide, we'll walk you through the process step by step, demystifying the world of music NFTs and empowering you to join the revolution. So, whether you're an aspiring musician, a dedicated collector, or simply curious about this exciting new development, read on and get ready to make some noise in the NFT space. ## How to mint a music NFT? Minting NFTs through coding can be slightly more challenging than relying on a marketplace platform to handle the process for you. However, it provides you with greater flexibility and allows you to engage with the core mechanics of minting. Let’s explore how you can get started: ## Step 1: Get started with the basics ### 1.1: Obtain a node endpoint Running your own node can be a time-consuming process. Instead, you can quickly and effortlessly deploy a node with Chainstack, saving both time and effort. To get started, visit the Chainstack website, [create a free account](/docs/manage-your-account), deploy a Sepolia node, and obtain its HTTPS endpoint. Don’t worry about your testnet choice: the code base in this tutorial is structured so you can switch between Sepolia and Mainnet freely, without having to rewrite everything. If you're unsure about the process, follow the steps in our easy-to-understand [quickstart guide](/docs/manage-your-networks#join-a-public-network). ### 1.2: Install core dependencies If you haven't installed [node.js](https://nodejs.org/en/download) yet, go ahead and do so. Once that's done, you'll need to set up a workspace for your codebase.
Create a new directory in your preferred location and initialize your node.js project by entering the following commands in your CLI (command line interface): ```shell CLI npm init -y ``` The `-y` flag indicates that all the default values should be used without prompting the user for input. And while you’re still on the topic of initializing your project, go ahead and do so for git, in case you haven’t done that already. To make this happen, [download and install git](https://git-scm.com/downloads), if it’s not present on your system yet, then execute the following in a freshly run CLI instance: ```shell CLI git init ``` Next, install the Hardhat library with the `web3` and `hardhat-verify` plugins, which will provide you with the functionality required to interact with your node, as well as verify your deployed contract: ```shell CLI npm install --save-dev @nomiclabs/hardhat-web3 'web3' @nomicfoundation/hardhat-verify ``` After the installation is complete, initialize your Hardhat project for JavaScript with: ```shell CLI npx hardhat ``` Once the initialization is complete, you will find a `hardhat.config.js` file in your project `root`. Open it and replace the contents with the following: ```javascript hardhat.config.js // Process dependencies require("@nomiclabs/hardhat-web3"); require("@nomicfoundation/hardhat-verify"); ``` ### 1.3: Securely store your secrets To securely store all values, which are best left away from prying eyes, such as your `endpoint` you'll utilize a `.env` file. But before you set off to create such a file, go ahead and install the `dotenv` package using `npm` via CLI: ```shell CLI npm i dotenv ``` With the package installed, create a `.env` file in your project root and transfer over your endpoint URL there as the first key-value pair. 
If you want to use both Ethereum networks (Sepolia and Mainnet), you can set it up like so:

```bash .env
SEPOLIA="YOUR_SEPOLIA_ENDPOINT"
MAINNET="YOUR_MAINNET_ENDPOINT"
```

This makes it possible for you to load the `SEPOLIA` or `MAINNET` value from your `.env` file in any script in your project that has the `dotenv` dependency processed appropriately. So, go ahead and break the ice by adding `require('dotenv').config();` in your `hardhat.config.js` script as early as possible. Referencing `dotenv` before any and all other dependencies prevents possible conflicts, which may arise due to an inappropriate loading order.

Proceed by initializing each network with its corresponding endpoint in `hardhat.config.js` as the values for the `url` keys, using the dotenv `process.env` method. You can also set a `PRIVATE_KEY` value for the `accounts` keys already, considering you will be getting a new one in the next section. Here's how your `hardhat.config.js` should look once you've set everything up correctly:

```javascript hardhat.config.js
// Process dependencies
require("dotenv").config();
require("@nomiclabs/hardhat-web3");
require("@nomicfoundation/hardhat-verify");

// Define Hardhat settings
module.exports = {
  solidity: "0.8.17",
  networks: {
    mainnet: {
      url: process.env.MAINNET,
      accounts: [process.env.PRIVATE_KEY],
    },
    sepolia: {
      url: process.env.SEPOLIA,
      accounts: [process.env.PRIVATE_KEY],
    },
  },
};
```

Before you continue, however, it is crucial to remember that you should NEVER upload your `.env` file to any public repository, as it will serve as a container for all your sensitive information moving forward. Apart from your endpoint URL, you will also store things like your private key inside the `.env` file, which could expose your entire wallet if leaked.
Having this in mind, go ahead and create a `.gitignore` file if you don't have one already, so you can add `.env` to it before you publish. Refer to the following example for how you can do that:

```bash .gitignore
# dotenv environment variables file
.env
.env.test
```

### 1.4: Create a new wallet and fund it

Once the initialization is complete, create a new `/scripts` directory and add a new `wallet.js` file in it. When you've created the new script, include the following lines of code to initialize the necessary dependencies:

```javascript scripts/wallet.js
// Process dependencies
require('dotenv').config()
require("@nomiclabs/hardhat-web3");
```

Next, you will need to set up a function to create your wallet. To do this, you can use the web3.js method `web3.eth.accounts.create()`; just make sure you return the `address` and `privateKey` values at the end of your `async` function:

```javascript scripts/wallet.js
// Create a new wallet then return the address and private key
const createWallet = async () => {
  const wallet = web3.eth.accounts.create();
  return [wallet.address, wallet.privateKey];
}
```

Once entered, you will also need to create another `async` function that will fund your wallet with testnet ETH from the [Chainstack faucet](https://faucet.chainstack.com). To do this you will first need to take care of a few things:

1. [Create a Chainstack API key](/reference/platform-api-getting-started#create-api-key) and copy it, which will be similar to `Bearer y0urChainstackAPIkeyHer3`, then store it in your `.env` file as `CHAINSTACK`:

```shell .env
CHAINSTACK="Bearer YOUR_CHAINSTACK_API_KEY"
```

2. Install the axios library to be able to send HTTP requests to the faucet:

```shell CLI
npm i axios
```

3.
Require `axios` at the end of your `wallet.js` dependencies like so:

```javascript scripts/wallet.js
// Process dependencies
require('dotenv').config()
require("@nomiclabs/hardhat-web3");
const axios = require('axios');
```

With that taken care of, you can move forward with your `fundWallet` function, by adding `address` and `apiKey` as its required parameters. Next, create a new `const` called `apiUrl`, add the faucet API URL, and place `${network.name}` at the end of it to be able to select the right testnet version:

```javascript scripts/wallet.js
// Fund the wallet using the Chainstack faucet
const fundWallet = async (address, apiKey) => {
  const apiUrl = `https://api.chainstack.com/v1/faucet/${network.name}`;
}
```

Using the Hardhat `network.name` variable allows you to switch between networks with the `--network` parameter whenever you are running a particular script with Hardhat.

Hopping back to your `fundWallet` function, it is time for you to close it off with a `try-catch` block that will contain your `axios` request to the Chainstack faucet. Inside the `try` part of the block, add a new `const` called `response`, where you will set the `axios` settings and return the response:

```javascript scripts/wallet.js
// Fund the wallet using the Chainstack faucet
const fundWallet = async (address, apiKey) => {
  const apiUrl = `https://api.chainstack.com/v1/faucet/${network.name}`;
  try {
    const response = await axios.post(apiUrl, { address }, {
      headers: {
        'Authorization': apiKey,
        'Content-Type': 'application/json',
      },
    });
    return response.data;
  }
}
```

This creates a `POST` request to the Chainstack faucet `apiUrl` with your `address`, your `apiKey` as the `Authorization` header, and `application/json` as the `Content-Type`. With the `try` part of the block taken care of, you can wrap up the entire thing by catching and then throwing an `error`, should it occur.
Here's how your `wallet.js` script should look at this point:

```javascript scripts/wallet.js
// Process dependencies
require('dotenv').config()
require("@nomiclabs/hardhat-web3");
const axios = require('axios');

// Create a new wallet then return the address and private key
const createWallet = async () => {
  const wallet = web3.eth.accounts.create();
  return [wallet.address, wallet.privateKey];
}

// Fund the wallet using the Chainstack faucet
const fundWallet = async (address, apiKey) => {
  const apiUrl = `https://api.chainstack.com/v1/faucet/${network.name}`;
  try {
    const response = await axios.post(apiUrl, { address }, {
      headers: {
        'Authorization': apiKey,
        'Content-Type': 'application/json',
      },
    });
    return response.data;
  } catch (error) {
    throw error;
  }
};
```

But hey, hang on a minute! There is still something missing from your `wallet.js` script: some of the variables are not set yet, and neither are the calls that will run the two functions you have defined. So, go ahead and create them, but do so in a new `async` function called `main`. And while you're at it, why not add some visual feedback to make it easy for you to process the results?
Since there is little new to learn with this function, let's straight up recap with the full `wallet.js` script until now:

```javascript scripts/wallet.js
// Process dependencies
require('dotenv').config()
require("@nomiclabs/hardhat-web3");
const axios = require('axios');

// Create a new wallet then return the address and private key
const createWallet = async () => {
  const wallet = web3.eth.accounts.create();
  return [wallet.address, wallet.privateKey];
}

// Fund the wallet using the Chainstack faucet
const fundWallet = async (address, apiKey) => {
  const apiUrl = `https://api.chainstack.com/v1/faucet/${network.name}`;
  try {
    const response = await axios.post(apiUrl, { address }, {
      headers: {
        'Authorization': apiKey,
        'Content-Type': 'application/json',
      },
    });
    return response.data;
  } catch (error) {
    throw error;
  }
};

// Main function to generate a new wallet and fund it using the Chainstack faucet
const main = async () => {
  try {
    // Config for the Faucet API call
    const apiKey = process.env.CHAINSTACK;
    console.log('\nAttempting to generate new wallet...\n')
    const [address, privateKey] = await createWallet();
    console.log(`Created new wallet with address: ${address}\n`);
    console.log(`New private key: ${privateKey} === KEEP IT SAFE ===\n`);
    console.log(`Copy the following and replace the "WALLET" and "PRIVATE_KEY" lines in your ".env" file:\n\nWALLET="${address}" \nPRIVATE_KEY="${privateKey}"\n`);
    console.log(`Sending ${network.name} faucet request for address ${address}...\n`);
    const fundResponse = await fundWallet(address, apiKey);
    console.log(`Successfully funded ${address} on ${network.name} for ${fundResponse.amountSent}ETH.\n\nView transaction on Etherscan: ${fundResponse.transaction}\n`);
  } catch (error) {
    console.error('An error occurred:', error.response.data);
  }
}

// Don't forget to run the main function!
main();
```

With the `wallet.js` script fully set up, the time has come for you to launch it via CLI using Hardhat:

```shell CLI
# Sepolia
npx hardhat run scripts/wallet.js --network sepolia
```

You will get the following response if you have the additional visual feedback applied:

```shell CLI
Attempting to generate new wallet...

Created new wallet with address: 0x13a310e3FfAa420D317F4d51C85225FDEB8e6eAd

New private key: 0x70472ba98ccf4ab019cbd3f1124dd3cc95a46cd60f8465ba67e3163914287576 === KEEP IT SAFE ===

Copy the following and replace the "WALLET" and "PRIVATE_KEY" lines in your ".env" file:

WALLET="0x13a310e3FfAa420D317F4d51C85225FDEB8e6eAd"
PRIVATE_KEY="0x70472ba98ccf4ab019cbd3f1124dd3cc95a46cd60f8465ba67e3163914287576"

Sending sepolia faucet request for address 0x13a310e3FfAa420D317F4d51C85225FDEB8e6eAd...

Successfully funded 0x13a310e3FfAa420D317F4d51C85225FDEB8e6eAd on sepolia for 0.5ETH.

View transaction on Etherscan: https://sepolia.etherscan.io/tx/0x2827d3886ca45b63f2692caf3189fef67128825eaebc5c9ab0f6e7184deaf73a
```

Congrats, you have now created a wallet and funded it successfully with the Chainstack faucet! Your next step is to enter the two keys into your `.env` file as values for the `WALLET` and `PRIVATE_KEY` keys. You can also use a pre-existing wallet address with the appropriate private key attached to it, by setting them in your `.env` file instead.

### 1.5: Verify your wallet balance

Now it's time to verify if your balance has been updated. Create a new file named `balance.js` within the `/scripts` directory and copy over the first two dependencies from your `wallet.js` script, without adding the one for `axios`.
Then, create a new `address` constant and set its value to the `WALLET` you just copied in your `.env` file, using `process.env.WALLET`:

```javascript scripts/balance.js
// Process dependencies
require('dotenv').config();
require("@nomiclabs/hardhat-web3");

// Initialize your wallet address
const address = process.env.WALLET;
```

With that taken care of, go ahead and create an asynchronous function that will use the `address` constant as a parameter and within it call the web3.js `getBalance` method. Since this request takes longer for your node to process than the rest of the function's code takes to execute locally, make sure you add `await` before the method:

```javascript scripts/balance.js
// Define your get balance function
const getbal = async (address) => {
  // Call the web3.js getBalance method
  const balance = await web3.eth.getBalance(address);
};
```

This prevents the rest of the code from running before the method's promise is resolved with a response. By default, the `getBalance` method returns a barebones balance value in wei, so go ahead and add some extra visual feedback by calling the web3.js `fromWei` method to convert the wei output to ETH units.
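For context on what `fromWei` does under the hood: 1 ETH equals 10^18 wei, so the conversion is just an exact division by 10^18. Here is a minimal, purely illustrative sketch of that arithmetic using BigInt; your script should keep using `web3.utils.fromWei` itself:

```javascript
// Illustrative wei-to-ETH conversion: 1 ETH = 10^18 wei.
// BigInt keeps the division exact, avoiding float rounding.
const WEI_PER_ETH = 10n ** 18n;

function weiToEth(wei) {
  const whole = wei / WEI_PER_ETH; // integer ETH part
  // Fractional part, padded to 18 digits, trailing zeros stripped
  const frac = (wei % WEI_PER_ETH).toString().padStart(18, "0").replace(/0+$/, "");
  return frac ? `${whole}.${frac}` : `${whole}`;
}

// The 0.5 ETH faucet amount from the previous step, expressed in wei
console.log(weiToEth(500000000000000000n)); // → 0.5
```
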
Once ready, go ahead and wrap things up by calling your `getbal` function with the `address` parameter at the end of your script:

```javascript scripts/balance.js
// Process dependencies
require('dotenv').config();
require("@nomiclabs/hardhat-web3");

// Initialize your wallet address
const address = process.env.WALLET;

// Define your get balance function
const getbal = async (address) => {
  // Call the web3.js getBalance method
  const balance = await web3.eth.getBalance(address);

  // Return your wallet balance in Wei and ETH on the selected network
  console.log(`\nChecking ${network.name} balance for address: ${address}...\n\nYour balance is: ${balance}Wei\nThis amounts to: ${web3.utils.fromWei(balance)}ETH\n`);
};

// Don't forget to run your get balance function!
getbal(address);
```

Once you're ready, go ahead and run the script via Hardhat in CLI using the `--network` parameter:

```shell CLI
npx hardhat run scripts/balance.js --network $NETWORK
```

```shell CLI
Checking sepolia balance for address: 0x13a310e3FfAa420D317F4d51C85225FDEB8e6eAd...

Your balance is: 500000000000000000Wei
This amounts to: 0.5ETH
```

## Step 2: Prepare and deploy the smart contract

### 2.1: Draft an NFT smart contract

Now that you've completed all the necessary preparations, it's time to create your smart contract. While this might seem intimidating at first, you can rest easy knowing that [OpenZeppelin](https://www.openzeppelin.com/) provides pre-built, security-audited contract templates. The cherry on top? You can [use their wizard](https://wizard.openzeppelin.com/#erc721) to create a customized contract that perfectly fits your needs.

For our sample project, we will be using the following settings in the [OpenZeppelin Contract Wizard](https://wizard.openzeppelin.com/#erc721). Aside from the name and symbol, which are quite self-explanatory, we'll include a `Mintable` option with `Auto Increment Ids` enabled.
This feature allows privileged accounts (e.g., your account) to mint new tokens, which can represent new additions to your collection. We also need to enable the `URI Storage` option, as it allows us to attach media files like images to our NFTs. Additionally, we'll incorporate the `Ownable` option to enable administrative actions. While there are more parameters available in the wizard, they fall outside the scope of this tutorial. However, don't hesitate to experiment with them if you wish to explore additional functionalities beyond the basic ones we've implemented so far.

Finally, ensure that the OpenZeppelin dependencies are available in your project by installing its `contracts` library with the following:

```shell CLI
npm i @openzeppelin/contracts
```

### 2.2: Compile the minter smart contract

With your contract ready, copy the code from the OpenZeppelin wizard into a new file, such as `MyFirstMusicNFT.sol`, or click **Download** in the top-right corner. Then, create a new directory called `contracts` in your project `root` and place your smart contract in it. Hardhat automatically compiles a new contract when it needs it, without additional input from your end, but you can also do that manually using `npx hardhat compile`. This will compile all contracts located in the `contracts` directory, so if you have more than one, they will all be processed accordingly.

Before you move forward with deploying your compiled contract, however, there are some things to take care of first:

1. Create an [Etherscan account](https://etherscan.io/register) and [API key](https://etherscan.io/myapikey), so you can verify the contract once it is deployed.

2. Save the API key in your `.env` file as the value for the `ETHERSCAN` key:

```shell .env
SEPOLIA="YOUR_SEPOLIA_ENDPOINT"
MAINNET="YOUR_MAINNET_ENDPOINT"
CHAINSTACK="Bearer YOUR_CHAINSTACK_API_KEY"
WALLET="YOUR_WALLET_ADDRESS"
PRIVATE_KEY="YOUR_WALLET_PRIVATE_KEY"
ETHERSCAN="YOUR_ETHERSCAN_API_KEY"
```

3.
Create a `deploy.js` script in your `/scripts` directory and add the following:

```javascript scripts/deploy.js
// Process dependencies
require('dotenv').config();
require("@nomiclabs/hardhat-web3");
require("@nomicfoundation/hardhat-verify");
const fs = require('fs');
const path = require('path');

const address = process.env.WALLET;
const privKey = process.env.PRIVATE_KEY;
```

Apart from referencing the hardhat-verify library you installed earlier, there are two packages you haven't encountered in this tutorial yet: `fs` and `path`, both of which ship by default with node.js. The former, `fs`, is a file system module that allows you to interact with local files, while the latter is used for handling and transforming file paths.

As the next step, create a few constants called `contractName`, `artifactPath`, `contractArtifact`, `contractABI`, and `contractBIN`. The first one will be used by Hardhat to determine which contract you will be interacting with, while the second and third will locate and read the compiled contract artifact. This artifact contains your smart contract's `ABI` and its `bytecode`, or `BIN`.

### ABI and BIN

The application binary interface (ABI) facilitates interaction between software modules, translating Solidity contract calls for Ethereum's Virtual Machine (EVM) and decoding transaction data. On the other hand, bytecode (BIN) is the binary output of Solidity code and consists of machine-readable instructions, including one-byte "opcodes", hence the name.

The artifact, containing your smart contract's `ABI` and `BIN`, is automatically generated when you compile the contract with Hardhat as `/artifacts/contracts/YourContractName.sol/YourContractName.json`, where `YourContractName` is the name of your contract. In this tutorial's case, it is `MyFirstMusicNFT`, so go ahead and set the `contractName` constant's value to your smart contract's actual name.
To help your code discover the location of the artifact, you can use the `path.resolve` method with `__dirname` as the first parameter, and the JSON path mentioned in the paragraph above, preceded by `..` to indicate the parent directory.

### About `__dirname`

In node.js, `__dirname` is a built-in variable that gives the absolute path to the directory of the currently running file. Unlike `./`, which denotes the current directory of a file, or `../`, which refers to its parent directory, `__dirname` always points to the precise directory where the executing file resides.

Then, go ahead and read the JSON file with the `fs.readFileSync` method, passing `artifactPath` as the first parameter and making sure you have set `utf-8` as the encoding type in the second. Lastly, parse the result into a JSON object by wrapping it with `JSON.parse`. Here's how this part of your script should look:

```javascript scripts/deploy.js
// Process dependencies
require('dotenv').config();
require("@nomiclabs/hardhat-web3");
require("@nomicfoundation/hardhat-verify");
const fs = require('fs');
const path = require('path');

const address = process.env.WALLET;
const privKey = process.env.PRIVATE_KEY;

// Replace 'MyFirstMusicNFT' with your contract's name.
const contractName = 'MyFirstMusicNFT';

// Find the compiled smart contract to get the ABI and bytecode
const artifactPath = path.resolve(__dirname, `../artifacts/contracts/${contractName}.sol/${contractName}.json`);
const contractArtifact = JSON.parse(fs.readFileSync(artifactPath, 'utf-8'));
const contractABI = contractArtifact.abi;
const contractBIN = contractArtifact.bytecode;
```

### 2.3: Deploy your smart contract

Next, it's time for you to put together the rest of the `deploy.js` script by adding a new function to handle the actual deployment process.
To do that, define a fresh asynchronous `main` function first and create a new contract object constant `contractNFT` by calling the web3.js `Contract` constructor with `contractABI` and `address` as parameters, respectively. Then, deploy the `contractNFT` contract object as a constant `contractTX` transaction object by applying the `deploy` method to it with a `data` parameter set to `contractBIN`.

```javascript scripts/deploy.js
// Create asynchronous function to deploy your contract
async function main() {
  console.log(`\nAttempting to deploy the ${contractName} contract on ${network.name} from: ${address}\n`);

  // Create new contract object
  const contractNFT = new web3.eth.Contract(contractABI, address);

  // Deploy contract object as a transaction
  const contractTX = await contractNFT.deploy({
    // Set transaction data as the contract bytecode
    data: contractBIN,
  });
}
```

Once you've set this up, proceed by estimating the gas of the contract deployment transaction `contractTX`. To do that, just slap the web3.js `estimateGas` method at the back of the `contractTX` object, and add some relevant visual feedback. Then, continue by calling the web3.js `signTransaction` method with two parameters, the first being an object, and the second `privKey`. Compose the object parameter with the `from` key set to `address`, `data` set to `contractTX` with the `encodeABI` method applied to it, and `gas` set to `gasCost`.
```javascript scripts/deploy.js
  // Estimate the gas costs needed to process the transaction
  const gasCost = await contractTX.estimateGas((err, gas) => {
    if (!err) console.log(`Estimated gas: ${gas}...`);
    else console.error(`Error estimating gas: ${err}...`);
  });

  // Sign the transaction
  const createTransaction = await web3.eth.accounts.signTransaction({
    // Define transaction parameters
    from: address,
    data: contractTX.encodeABI(),
    gas: gasCost,
  }, privKey
  );
```

With this taken care of, your next step is to request a transaction receipt: create a `createReceipt` constant that awaits the web3.js `sendSignedTransaction` method, passing the `rawTransaction` property of the `createTransaction` object as its single parameter. Then add some visual feedback to display the receipt as the closing statement of the `main` function. Don't forget to run it either!

```javascript scripts/deploy.js
  // Return transaction receipt
  const createReceipt = await web3.eth.sendSignedTransaction(
    createTransaction.rawTransaction
  );

  // Log contract address from receipt
  console.log(`\nContract successfully deployed on ${network.name} at: ${createReceipt.contractAddress} \n\nCopy the following line to your ".env" file:\n\n${network.name.toUpperCase()}_CONTRACT="${createReceipt.contractAddress}"\n`);
```

Now, it's time to verify your contract with Hardhat after it has been deployed. But considering it takes roughly 5 blocks of confirmations before you are able to do that, let's set up a function to wait for the appropriate time before launching the verification.
The function itself just checks the current block using the web3.js `getBlockNumber` method at a given interval, so just copy over the following outside the `main` function:

```javascript scripts/deploy.js
// Wait for `n` blocks function
async function waitForBlocks(n) {
  // Get the latest block number
  let latestBlockNumber = await web3.eth.getBlockNumber();
  console.log(`Current block number: ${latestBlockNumber}...`);

  // Calculate the block number to wait for
  let targetBlockNumber = latestBlockNumber + n;
  console.log(`Waiting until block number: ${targetBlockNumber}...`);

  // Check for the right block at a given interval
  return new Promise((resolve) => {
    let interval = setInterval(async () => {
      latestBlockNumber = await web3.eth.getBlockNumber();
      console.log(`Checked latest block number: ${latestBlockNumber}...`);

      // Check if the current block number matches the one to wait for
      if (latestBlockNumber >= targetBlockNumber) {
        clearInterval(interval);
        console.log(`Target block reached: ${latestBlockNumber}...\n`);
        resolve();
      }
    }, 5000); // Set polling interval as per your need.
  });
}
```

Last, you need to set up the verification function itself, so create a new asynchronous `verifyContract` function and inside it await the `waitForBlocks` and `run("verify:verify")` calls. Make sure to also add the `address` and `constructorArguments` parameters, with `createReceipt.contractAddress` as the value for the former and a blank array `[]` for the latter:

```javascript scripts/deploy.js
  // Verify the contract
  async function verifyContract() {
    console.log("Verifying contract in 5 blocks...\n");

    // Wait for 5 blocks before running verification
    await waitForBlocks(5);
    await run("verify:verify", {
      address: createReceipt.contractAddress,
      constructorArguments: [],
    });
    console.log("\nContract deployed and verified!\n");
  }
  verifyContract();
}
```

That's it!
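If you want to convince yourself the polling logic behaves as expected without touching a live node, here is a simplified, dependency-free variant of the same pattern. The block-number getter and interval are lifted into parameters purely for illustration, and the stub "chain" below is hypothetical:

```javascript
// Simplified variant of the waitForBlocks pattern: the block-number
// getter and polling interval are parameters, so the logic can be
// dry-run against a stub instead of web3.eth.getBlockNumber.
async function waitForBlocksWith(getBlockNumber, n, intervalMs) {
  let latest = await getBlockNumber();
  const target = latest + n;
  return new Promise((resolve) => {
    const interval = setInterval(async () => {
      latest = await getBlockNumber();
      // Resolve once the chain has advanced n blocks past the start
      if (latest >= target) {
        clearInterval(interval);
        resolve(latest);
      }
    }, intervalMs);
  });
}

// Hypothetical stub chain that mines one block per poll
let height = 100;
waitForBlocksWith(async () => height++, 5, 10).then((reached) =>
  console.log(`Target block reached: ${reached}`) // → Target block reached: 105
);
```

The real `waitForBlocks` is exactly this shape with `web3.eth.getBlockNumber` and a 5-second interval baked in.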
Now all you have to do is run the `deploy.js` script via CLI with `npx hardhat run scripts/deploy.js --network $NETWORK` to have your very own smart contract deployed on the testnet of your choice. But still, the steps that led you here were indeed a tad more complex than what you had done so far in this tutorial, so let's recap by reviewing the entire deploy script's code:

```javascript scripts/deploy.js
// Process dependencies
require('dotenv').config();
require("@nomiclabs/hardhat-web3");
require("@nomicfoundation/hardhat-verify");
const fs = require('fs');
const path = require('path');

const address = process.env.WALLET;
const privKey = process.env.PRIVATE_KEY;

// Replace 'MyFirstMusicNFT' with your contract's name.
const contractName = 'MyFirstMusicNFT';

// Find the compiled smart contract to get the ABI and bytecode
const artifactPath = path.resolve(__dirname, `../artifacts/contracts/${contractName}.sol/${contractName}.json`);
const contractArtifact = JSON.parse(fs.readFileSync(artifactPath, 'utf-8'));
const contractABI = contractArtifact.abi;
const contractBIN = contractArtifact.bytecode;

// Create asynchronous function to deploy your contract
async function main() {
  console.log(`\nAttempting to deploy the ${contractName} contract on ${network.name} from: ${address}\n`);

  // Create new contract object
  const contractNFT = new web3.eth.Contract(contractABI, address);

  // Deploy contract object as a transaction
  const contractTX = await contractNFT.deploy({
    // Set transaction data as the contract bytecode
    data: contractBIN,
  });

  // Estimate the gas costs needed to process the transaction
  const gasCost = await contractTX.estimateGas((err, gas) => {
    if (!err) console.log(`Estimated gas: ${gas}...`);
    else console.error(`Error estimating gas: ${err}...`);
  });

  // Sign the transaction
  const createTransaction = await web3.eth.accounts.signTransaction({
    // Define transaction parameters
    from: address,
    data: contractTX.encodeABI(),
    gas: gasCost,
  }, privKey
  );

  // Return transaction receipt
  const createReceipt = await web3.eth.sendSignedTransaction(
    createTransaction.rawTransaction
  );

  // Log contract address from receipt
  console.log(`\nContract successfully deployed on ${network.name} at: ${createReceipt.contractAddress} \n\nCopy the following line to your ".env" file:\n\n${network.name.toUpperCase()}_CONTRACT="${createReceipt.contractAddress}"\n`);

  // Verify the contract
  async function verifyContract() {
    console.log("Verifying contract in 5 blocks...\n");

    // Wait for 5 blocks before running verification
    await waitForBlocks(5);
    await run("verify:verify", {
      address: createReceipt.contractAddress,
      constructorArguments: [],
    });
    console.log("\nContract deployed and verified!\n");
  }
  verifyContract();
}

// Wait for `n` blocks function
async function waitForBlocks(n) {
  // Get the latest block number
  let latestBlockNumber = await web3.eth.getBlockNumber();
  console.log(`Current block number: ${latestBlockNumber}...`);

  // Calculate the block number to wait for
  let targetBlockNumber = latestBlockNumber + n;
  console.log(`Waiting until block number: ${targetBlockNumber}...`);

  // Check for the right block at a given interval
  return new Promise((resolve) => {
    let interval = setInterval(async () => {
      latestBlockNumber = await web3.eth.getBlockNumber();
      console.log(`Checked latest block number: ${latestBlockNumber}...`);

      // Check if the current block number matches the one to wait for
      if (latestBlockNumber >= targetBlockNumber) {
        clearInterval(interval);
        console.log(`Target block reached: ${latestBlockNumber}...\n`);
        resolve();
      }
    }, 5000); // Set polling interval as per your need.
  });
}

// Don't forget to run your main function! Add error handling too
main().catch((error) => {
  console.error(error);
  process.exitCode = 1;
});
```

Congratulations! Should you have followed the steps so far correctly, your first NFT contract is now live on the testnet of your choice!
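As a closing aside on the gas figure the script prints: the ETH cost of a deployment is roughly the estimated gas units multiplied by the gas price. A quick back-of-the-envelope sketch with hypothetical numbers (2,000,000 gas at a 20 gwei gas price; neither value comes from a real deployment):

```javascript
// Rough deployment cost: gas units × gas price, in wei.
// Both figures below are hypothetical placeholders.
const GWEI = 10n ** 9n;          // wei per gwei
const estimatedGas = 2000000n;   // like the value printed by estimateGas
const gasPriceWei = 20n * GWEI;  // a 20 gwei gas price

const costWei = estimatedGas * gasPriceWei;
console.log(`${costWei} wei`); // → 40000000000000000 wei, i.e. 0.04 ETH
```
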
With that milestone achieved, it is time to define the metadata properties of your NFT prior to minting.

## Step 3: Pin the metadata and mint your NFTs

### 3.1: Getting started with Chainstack IPFS Storage

Before you can mint any NFT, you will first need to pin all relevant media files to the Interplanetary File System (IPFS). Thanks to this, you won't have to rely on the availability of any centralized provider, as the files remain permanently accessible. In IPFS, uploading is adding a file to the network, while pinning is ensuring its persistent availability by preventing its removal on a specific node.

To make the entire process a real walk in the park, you can use Chainstack IPFS Storage, which provides a seamless interface and API for doing just that. You can create the bucket and folder you need to pin the files via the API, or follow these steps to do so manually in the console:

1. Sign in to your Chainstack account via the [console](https://console.chainstack.com/) and select **IPFS Storage** from the navigation on the left.
2. Click **Create bucket** and enter a name of your choice.
3. Inside the bucket, click **New folder** and enter a name of your choice.
4. Open the bucket and folder and examine the URL in the address bar for each of them.
5. Copy the bucket ID starting with `BUCK`, for example, `BUCK-1337-8085-1337`.
6. Copy the folder ID starting with `FOLD`, for example, `FOLD-1337-8085-1337`.
7. Paste the two values in your `.env` file as the `BUCKET_ID` and `FOLDER_ID` keys like so:

```bash .env
BUCKET_ID="BUCK-1337-8085-1337"
FOLDER_ID="FOLD-1337-8085-1337"
```

### 3.2: Pin your NFT media with Chainstack IPFS Storage

With the prerequisites taken care of, it is time for you to pin the media files with Chainstack IPFS Storage.
In the [tutorial repo](https://github.com/petarsrepo/music-nft-minter-tutorial-repo), you will find a tutorial audio file and cover image in the `src` directory that you can use freely to test the waters. Should you prefer to pin them with Chainstack IPFS Storage via the interface manually, you can do so already and jump to the next step, where you will set up the JSON metadata. Otherwise, it is time for you to create a new `pin.js` file to do that via the API.

To get started with the new pin script, process the dependencies first. You will need the dotenv, fs, axios, and form-data packages, the last of which you can install with the following command:

```shell CLI
npm install form-data
```

Here's how the start of the script should look:

```javascript scripts/pin.js
// Process dependencies
require('dotenv').config();
const fs = require('fs');
const axios = require('axios');
const FormData = require('form-data');
```

Next, define the location of the media files you will be pinning with Chainstack IPFS Storage via the API like so:

```javascript scripts/pin.js
// Define the media files to be pinned with Chainstack IPFS Storage
const content = [
  {
    file: fs.createReadStream("./src/(Tutorial) My First Music NFT Cover.png"),
    title: "(Tutorial) My First Music NFT Cover.png"
  },
  {
    file: fs.createReadStream("./src/(Tutorial) PetarISFire - Chainstackwave.mp3"),
    title: "(Tutorial) PetarISFire - Chainstackwave.mp3"
  }
];
```

Once ready, it is time for you to define a new asynchronous `addFiles` function as a constant with `source` and `single = false` as the two parameters it accepts:

```javascript scripts/pin.js
// Define a function to pin files with Chainstack IPFS Storage
const addFiles = async (source, single = false) => {
}
```

The second parameter `single = false` will serve to differentiate between pinning single and multiple files, as the API URLs for them are not the same.
Set up the differentiation in the code like so:

```javascript scripts/pin.js
// Differentiate between pinning single and multiple files
const url = single
  ? "https://api.chainstack.com/v1/ipfs/pins/pinfile"
  : "https://api.chainstack.com/v1/ipfs/pins/pinfiles";
```

Next, define a new `data` constant with a value equal to `new FormData()` and follow up with an if-else statement, checking the state of the `single` parameter:

```javascript scripts/pin.js
// Define the pin metadata
const data = new FormData();
if (single) {
} else {
}
```

Inside the `if` statement, add some visual feedback to display which file you are attempting to pin by using the `JSON.stringify` method with the `source[0].title` value. Afterward, add the `bucket_id`, `folder_id`, `file`, and `title` to the `data` object via the `append` method. You can load the `bucket_id` and `folder_id` values from your `.env` file by setting them as the first parameter for the `append` method and using `process.env.BUCKET_ID` and `process.env.FOLDER_ID`, respectively, as the second.
In turn, do so for the `file` and `title` values but referencing `source[0].file` and `source[0].title` accordingly: ```javascript scripts/pin.js // Define the pin metadata const data = new FormData(); if (single) { console.log('Attempting to pin ' + JSON.stringify(source[0].title) + ' with Chainstack IPFS Storage...'); data.append('bucket_id', process.env.BUCKET_ID); data.append('folder_id', process.env.FOLDER_ID); data.append('file', source[0].file); data.append('title', source[0].title); } else { } ``` The process is relatively similar for the `else` part of the statement, the only difference being that you need to use the `forEach` method to make sure all entries are referenced correctly: ```javascript scripts/pin.js // Define the pin metadata const data = new FormData(); if (single) { console.log('Attempting to pin ' + JSON.stringify(source[0].title) + ' with Chainstack IPFS Storage...'); data.append('bucket_id', process.env.BUCKET_ID); data.append('folder_id', process.env.FOLDER_ID); data.append('file', source[0].file); data.append('title', source[0].title); } else { source.forEach((file) => { console.log('Attempting to pin ' + JSON.stringify(file.title) + ' with Chainstack IPFS Storage...'); data.append('bucket_id', process.env.BUCKET_ID); data.append('folder_id', process.env.FOLDER_ID); data.append('file', file.file); data.append('title', file.title); }); } ``` Once ready, proceed by creating a new `config` constant, which will set up the `axios` configuration. 
Here you will also make a reference to your Chainstack API authorization token in a fashion similar to this: ```javascript scripts/pin.js // Define the Axios configuration const config = { method: 'POST', url: url, headers: { "Content-Type": 'multipart/form-data;', "Authorization": process.env.CHAINSTACK, ...data.getHeaders() }, data: data }; ``` Next, create a new `response` constant which will store the `axios` response, once again differentiating between single and multiple files: ```javascript scripts/pin.js // Store the Axios response const response = await axios(config); if (single) { console.log(`File successfully pinned with Chainstack IPFS Storage with public ID: ${response.data.id}\\n`); return JSON.stringify(response.data.id); } else { const pubIDs = response.data.map((item) => item.id); console.log(`Files successfully pinned with Chainstack IPFS Storage with public IDs: ${pubIDs.join(', ')}\\n`); return pubIDs; } }; ``` With the `addFiles` function successfully set up, it is time to do so for the `findCIDs` one. Its purpose will be to obtain the content IDs (CIDs) of the files you have pinned, which are unique identifiers that allow universal access via any IPFS client. So, go ahead and create it as an asynchronous function by defining a new `findCIDs` constant and set `fileID` and `single = false` as the two parameters it accepts. Then, proceed by removing possible excess characters like so: ```javascript scripts/pin.js // Define a function to find CIDs for files pinned with Chainstack IPFS Storage const findCIDs = async (fileID, single = false) => { if (single) { fileID = fileID.replace(/"/g, ''); fileID = [fileID]; } } ``` It is quite possible that the CIDs will not be ready when your script moves on to find them, so let's set up a redundancy process that will automatically retry the search. Start by defining a new `maxRetries` constant, as well as a `retryTimeout` one, setting the former's value to `3` and the latter to `11000`.
Simply put, this will make the function retry up to 3 times with an 11-second timeout between each attempt. Next, create an `if-else` statement with the `!single` parameter in the `if` part. Inside it, create new temporary `cid` and `name` array variables. After that, create a `for` loop that will use the `push` method to store the CID and title values of the freshly pinned files: ```javascript scripts/pin.js // Define the maximum retries and the timeout between retries const maxRetries = 3; const retryTimeout = 11000; if (!single) { let cid = []; let name = []; // Loop through all the pinned files for (var i = 0; i < fileID.length; i++) { // Get the CID and filename for the file const result = await findCIDs(fileID[i], true); cid.push(result[0]); name.push(result[1]); } // Print the CIDs found and return the cid and name values console.log('All CIDs found:' + cid + '\\n'); return [cid, name]; } else { } ``` The rest of the function follows a similar logic but additionally features the retry loop, the axios configuration, as well as some error-checking code.
Here's how the entire `findCIDs` function should look in the end: ```javascript scripts/pin.js // Define a function to find CIDs for files pinned with Chainstack IPFS Storage const findCIDs = async (fileID, single = false) => { if (single) { fileID = fileID.replace(/"/g, ''); fileID = [fileID]; } // Define the maximum retries and the timeout between retries const maxRetries = 3; const retryTimeout = 11000; if (!single) { let cid = []; let name = []; // Loop through all the pinned files for (var i = 0; i < fileID.length; i++) { // Get the CID and filename for the file const result = await findCIDs(fileID[i], true); cid.push(result[0]); name.push(result[1]); } // Print the CIDs found and return the cid and name values console.log('All CIDs found:' + cid + '\\n'); return [cid, name]; } else { let cid; let name; let retries = 0; // Set up the retry loop while (retries < maxRetries) { try { console.log('Attempting to find CID using public ID: ' + fileID + ' with Chainstack IPFS Storage...'); // Define the Axios configuration const url = "https://api.chainstack.com/v1/ipfs/pins/" + fileID; var config = { method: 'GET', url: url, headers: { "Content-Type": 'text/plain', "Authorization": process.env.CHAINSTACK, }, }; // Store the Axios response const response = await axios(config); console.log('CID found:' + response.data.cid + ' Filename: ' + response.data.title + '\\n'); cid = response.data.cid; name = response.data.title; // Break out of the loop if the cid and name values are valid if (cid != null && cid !== 'error' && name != null && name !== 'error') { break; } else { // Throw an error if the CID and filename are not valid throw new Error('CID or name values are not valid.'); } } catch (error) { console.error(`Error in findCIDs: ${error.message}.
Attempting to retry...\\n`); // Retry after the timeout if unsuccessful retries++; await new Promise((resolve) => setTimeout(resolve, retryTimeout)); } } return [cid, name]; } }; ``` Once the CIDs are successfully fetched with the previous function, you will need to write them to a JSON file along with the rest of the metadata of your NFT. Said JSON file will then be used to mint an NFT with the appropriate contents. So, go ahead and create a new asynchronous `writeJSON` function as a constant with the `pinCID` and `pinName` parameters as the ones it should accept. Create new temporary `audioIPFS` and `coverIPFS` variables and then an `if-else` statement with `pinCID && pinName` as its condition. Inside the `if` statement, create a `for` loop which will piece together the appropriate URLs of your media pins: ```javascript scripts/pin.js // Define a function to write the metadata to a .json file const writeJSON = async (pinCID, pinName) => { let audioIPFS; let coverIPFS; if (pinCID && pinName) { for (var i = 0; i < pinName.length; i++) { if (pinName[i].includes('mp3')) { audioIPFS = "https://ipfsgw.com/ipfs/" + pinCID[i]; } else { coverIPFS = "https://ipfsgw.com/ipfs/" + pinCID[i]; } } } } ``` In this tutorial, the URLs are pieced together using the Chainstack IPFS Storage gateway `ipfsgw.com`, which will make your pins available faster; however, it is generally recommended to use `ipfs://` for truly universal access, even if it takes much longer to propagate this way. That being said, finish off the rest of the function by writing the metadata you collected earlier to a JSON file in the `src` directory.
The entire `writeJSON` function should be set up in a fashion similar to this: ```javascript scripts/pin.js // Define a function to write the metadata to a .json file const writeJSON = async (pinCID, pinName) => { let audioIPFS; let coverIPFS; if (pinCID && pinName) { for (var i = 0; i < pinName.length; i++) { if (pinName[i].includes('mp3')) { audioIPFS = "https://ipfsgw.com/ipfs/" + pinCID[i]; } else { coverIPFS = "https://ipfsgw.com/ipfs/" + pinCID[i]; } } // Write the metadata to the file ./src/NFTmetadata.json fs.writeFileSync('./src/NFTmetadata.json', JSON.stringify({ "description": "My first music NFT mint.", "external_url": "https://chainstack.com/nfts/", "image": coverIPFS, "animation_url": audioIPFS, "name": "PetarISFire - Chainstackwave" })); let jsonMeta; if (fs.existsSync('./src/NFTmetadata.json')) { jsonMeta = { file: fs.createReadStream('./src/NFTmetadata.json'), title: "NFTmetadata.json" }; } return jsonMeta; } }; ``` Lastly, create a new asynchronous `pinNFT` function that will queue all the relevant functions in the correct order to have your media files and metadata JSON ready for minting: ```javascript scripts/pin.js // Define the main function that executes all necessary functions to pin the NFT metadata const pinNFT = async () => { try { const ids = await addFiles(content); await new Promise((resolve) => setTimeout(resolve, 5000)); const [pinCID, pinName] = await findCIDs(ids); await new Promise((resolve) => setTimeout(resolve, 5000)); const jsonMeta = await writeJSON(pinCID, pinName); await new Promise((resolve) => setTimeout(resolve, 5000)); const id = await addFiles([jsonMeta], true); await new Promise((resolve) => setTimeout(resolve, 5000)); const jsonCID = await findCIDs(id, true); console.log('NFT metadata successfully pinned with Chainstack IPFS Storage!\\n'); console.log('Copy this URL and set it as value for the "metadata" variable in the "mint.js" script file:\\n' + 'https://ipfsgw.com/ipfs/' + jsonCID); } catch (error) { 
console.error('Error during NFT pinning:', error.message); } }; // Don't forget to call the main function! pinNFT(); ``` That's it! You should now have a fully working script that will pin your media files, write the relevant metadata to JSON, pin said JSON, and have everything ready for minting automatically, even if it takes a couple of retries to do so. Don't forget to run the `pin.js` script via CLI, so you can watch the fireworks! ### If the script returns `Error during NFT pinning: Request failed with status code 400` If you encounter this error, there is some degree of duplication with existing files you have pinned previously. To resolve it, you must either set new filenames and titles for the interrupted pins, or delete the previous ones. Do note that once you delete them, any previously minted NFTs using them as the source for their metadata will no longer display correctly and will be left as blank containers permanently. And considering this has been the most complex script you have set up until now (and in the rest of the tutorial), it is certainly wise to recap with the full `pin.js` code: ```javascript scripts/pin.js // Process dependencies require('dotenv').config(); const fs = require('fs'); const axios = require('axios'); const FormData = require('form-data'); // Define the media files to be pinned const content = [ { file: fs.createReadStream("./src/(Tutorial) My First Music NFT Cover.png"), title: "(Tutorial) My First Music NFT Cover.png" }, { file: fs.createReadStream("./src/(Tutorial) PetarISFire - Chainstackwave.mp3"), title: "(Tutorial) PetarISFire - Chainstackwave.mp3" } ]; // Define a function to pin files with Chainstack IPFS Storage const addFiles = async (source, single = false) => { // Differentiate between pinning single and multiple files const url = single ?
"https://api.chainstack.com/v1/ipfs/pins/pinfile" : "https://api.chainstack.com/v1/ipfs/pins/pinfiles"; // Define the pin metadata const data = new FormData(); if (single) { console.log('Attempting to pin ' + JSON.stringify(source[0].title) + ' with Chainstack IPFS Storage...'); data.append('bucket_id', process.env.BUCKET_ID); data.append('folder_id', process.env.FOLDER_ID); data.append('file', source[0].file); data.append('title', source[0].title); } else { source.forEach((file) => { console.log('Attempting to pin ' + JSON.stringify(file.title) + ' with Chainstack IPFS Storage...'); data.append('bucket_id', process.env.BUCKET_ID); data.append('folder_id', process.env.FOLDER_ID); data.append('file', file.file); data.append('title', file.title); }); } // Define the Axios configuration const config = { method: 'POST', url: url, headers: { "Content-Type": 'multipart/form-data;', "Authorization": process.env.CHAINSTACK, ...data.getHeaders() }, data: data }; // Store the Axios response const response = await axios(config); if (single) { console.log(`File successfully pinned with Chainstack IPFS Storage using public ID: ${response.data.id}\\n`); return JSON.stringify(response.data.id); } else { const pubIDs = response.data.map((item) => item.id); console.log(`Files successfully pinned with Chainstack IPFS using public IDs: ${pubIDs.join(', ')}\\n`); return pubIDs; } }; // Define a function to find CIDs for files uploaded to IPFS const findCIDs = async (fileID, single = false) => { if (single) { fileID = fileID.replace(/"/g, ''); fileID = [fileID]; } // Define the maximum retries and the timeout between retries const maxRetries = 3; const retryTimeout = 11000; if (!single) { let cid = []; let name = []; // Loop through all the uploaded files for (var i = 0; i < fileID.length; i++) { // Get the CID and filename for the file const result = await findCIDs(fileID[i], true); cid.push(result[0]); name.push(result[1]); } // Print the CIDs found and return the cid and name values 
console.log('All CIDs found:' + cid + '\\n'); return [cid, name]; } else { let cid; let name; let retries = 0; // Set up the retry loop while (retries < maxRetries) { try { console.log('Attempting to find CID via public ID: ' + fileID + ' on Chainstack IPFS...'); // Define the Axios configuration const url = "https://api.chainstack.com/v1/ipfs/pins/" + fileID; var config = { method: 'GET', url: url, headers: { "Content-Type": 'text/plain', "Authorization": process.env.CHAINSTACK, }, }; // Store the Axios response const response = await axios(config); console.log('CID found:' + response.data.cid + ' Filename: ' + response.data.title + '\\n'); cid = response.data.cid; name = response.data.title; // Throw an error if the cid and name values are not valid if (cid != null && cid !== 'error' && name != null && name !== 'error') { break; } else { // Throw an error if the CID and filename are not valid throw new Error('CID or name values are not valid.'); } } catch (error) { console.error(`Error in findCIDs: ${error.message}. 
Attempting to retry...\\n`); // Retry after the timeout if unsuccessful retries++; await new Promise((resolve) => setTimeout(resolve, retryTimeout)); } } return [cid, name]; } }; // Define a function to write the metadata to a .json file const writeJSON = async (uploadCID, uploadName) => { let audioIPFS; let coverIPFS; if (uploadCID && uploadName) { for (var i = 0; i < uploadName.length; i++) { if (uploadName[i].includes('mp3')) { audioIPFS = "https://ipfsgw.com/ipfs/" + uploadCID[i]; } else { coverIPFS = "https://ipfsgw.com/ipfs/" + uploadCID[i]; } } // Write the metadata to the file ./src/NFTmetadata.json fs.writeFileSync('./src/NFTmetadata.json', JSON.stringify({ "description": "My first music NFT mint.", "external_url": "https://chainstack.com/nfts/", "image": coverIPFS, "animation_url": audioIPFS, "name": "PetarISFire - Chainstackwave" })); let jsonMeta; if (fs.existsSync('./src/NFTmetadata.json')) { jsonMeta = { file: fs.createReadStream('./src/NFTmetadata.json'), title: "NFTmetadata.json" }; } return jsonMeta; } }; // Define the main function that executes all necessary functions to upload the NFT metadata const uploadNFT = async () => { try { const ids = await addFiles(content); await new Promise((resolve) => setTimeout(resolve, 5000)); const [uploadCID, uploadName] = await findCIDs(ids); await new Promise((resolve) => setTimeout(resolve, 5000)); const jsonMeta = await writeJSON(uploadCID, uploadName); await new Promise((resolve) => setTimeout(resolve, 5000)); const id = await addFiles([jsonMeta], true); await new Promise((resolve) => setTimeout(resolve, 5000)); const jsonCID = await findCIDs(id, true); console.log('NFT metadata successfully uploaded to Chainstack IPFS!\\n'); console.log('Copy this URL and set it as value for the "metadata" variable in the "mint.js" script file:\\n' + 'https://ipfsgw.com/ipfs/' + jsonCID); } catch (error) { console.error('Error during NFT upload:', error.message); } }; //Don't forget to call the main function! 
uploadNFT(); ``` ### 3.3: Create the script for minting By the time you reach this step, you should have successfully pinned your NFT media files with Chainstack IPFS Storage, and have their CIDs referenced in a JSON file that was also pinned there. The JSON file must contain an `image` key to store the NFT cover, an `animation_url` one for the audio file, `name` for the track title, and optionally a `description`, as well as an `external_url` for a link to your profile, for example. It should look similar to this: ```json JSON { "description": "My first music NFT mint.", "external_url": "https://chainstack.com/nfts/", "image": "https://ipfsgw.com/ipfs/QmfVBC87qZyn81Z68ntCkTNehQwdEFr3ZPCnVDTXjENxUT", "animation_url": "https://ipfsgw.com/ipfs/QmPv19dddmwp8BcoaxFmqZjhsps9wKVNubyYtpT2htxfTd", "name": "PetarISFire - Chainstackwave" } ``` If that is indeed the case, you can move forward by creating a new `mint.js` file inside the `scripts` directory for the minting script. Begin by processing the dependencies for the `dotenv`, `hardhat-web3`, `fs`, and `path` modules.
Proceed by initializing your wallet address, private key, deployed smart contract ABI, and JSON metadata URL, as well as the appropriate deployed contract address for each network via a simple `if` statement: ```javascript scripts/mint.js // Process dependencies require('dotenv').config(); require("@nomiclabs/hardhat-web3"); const fs = require('fs'); const path = require('path'); // Initialize your wallet address and private key const address = process.env.WALLET; const privKey = process.env.PRIVATE_KEY; // Initialize your deployed smart contract address for the selected network let contractAdrs; if (network.name == 'sepolia') { const contractENV = process.env.SEPOLIA_CONTRACT; contractAdrs = contractENV; } else { const contractENV = process.env.MAINNET_CONTRACT; contractAdrs = contractENV; } // Replace 'MyFirstMusicNFT' with your contract's name. const contractName = 'MyFirstMusicNFT'; // Find the compiled smart contract to get the ABI const artifactPath = path.resolve(__dirname, `../artifacts/contracts/${contractName}.sol/${contractName}.json`); const contractArtifact = JSON.parse(fs.readFileSync(artifactPath, 'utf-8')); const contractABI = contractArtifact.abi; // Initialize the JSON metadata URL const metadata = "https://ipfsgw.com/ipfs/QmX5mrBWukdWVByxnoUS4GJTysVBFjjoVg1fgSjExNV7Dd"; ``` Then, create a new contract object and set the interactions origin to the **owner address**. This is crucial for the successful execution of the script, as earlier you set up the contract to allow minting only from the **owner address**. Without the correct `from` parameter, it will return an error.
```javascript scripts/mint.js // Create a new contract object and set interactions origin to the owner address const contractObj = new web3.eth.Contract(contractABI, contractAdrs, { from: address, }); ``` Next, create a gas estimation function like the one in the `scripts/deploy.js` using the web3.js `estimateGas` method with the `safeMint` method of your contract as its target: ```javascript scripts/mint.js // Define a gas estimation function const gasEstimate = async () => { return await contractObj.methods.safeMint(address, metadata).estimateGas(); }; ``` With that taken care of, move forward by defining a new asynchronous `startMint` function as a constant. Inside it, start by adding some visual feedback for the mint address target and the estimate response like so: ```javascript scripts/mint.js // Define a minting function const startMint = async () => { console.log(`\nAttempting to mint on ${network.name} to: ${address}...\n`); // Estimate the gas costs needed to process the transaction const gasCost = await contractObj.methods.safeMint(address, metadata).estimateGas((err, gas) => { if (!err) console.log(`Estimated gas: ${gas}...\n`); else console.error(`Error estimating gas: ${err}...\n`); }); } ``` Continue the function by defining the transaction details and signing it with the web3.js `signTransaction` method, using as the first parameter an object with the `address` value for the `from` key, the `contractAdrs` one for `to`. Then, for `data`, call the contract object's `safeMint` method with the `encodeABI` method attached to it, and `gas` set to `gasCost`. 
Use the `privKey` constant as the second parameter for a final result like this: ```javascript scripts/mint.js // Define the transaction details and sign it const mintTX = await web3.eth.accounts.signTransaction( { from: address, to: contractAdrs, data: contractObj.methods.safeMint(address, metadata).encodeABI(), gas: gasCost, }, privKey, ); ``` Lastly, make sure you get the transaction receipt by creating a `createReceipt` constant and call the `startMint` function to truly make the mix complete. Here is the complete `mint.js` script: ```javascript scripts/mint.js // Process dependencies require('dotenv').config(); require("@nomiclabs/hardhat-web3"); const fs = require('fs'); const path = require('path'); // Initialize your wallet address and private key const address = process.env.WALLET; const privKey = process.env.PRIVATE_KEY; // Initialize your deployed smart contract address for the selected network let contractAdrs; if (network.name == 'sepolia') { const contractENV = process.env.SEPOLIA_CONTRACT; contractAdrs = contractENV; } else { const contractENV = process.env.MAINNET_CONTRACT; contractAdrs = contractENV; } // Replace 'MyFirstMusicNFT' with your contract's name.
const contractName = 'MyFirstMusicNFT'; // Find the compiled smart contract to get the ABI const artifactPath = path.resolve(__dirname, `../artifacts/contracts/${contractName}.sol/${contractName}.json`); const contractArtifact = JSON.parse(fs.readFileSync(artifactPath, 'utf-8')); const contractABI = contractArtifact.abi; // Initialize the JSON metadata URL const metadata = "https://ipfsgw.com/ipfs/QmX5mrBWukdWVByxnoUS4GJTysVBFjjoVg1fgSjExNV7Dd" // Create a new contract object and set interactions origin to the owner address const contractObj = new web3.eth.Contract(contractABI, contractAdrs, { from: address, }); // Define a minting function const startMint = async () => { console.log(`\nAttempting to mint on ${network.name} to: ${address}...\n`); // Estimate the gas costs needed to process the transaction const gasCost = await contractObj.methods.safeMint(address, metadata).estimateGas((err, gas) => { if (!err) console.log(`Estimated gas: ${gas}...\n`); else console.error(`Error estimating gas: ${err}...\n`); }); // Define the transaction details and sign it const mintTX = await web3.eth.accounts.signTransaction( { from: address, to: contractAdrs, data: contractObj.methods.safeMint(address, metadata).encodeABI(), gas: gasCost, }, privKey, ); // Get transaction receipt const createReceipt = await web3.eth.sendSignedTransaction(mintTX.rawTransaction); // Provide appropriate network for Etherscan link if (network.name !== 'mainnet'){ console.log(`NFT successfully minted on ${network.name} with hash: ${createReceipt.transactionHash}\n\nView the transaction on Etherscan: https://${network.name}.etherscan.io/tx/${createReceipt.transactionHash}\n`); } else { console.log(`NFT successfully minted on ${network.name} with hash: ${createReceipt.transactionHash}\n\nView the transaction on Etherscan: https://etherscan.io/tx/${createReceipt.transactionHash}\n`); } }; // Don't forget to run the main function! startMint(); ``` Congratulations! 
After running the `mint.js` script with Hardhat using `npx hardhat run scripts/mint.js --network NETWORK_NAME`, you should now have minted the first music NFT in your collection! To add more NFTs, simply follow the process to create a new JSON file with different parameters and rerun the relevant functions, making sure to select the appropriate JSON file for the metadata variable in the `mint.js` function. ### 3.4: View your NFTs on MetaMask and OpenSea If you've managed to complete the previous steps, this one should be quite simple. Begin by opening your MetaMask wallet or [downloading it](https://metamask.io/download/) if you haven't already. Log in and choose the Sepolia network at the top. Click the icon in the top right corner (not the MetaMask logo) and select **Import Account**. Opt for **Private Key**, as that's how you initially set up your address, and paste it into the designated field. Afterward, head to the **NFTs** tab and select **Import NFTs** at the bottom. Paste your music NFT contract’s address into the first field, and the token ID in the next. If it’s your first mint the token ID will be 0. Finish the process by clicking **Add**, and there you have it—your music NFT is now visible in MetaMask! [Etherscan](https://goerli.etherscan.io/token/0x7d8c7C54d98D533Af176DE1a0e280898E55537eb) Unfortunately, MetaMask does not permit you to play the audio associated with your music NFT. To accomplish this, visit OpenSea, specifically the [Sepolia Testnet version found here](https://testnets.opensea.io/). Use MetaMask to log in, and once successful, hover over your avatar and click **Profile**. You should see your music NFT; click on it to reveal its details. If the featured image only shows a small preview, click on it once more and then a third time to open it in full screen, so your tune starts playing. 
[OpenSea Goerli](https://testnets.opensea.io/assets/goerli/0x7d8c7c54d98d533af176de1a0e280898e55537eb/0) ## Access the tutorial repo To make it even easier for you to follow and implement the concepts discussed in this tutorial, we have prepared a comprehensive code repository for your convenience. The repo contains all the necessary files, code snippets, and resources used throughout the tutorial. You can access the full tutorial code at its [dedicated GitHub repo here](https://github.com/chainstacklabs/music-nft-minter-tutorial-repo). Feel free to download or clone the repository, and use it as a reference while working through the tutorial. This will help you save time and ensure that you have a complete understanding of the concepts presented. ## Bringing it all together With all of this taken care of, you have successfully dipped your toes into the fascinating world of minting music NFTs. Thanks to this, you have now established a step-by-step process for creating, deploying, and managing your very own digital music collectibles. By leveraging blockchain technology, artists and collectors like yourself can take full advantage of this new avenue for monetization, creative expression, and secure ownership. As you embark on your music NFT journey, remember that the possibilities are endless. Experiment with different parameters and smart contract functionalities to create NFTs that not only represent your unique artistic vision but also offer value to your audience. Whether you're an established musician or an emerging talent, music NFTs can open doors to new opportunities and reshape the way you engage with and appreciate the art of sound. So go ahead, take the leap, and begin minting your very own music NFTs. Share your creations with the world, and witness the transformative power of this cutting-edge technology in the ever-evolving music industry. Happy minting!
### About the author Senior Copywriter @ Chainstack Writes on Ethereum, NFTs, and underlying technology I BUIDL tutorials insightful so your dev experience can be delightful. [](https://github.com/petarsrepo) [](https://twitter.com/petarcopyrock) [](https://www.linkedin.com/in/pstoykov/) # How to store your Web3 dApp secrets: Guide to environment variables Source: https://docs.chainstack.com/docs/how-to-store-your-web3-dapp-secrets-guide-to-environment-variables **TLDR** * This article explores the importance of securely handling environment variables (e.g., private keys, access tokens, RPC endpoints) in Web3 DApps to prevent leaks and malicious attacks. * It demonstrates why storing secrets on the front end is unsafe and how a back-end proxy or specialized service (e.g., Dotenv Vault) adds a protective layer. * It covers common practices (like .env files, whitelisting, and rate limiting), as well as enterprise-level approaches (secret manager tools) for robust security. * By following these methods, developers can safeguard their DApp’s integrity and user trust across development, staging, and production environments. ## Main article As a DApp developer, it is essential to ensure that sensitive information such as private keys, access tokens, and node endpoint URLs are properly safeguarded to prevent unauthorized access. Typically, such information is called a “secret” or an “environment variable”, and there is an extensive list of possible approaches for its application. This guide will explore the various methods for storing environment variable secrets and the best practices for implementing them in your DApp. So, by the end of the guide, you will have a solid understanding of how to securely store environment variable secrets for your Web3 DApp, whether it is in development, staging, or production setting. 
And considering just how rampant poor security practices for environment variables are in the industry, this guide is not only nice to have, but essential reading for many Web3 BUIDLers out there, regardless of their level. That being said, let’s dig into the details of DApp secrets and environment variables. ## What are environment variables? Environment variables are values that can be passed to a computer's operating system or an application at runtime. These values can be used to configure the behavior of the operating system or application, and they are often used to store sensitive information such as passwords and private keys. Environment variables are typically set in a script or configuration file and can be accessed by the operating system or application using predefined methods. In a blockchain context, a Web3 DApp may use an environment variable to specify the account’s private key that it should use to sign transactions or the network to which it should be deployed. This allows Web3 developers to change the behavior of the DApp without having to make changes to its code base. It also prevents sensitive information from being hard-coded into the DApp's code, where it could potentially be accessed by unauthorized parties. ## Avoid front-end storage for environment variables When running an app that sends requests to an API and includes a front end, it can be easy and convenient to store the secret keys directly in the front end, but they are also very easy to expose and exploit this way. This can create a multitude of issues for both you and your users, ranging from data leaks and unauthorized access all the way to the dreaded distributed denial of service (DDoS) attack vector. ### Example of API keys stored in the front end Check out this [simple web app](https://unprotected-front-end.vercel.app/) we built and deployed to showcase such a scenario. The app uses the Etherscan API to retrieve the latest block number on the Ethereum network.
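To make the exposure concrete, here is a hedged sketch of what such a front-end call typically looks like. The Etherscan `eth_blockNumber` proxy endpoint is real, but `YOUR_API_KEY` is a placeholder and the snippet is illustrative rather than the demo app's actual code:

```javascript
// Hypothetical front-end snippet: the API key is embedded directly in the
// request URL, so anyone who opens the browser DevTools can read it in plain text.
const apiKey = 'YOUR_API_KEY'; // placeholder — never ship a real key like this
const url = `https://api.etherscan.io/api?module=proxy&action=eth_blockNumber&apikey=${apiKey}`;

// In a browser, this request — key included — is visible to every visitor:
// fetch(url).then((r) => r.json()).then((d) => console.log(parseInt(d.result, 16)));
console.log(url); // the secret travels with every single request
```

Because the key is part of the URL, it shows up verbatim in the **Network** tab of anyone visiting the page, which is exactly what the demo below lets you observe.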
Open the browser DevTools, then click the **Latest block** button to fetch the latest block on the Ethereum chain. The API request appears in the **Network** tab, where anyone can inspect it. Don't worry; there is no actual secret in this case, as the Etherscan API allows you to make a limited number of requests without an API key.

### Hint

Right-click → Inspect → Network

### What are the consequences of storing API keys in the front end?

It is important to avoid exposing environment variables in a Web3 DApp for several reasons. Firstly, exposed environment variables can allow unauthorized parties to access sensitive information such as private keys and access tokens. This can lead to preventable security risks that compromise the overall security of your DApp and make it vulnerable to bad actors. Secondly, exposed environment variables make it easier for attackers to reverse engineer your DApp and potentially gain access to sensitive information or assets. Lastly, exposed environment variables make it easier for attackers to impersonate your DApp and carry out malicious actions.

Overall, it is best to avoid exposing environment variables and instead use secure methods to store and access sensitive information in your Web3 DApp. Let's have a look at some of the most popular methods to do just that in the following paragraphs.

## How to store environment variables securely?

There are several ways to store environment variables securely in a Web3 DApp. In this guide, we will explore a few methods based on use cases.

1. Development, testing, and learning environments via an environment or configuration file

The first method involves storing environment variables in a separate configuration or environment file, which can be accessed by the DApp at runtime. This file can be encrypted and password-protected to prevent unauthorized access.
Typically, this is done using the [dotenv](https://github.com/motdotla/dotenv) package and a `.env` file. This is a very good solution for testing or educational environments where you intend to push the app to a remote repository like GitHub, and no front end is involved.

2. Custom solution for storing secrets in a secure database via a backend server

A way to store environment variables when using a front end is in a secure database behind a backend server, which serves as a proxy. This method allows for fine-grained control over access to sensitive information and can provide additional security measures such as encryption and access control, but it is also more complex to build and maintain.

3. Professional/enterprise level via a secret manager tool

Lastly, there is an extensive list of tools available that can be used to manage and securely store environment variables for a Web3 DApp. Some of the more popular options are [Dotenv Vault](https://www.dotenv.org/), [Microsoft Azure Key Vault](https://azure.microsoft.com/en-in/products/key-vault/), [AWS Secrets Manager](https://aws.amazon.com/secrets-manager/), [Google Cloud Secret Manager](https://cloud.google.com/secret-manager), [HashiCorp Vault](https://www.vaultproject.io/), and [Doppler](https://www.doppler.com/). These tools often provide features such as encryption and access control to ensure that sensitive information is properly safeguarded.

Overall, the best approach for securely storing environment variables in a Web3 DApp will depend on the specific needs and requirements of your project. It is recommended to carefully evaluate the different options and choose the best approach for your use case's security and compliance requirements.

## Local and development environments

The dotenv package allows you to use `.env` files to store secrets.
Before platform and SaaS alternatives made it into the spotlight, developers used a `.env` file that is not committed to a public repo to keep all their keys and secrets. While the approach is still used today, it is primarily recommended for local and development environments only. This is because your secrets are just one accidental commit away from being exposed, and anyone with access to the `.env` file basically has all the keys to your kingdom.

Using a `.env` file helps you mitigate the risk of exposing your API keys, but you still need to be careful when pushing code to a remote repository. To have peace of mind, make sure to include a `.gitignore` file in your local repository. A `.gitignore` holds a list of directories and files that you want to "ignore" when pushing code to the version control platform, for example, GitHub. Use a [gitignore generator](https://mrkandreev.name/snippets/gitignore-generator/#Node) to make sure you cover all of the files and directories potentially holding sensitive data.

Example of a `.gitignore` file:

```sh .gitignore
# Dependency directories
node_modules/
jspm_packages/

# dotenv environment variables files
.env
.env.test
.env.production
```

Typically, this approach is best suited for small teams, since it gets difficult to keep members of a large team in sync with just a `.env` file.

You can also use the `dotenv` package to work with `.env` files easily. All you have to do is add a `require` reference to its config method as early as possible in your code. Afterward, you can fetch a particular secret by referencing its key on `process.env`, like so.
### How to use the dotenv package

Here's what you need to do to replicate this for your environment:

[Download and install Node.js](https://nodejs.org/) if you don't have it already.

Navigate to your project's root folder using the CLI.

Install the dotenv package from your CLI via npm:

```sh CLI
npm install dotenv
```

Create a `.env` file and enter your secrets in the appropriate format:

```sh .env
CHAINSTACK_NODE_URL="https://nd-123-456-789.p2pify.com/API_KEY"
```

Add this as early as possible in your DApp's JavaScript file to load the package once it runs:

```js index.js
require('dotenv').config();
```

Fetch your DApp secret like so:

```js index.js
const secret = process.env.CHAINSTACK_NODE_URL;
```

Confirm your secret is loading correctly by logging the result to the console:

```js index.js
// If secret is declared
console.log(secret);

// If the secret is not declared
console.log(process.env.YOUR_KEY);
```

Run your script and check that the result is correct:

```sh CLI
node index.js

# Script response
YOUR_VALUE
```

### Full script example

The best way to learn is through practice, so let's replicate the script step-by-step. Following is an example of what a `.env` file looks like, how you can create one and import it in your script, as well as use its content anywhere throughout your JavaScript code. For this particular case, we will use two secret key-value pairs that closely resemble those you would typically find in a real-world Web3 DApp context, even if it is a rather simple one.
First, create a `.env` file in your project root and add your "secrets" to it:

```sh .env
ENDPOINT_URL='https://nd-123-456-789.p2pify.com/API_KEY'
ADDRESS='0x95222290DD7278Aa3Ddd389Cc1E1d165CC4BAfe5'
```

Then use the environment variables in JavaScript:

```js index.js
require('dotenv').config();

const endpoint = process.env.ENDPOINT_URL;
const address = process.env.ADDRESS;

console.log(`Environment variables in use: \n ${endpoint} \n ${address}`);
```

[This repository](https://github.com/soos3d/Deploy-an-ERC-20-token-on-Scroll-using-Hardhat) is a good example of a simple project set up in a development environment that uses `.env` and `.gitignore` files to protect sensitive data. If you look into the `.gitignore` file, you will see a list of files and directories that are not pushed to the remote repository, and you will not find a `.env` file committed along with the other files. To use this project, you need to create your own `.env` file after you clone the repository.

### Approach overview

* Good for testing/small projects/educational content
* Not ideal for big teams
* Does not properly protect API keys

When deploying an app from a public repository to a platform like [Vercel](https://vercel.com/) or [Heroku](https://www.heroku.com/), any `.env` file committed to that repository exposes its keys. The platforms do let you enter your production secrets in their settings, which gives you a sense of security, but if the keys are ever pushed to the (public) repo, they will be visible to anyone.

## Secure front-end usage via custom proxy

When deploying an app with a front end, an option to protect your [REST API](https://www.ibm.com/cloud/learn/rest-apis) keys or [RPC](https://en.wikipedia.org/wiki/Remote_procedure_call) endpoints is to handle the request using a back-end proxy server. A proxy server is a computer that sits between a device and the internet. In this case, when you access a website or online service, the request goes through the proxy server.
The proxy server then sends the request to the website or service on your behalf and passes the response back to you. Such a concept is really useful for this use case because you can have your front end send a request to the back-end proxy, which then passes it to the actual API or endpoint. This way, the front end never communicates directly with the API and endpoints, so you don't need to store the secrets there.

There are several ways to do this, so in this section, you will learn how to build a simple proxy server to protect REST API keys and RPC endpoints. You will find two examples below:

1. A simple web app that uses the Etherscan API to retrieve the latest block number
2. Another simple web app that sends a request to an Ethereum endpoint to retrieve an account balance

### Build a proxy server to protect REST API keys

For this example, we built a simple app that uses the Etherscan API to retrieve the latest block from the Ethereum network and then display it on the screen. You can find the source code and how to test it in its [GitHub repository here](https://github.com/chainstacklabs/express-rest-api-proxy-server).

### The REST API proxy server

In this example, we use the [Express.js framework](https://expressjs.com/) to build a simple server that we can use as a proxy to communicate with the Etherscan API. Express.js is a web application framework for Node.js designed for building web applications and APIs. It provides a set of features and tools for building web applications, including routing, middleware, and template engines.

In the repository, the `index.js` file in the root directory holds the server's code, and as you can see, it is very straightforward:

```js index.js
// This is the server file.
const express = require('express')
const cors = require('cors')
const rateLimit = require('express-rate-limit')
require('dotenv').config()

const PORT = process.env.PORT || 3000

const app = express()

// Rate limiting: limit the number of requests a user can send within a specific amount of time.
// With this setup, a user can make at most 100 requests every 10 minutes.
const limiter = rateLimit({
  windowMs: 10 * 60 * 1000, // 10 minutes in ms.
  max: 100 // 100 requests max.
})

app.use(limiter)
app.set('trust proxy', 1)

// Enable CORS before registering the routes so the headers apply to them.
app.use(cors())

// Set static folder; this allows our server to pick up the HTML file in the src folder.
app.use(express.static('src'))

// Routes
// This route looks into the index.js file in the routes folder and picks up the '/' route.
app.use('/api', require('./routes'))

app.listen(PORT, () => console.log(`Server running on port ${PORT}`))
```

This server runs on the port you specify in the `.env` file of the project, or port 3000 if you don't specify one explicitly. It then creates a route to the `index.js` file inside the routes directory, where the URL is built to send a [GET](https://developer.mozilla.org/en-US/docs/Web/HTTP/Methods/GET) request to the Etherscan API.

Here's the route file:

```js index.js
const express = require('express')
const router = express.Router()
const needle = require('needle') // You could use 'node-fetch' too, but it might have some conflicts.

// Env variables, taken from the .env file.
const ETHERSCAN_API_BASE_URL = process.env.ETHERSCAN_API_BASE_URL
const ETHERSCAN_API_KEY_NAME = process.env.ETHERSCAN_API_KEY_NAME
const ETHERSCAN_API_KEY_VALUE = process.env.ETHERSCAN_API_KEY_VALUE

// Route from the server file.
router.get('/', async (req, res) => {
  try {
    // URLSearchParams lets us build the query string holding the API key.
    const apiKey = new URLSearchParams({
      [ETHERSCAN_API_KEY_NAME]: ETHERSCAN_API_KEY_VALUE,
    })

    // Build the full API URL.
    const fullUrl = `${ETHERSCAN_API_BASE_URL}&${apiKey}`

    // Send the request to the Etherscan API and retrieve the JSON body of the response.
    const apiResponse = await needle('get', fullUrl)
    const data = apiResponse.body

    res.status(200).json(data)
  } catch (error) {
    res.status(500).json({ error })
  }
})

module.exports = router
```

This simple solution helps protect your API keys but is not bulletproof. A skilled bad actor could still find where the request is sent, and although they could not extract your secret API keys, they could still flood the server with requests. This could severely slow down the service you offer, drive up your costs if you pay per usage, and even prevent access to the service entirely in a distributed denial-of-service (DDoS) attack.

To mitigate these issues, we included a rate limiter and enabled [Cross-Origin Resource Sharing](https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS) (CORS) in the server file. The rate limiter is customizable and set up to allow only 100 requests every 10 minutes.

```js index.js
const limiter = rateLimit({
  windowMs: 10 * 60 * 1000, // 10 minutes in ms.
  max: 100 // 100 requests max.
})

app.use(limiter)
```

CORS, in turn, is a mechanism that allows a web server to serve resources to a web page from a different domain than the one it originated from. This allows web pages to access resources from APIs or other servers hosted on a different domain, while letting the server specify which domains are allowed to request it. It is important for security because it helps prevent malicious websites from making unauthorized requests to servers on behalf of the user, and helps protect against [cross-site scripting](https://owasp.org/www-community/attacks/xss/) (XSS) attacks and other types of malicious activity.
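To make the windowed counting concrete, here is a minimal in-memory sketch of the same fixed-window idea. This is illustrative only; the project above uses the `express-rate-limit` package, not this code:

```javascript
// Fixed-window rate limiter sketch: allow at most `max` requests per `windowMs`
// for each client key (e.g. an IP address). Illustrative, not production code.
function createLimiter(windowMs, max) {
  const hits = new Map();
  return function isAllowed(key, now = Date.now()) {
    const entry = hits.get(key);
    if (!entry || now - entry.windowStart >= windowMs) {
      // First request in a new window: reset the counter.
      hits.set(key, { windowStart: now, count: 1 });
      return true;
    }
    entry.count += 1;
    return entry.count <= max;
  };
}

const isAllowed = createLimiter(10 * 60 * 1000, 100); // 100 requests / 10 minutes
console.log(isAllowed('203.0.113.7')); // first request for this key → true
```

A real deployment would also need eviction of stale entries and a shared store if the proxy runs on multiple instances, which is exactly what the library handles for you.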
## What is whitelisting

Whitelisting is a relatively common practice for protecting systems and networks. It is a security measure that maintains a list of approved applications or domains that are allowed to run on a system, network, or device while blocking all others. CORS, which we mentioned earlier, is an example of this: it provides a way to specify which external domains are allowed to access restricted resources on a web page.

This approach provides a higher level of security than measures such as blacklisting, which only blocks known malicious applications or software. With whitelisting, even if an unknown or new threat arises, it cannot run on the system or network because it is not on the approved whitelist. Implementing whitelisting is highly recommended when designing your application, and many providers offer such options.

### Whitelisting limitations

While whitelisting is a valuable practice that should be included in your DApp design, it's not foolproof and can still be exploited. In the Web3 world, whitelisting is commonly implemented by allowing only specific domains or IP addresses to send requests to a server or RPC endpoint. This can add a layer of protection, but it shouldn't be the only thing you rely on to secure your RPC endpoint, particularly if it's exposed on the front end. The main attacks whitelists are susceptible to are **distributed denial of service** (DDoS) and **spoofing**.

#### DDoS attacks

Distributed denial-of-service (DDoS) attacks overwhelm targeted systems or networks with massive amounts of traffic from numerous sources, leading to increased resource usage and potentially making them inaccessible to legitimate users. You might wonder how this is an issue when the attacker's domain isn't on the whitelist. When a request is sent to an endpoint, the server receives and responds to it regardless of your authorization status.
If you have permission, the server will process your request and provide the relevant information. If access is denied, the server will return an error message. Unfortunately, this process can be exploited to consume server resources during a DDoS attack. Consider the following response examples from whitelisted endpoints:

```Json JSON
{"jsonrpc":"2.0","error":{"code":-32002,"message":"rejected due to project ID settings"}}

{"jsonrpc":"2.0","error":{"code":0,"message":"not authorized"},"id":null}
```

As you can see, the endpoint responds even though we are not allowed to use it, and during a DDoS attack, this might even cause the node to go out of sync and become unusable.

#### Spoofing attacks

In a spoofing attack, a bad actor pretends to be someone or something trustworthy to sneak into systems, steal private information, or disrupt communication between people. The attacker changes or hides information like IP addresses, email addresses, or website links to trick users or systems into thinking they're interacting with a real, safe source. In our specific case, [IP spoofing](https://www.cloudflare.com/learning/ddos/glossary/ip-spoofing/) and bypassing CORS policies are possible exploitations.

During an **IP spoofing attack**, the attacker changes the source IP address in the packets they send to make it seem like they are coming from a trusted source. This can be used to get around network security measures, launch DDoS attacks, or deceive users into giving away sensitive information. Here, this could mean getting around the IP whitelist: the attacker could flood your endpoint with requests, consuming your resources and driving up your costs, and a severe DDoS attack could even bring your service down.

**Cross-origin resource sharing** is a way to whitelist domains so that only the allowed ones can request restricted web resources. This approach is also vulnerable to attacks, as the CORS policy might be misconfigured or taken advantage of.
The following is a list of precautions that should be followed:

1. Properly configure the `Access-Control-Allow-Origin` header. Ensure that sensitive information is not exposed by specifying the origin in the `Access-Control-Allow-Origin` header. This should be set to a trusted domain rather than a wildcard (`*`) or `null` value.
2. Only allow trusted sites. The origins specified in the `Access-Control-Allow-Origin` header should only be sites that are trusted. Avoid reflecting origins from cross-origin requests without proper validation, as this is easily exploitable.
3. Avoid using null. Avoid using the `Access-Control-Allow-Origin: null` header, as cross-origin resource calls from internal documents and sandboxed requests can specify the `null` origin. CORS headers should be properly defined with respect to trusted origins for private and public servers.
4. Avoid wildcards in internal networks. Trusting network configuration alone to protect internal resources is not sufficient when internal browsers can access untrusted external domains.
5. Use proper server-side security policies. Remember that CORS is not a substitute for server-side protection of sensitive data. An attacker can directly forge a request from any trusted origin. Therefore, web servers should continue to apply protections over sensitive data, such as authentication and session management, in addition to properly configured CORS.

By taking these steps, web developers and administrators can prevent common CORS-based attacks and better protect sensitive data. To learn more, see [Exploiting CORS – How to Pentest Cross-Origin Resource Sharing Vulnerabilities](https://www.freecodecamp.org/news/exploiting-cors-guide-to-pentesting/#exploitable-cors-cases) from freeCodeCamp.
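Points 1–3 can be sketched as a small origin-resolution helper that answers from an explicit whitelist instead of reflecting the request origin. The domain names here are hypothetical:

```javascript
// Resolve the Access-Control-Allow-Origin value from an explicit whitelist.
// Returning null means: send no CORS header, so the browser blocks the page.
const ALLOWED_ORIGINS = new Set([
  'https://app.example.com',   // hypothetical trusted front end
  'https://admin.example.com', // hypothetical trusted admin panel
]);

function corsOriginFor(requestOrigin) {
  // Never reflect the incoming origin blindly, and never answer with '*' or 'null'.
  return ALLOWED_ORIGINS.has(requestOrigin) ? requestOrigin : null;
}
```

Note that the string `'null'` sent by sandboxed documents fails the lookup like any other untrusted origin, which is exactly the behavior point 3 asks for.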
## Implement security measures

Designing a truly secure DApp is not easy, and you should take a comprehensive approach to security from the very beginning, as it's difficult to implement major changes once the project scales. This is why you will often see even big and famous projects with this kind of vulnerability. To avoid this, take a comprehensive approach to security, using the strategies we talked about and integrating with third-party tools like [Cloudflare](https://www.cloudflare.com/learning/cdn/glossary/reverse-proxy/) and [NGINX](https://docs.nginx.com/nginx/admin-guide/web-server/reverse-proxy/) reverse proxy services. These tools are especially helpful for big companies that want to make their DApps as safe as possible.

In addition to third-party tools, consider following the [Secure Software Development Lifecycle](https://owasp.org/www-project-integration-standards/writeups/owasp_in_sdlc/) and the [OWASP Top 10 web application security risks](https://owasp.org/www-project-top-ten/).

### Connect front end and proxy server

Now that we understand how the back end works, let's see how it connects to the front end. In our small project, this is done by the `script.js` file in the `src` directory.
```js script.js
async function fetchBlock() {
  let displayBlock = document.getElementById("block-number")

  // Fetch the latest block number by querying the Etherscan API
  const response = await fetch("/api")

  // Place the JSON response in a constant and log it
  const blockHex = await response.json()
  console.log(`The latest block in HEX is: ${blockHex.result}`)

  // Extract the HEX block value from the JSON
  const blockString = JSON.stringify(blockHex.result)
  //console.log(blockString)

  // Slice the HEX value to remove the leading quote and 0x
  const slicedBlock = blockString.slice(3, -1)
  //console.log(slicedBlock)

  // Convert and log the decimal block number
  const blockDecimal = parseInt(slicedBlock, 16)
  console.log(`The latest block number is: ${blockDecimal}`)

  displayBlock.innerHTML = blockDecimal
}
```

As you can see, this script handles the action coming from the HTML: when the user presses the button, the script sends a request to our back-end API, parses the response, and displays it in the HTML. Check out the [repository](https://github.com/chainstacklabs/express-proxy-server); you can clone it and run the app locally. This will give you a very good idea of how the logic works. If you inspect the source code, you will only see that the request is sent to the API, but you won't find any secret. To use this concept in a deployed app, you'll have to deploy your back end separately and modify the `script.js` file to send the request to that address.

### Build a proxy server to protect an RPC endpoint

The principle is the same for this scenario; the substantial difference is that, in this case, we have to build the body of the POST request to send to the endpoint. The following function handles this part:

```js index.js
async function fetchBalance() {
  // Proxy server URL
  // localhost when run locally; you will need to add your server URL once deployed.
  const url = 'http://localhost:4000/balance';

  // Initialize the HTML element where the balance will be displayed in the front end
  let displayBalance = document.getElementById("balance")

  // The Ethereum address to query, picked up from the front end
  const address = document.getElementById("address").value
  console.log(`Address to query: ${address}`)

  // Send a POST request to the proxy server to query the balance
  fetch(url, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({ address })
  })
    .then(response => response.json())
    .then(data => {
      // Handle the response data
      console.log(`The balance is: ${data.balance} ETH`);

      // Trim the balance string for display
      const slicedBalance = data.balance.slice(0, 7)

      // Display the balance in the front end
      displayBalance.innerHTML = `Balance: ${slicedBalance} ETH`
    })
    .catch(error => {
      // Handle any errors
      console.error(error);
    });
}
```

Note that we specify the URL of the proxy server in `const url = 'http://localhost:4000/balance';`. In this case, the server is running on the same machine, but you'll have to replace it with the address where you deploy the server. Then we build the POST request's body. The response is then trimmed to display only the first six digits. The same caveats apply to this concept, so implementing a rate limiter and CORS is again a good way to avoid abuse. You can find the source code and how to test it in its [GitHub repository here](https://github.com/chainstacklabs/express-proxy-server).

### How to connect the front end with the back end

So far, we've discussed building the back end, but let's now focus on integrating it with the front end. In the [`script.js`](https://github.com/soos3d/node-proxy-server-to-protect-your-rpc-url-endpoint/blob/main/src/script.js) file from the example showing how to protect an RPC endpoint, you'll notice a connection to the back end via a URL like this: `http://localhost:4000/YOUR_ROUTE`.
This is because the server was initially tested in a local environment. To make this work on the internet, follow these steps:

1. Create a separate repository for the server.
2. Add the server files to the new repository.
3. Deploy the server using a platform like DigitalOcean.
4. Set the environment variables in the deployment platform.
5. Replace `http://localhost:4000/` in the front-end files with the URL of your deployed server.

By completing these steps, you'll successfully connect the front end to the back end, allowing your application to work online.

## Production, CI/CD, and enterprise application

Now that we have explored some custom approaches, let's see how you could protect your secrets at an enterprise level. For a cloud-based deployment in production, it is recommended to use a secure service like an online password/secret manager to safely handle your API keys and endpoints. These services also allow you to separate your environment variables by permission levels, letting your whole team access its credentials from a single source.

For this guide, we will take a closer look at the [Dotenv Vault secret manager](https://www.dotenv.org/), which is incredibly user-friendly. It is a perfect way for you to make the transition from development to production, both swiftly and easily, while still offering the exceptional security features you would need for a Web3 DApp in production.

### Dotenv Vault sync anywhere

Dotenv Vault effectively utilizes the same approach as `.git` and GitHub when handling your DApp secrets. This means you can use it as a private repo for everything you normally store in a `.env` file.
Once you have set everything up in the Dotenv Vault interface, you can use the pull function to instantly sync it locally to any device you want via your CLI:

```sh CLI
# Connect to your Dotenv Vault project
npx dotenv-vault new INSERT_YOUR_VAULT_ID_HERE_OR_LEAVE_BLANK_TO_CREATE_NEW

# Log into your Dotenv Vault project
npx dotenv-vault login

# Open your Dotenv Vault project to introduce changes
npx dotenv-vault open

# Pull your Dotenv Vault project .env file to local storage
npx dotenv-vault pull
```

Even if you are just using Dotenv Vault for local file sync, you can also leverage its powerful feature set for your DApp. One of these features is the ability to have different values for each key according to the environment context you are currently using. Thanks to this, you can quickly and easily move from `development` to `production` environment variables—all you need to do is pull the file for the appropriate environment:

```sh CLI
# Pull the development .env file
npx dotenv-vault pull development

# Pull the staging .env file
npx dotenv-vault pull staging

# Pull the ci .env file
npx dotenv-vault pull ci

# Pull the production .env file
npx dotenv-vault pull production
```

While Dotenv Vault's sync anywhere feature is great already, it also offers exceptionally easy integration with pretty much any platform. This means you can use it natively within your CI/CD workflow without any critical information hitting your HDD. And the best thing about it? Dotenv Vault uses first-party storage for all relevant data, so there is less risk of external partner leaks than with alternatives that rely on third-party storage. With [Dotenv Vault's Integrate Everywhere™](https://www.dotenv.org/docs/tutorials/integrations) approach, you can do encrypted fetching of your environment variables completely in memory.
Thanks to this priceless feature, you can rest easy knowing that exposed hardcoded variables, and the security risks that originate from them, are pretty much a thing of the past. To use this method, you follow the same procedure as you would during a remote-to-local `.env` file sync. Once you have the `.env` file locally, you will need to build it and then fetch its decryption key.

With the key safely in your possession, it can be entered into [GitHub Actions](https://docs.github.com/en/actions) ⇒ Secrets, as in this example, or any other CI/CD platform like [Heroku](https://www.heroku.com/), [CircleCI](https://circleci.com/), [Netlify](https://www.netlify.com/), and [Vercel](https://vercel.com/), among many others. The same goes for other cloud build platforms like [AWS](https://aws.amazon.com/), [Google Cloud Platform (GCP)](https://cloud.google.com/), [Gatsby](https://gatsbyjs.com/), and even those supporting the containerized application process like [Docker](https://www.docker.com/) and [Kubernetes](https://kubernetes.io/).

And once you have entered your decryption key in the appropriate platform's settings, all you need to do is make a `require` reference to the [`dotenv-vault-core`](https://github.com/dotenv-org/dotenv-vault-core) package as early in your code as possible:

```js JavaScript
// index.js
require('dotenv-vault-core').config();

const endpoint = process.env.ENDPOINT_URL;
const address = process.env.PUBLIC_KEY;

console.log('Environment variables in use:\n' + endpoint + '\n' + address);
```

Overall, the workflow is quite accessible, even for the less tech-savvy, making it a perfect tool for a swift and adequate transition from development to production.
Here's how it works in a [GitHub Actions CI/CD workflow for Web3 DApp](https://github.com/petarsrepo/web3-github-dotenv).

## Other alternatives

Now that you know how the Dotenv Vault workflow plays out, it is important to note that there are other secret manager alternatives too. Some are built into the corresponding platform, while others are available for external use just as well. Overall, there are plenty of options on the market, whether you're looking for a free(mium) tool to cover the basics or a paid one that offers more advanced capabilities. Popular alternatives include the secret managers mentioned earlier in this guide: Microsoft Azure Key Vault, AWS Secrets Manager, Google Cloud Secret Manager, HashiCorp Vault, and Doppler.

## Conclusion

Now that you have made it this far, it should be quite clear that securely storing environment variables is essential for ensuring the seamless operation and integrity of any Web3 DApp. Overall, the most suitable approach for securely storing environment variables will depend on the specific requirements of your project. By using secure methods such as configuration/environment files, secure databases behind a backend server, or secret manager tools, you, as Web3 BUIDLers, can protect sensitive information and prevent unauthorized access.

Additionally, following best practices such as encrypting and password-protecting environment variable files and implementing access control measures can further enhance the security of your DApp. And by taking the time to store environment variables properly, you are effectively taking an important step towards building a secure and reliable Web3 landscape, not just for yourself and your project but for everyone involved end-to-end. Isn't that what we all want in the end?…

### About the authors

Senior Copywriter @ Chainstack. Writes on Ethereum, NFTs, and underlying technology. I BUIDL tutorials insightful so your dev experience can be delightful.
[](https://github.com/petarsrepo) [](https://twitter.com/petarcopyrock) [](https://www.linkedin.com/in/pstoykov/) Developer Advocate @ Chainstack BUIDLs on EVM, The Graph protocol, and Starknet Helping people understand Web3 and blockchain development [](https://github.com/soos3d) [](https://twitter.com/web3Dav3) [](https://www.linkedin.com/in/davide-zambiasi/) # HTTP batch request VS multicall contract Source: https://docs.chainstack.com/docs/http-batch-request-vs-multicall-contract **TLDR** * Both HTTP batch requests and multicall contracts can bundle multiple calls to reduce client-server overhead and improve response times. * In the tests, HTTP batch requests often slightly outperformed multicall contracts, but both outperformed sending multiple single requests by a significant margin. * Batch requests still count each call against your RPC usage, whereas multicall typically counts as a single request. However, multicall requires extra contract deployment and can lead to “request timeout” or “response size too big” errors for more complex calls. * The best choice depends on your specific use cases, desired simplicity, and performance requirements. ## Main article This article is about two approaches many developers believe can help them save on RPC consumption: batch requests and multicall smart contracts. In this article, we will explore how to use them and compare their performance. ## HTTP batch request HTTP batch request is a feature most Ethereum clients support, for example, [Geth](https://geth.ethereum.org/docs/interacting-with-geth/rpc/batch). With batch requests enabled, multiple HTTP requests can be packaged into one single request and sent to the server. The server processes this bulk request and returns a bulk result, all in a single round trip. This feature can be useful for reducing the load on the server and improving performance to a certain extent.
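Under the hood, a batch is nothing more than a JSON array of standard JSON-RPC request objects sent in one POST. A minimal, illustrative sketch of assembling such a payload (the helper and method list are hypothetical, not part of any client library):

```javascript
// Build a JSON-RPC batch payload from a list of method calls.
// Each entry gets a unique id so responses can be matched back to requests.
function buildBatchPayload(calls) {
  return calls.map((call, index) => ({
    jsonrpc: "2.0",
    method: call.method,
    params: call.params || [],
    id: index + 1,
  }));
}

const payload = buildBatchPayload([
  { method: "web3_clientVersion" },
  { method: "eth_blockNumber" },
  { method: "eth_chainId" },
]);

console.log(JSON.stringify(payload));
```

The resulting array can be sent as the body of a single POST to the node endpoint, exactly as the curl example that follows does.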
## How to implement To implement an HTTP batch request, just send a request with a payload containing multiple request objects in an array like below: ```Json JSON [ {"jsonrpc":"2.0","method":"web3_clientVersion","params":[],"id":1}, {"jsonrpc":"2.0","method":"eth_blockNumber","params":[],"id":2}, {"jsonrpc":"2.0","method":"eth_syncing","params":[],"id":3}, {"jsonrpc":"2.0","method":"eth_chainId","params":[],"id":4} ] ``` The server sends back the results in one response. The results are arranged in the same order as the requests were received. For example: ```Json JSON [ { "jsonrpc": "2.0", "id": 1, "result": "Geth/v1.10.26-stable-e5eb32ac/linux-amd64/go1.18.8" }, { "jsonrpc": "2.0", "id": 2, "result": "0x10058f8" }, { "jsonrpc": "2.0", "id": 3, "result": false }, { "jsonrpc": "2.0", "id": 4, "result": "0x1" } ] ``` To run it in curl: ```bash cURL curl 'YOUR_CHAINSTACK_ENDPOINT' \ --header 'Content-Type: application/json' \ --data '[ {"jsonrpc":"2.0","method":"web3_clientVersion","params":[],"id":1}, {"jsonrpc":"2.0","method":"eth_blockNumber","params":[],"id":2}, {"jsonrpc":"2.0","method":"eth_syncing","params":[],"id":3}, {"jsonrpc":"2.0","method":"eth_chainId","params":[],"id":4} ]' ``` Popular Web3 libraries like [web3.js](https://web3js.readthedocs.io/en/v1.2.11/web3-eth-personal.html?highlight=batch) and [ethers.js](https://docs.ethers.org/v5/api/providers/other/#JsonRpcBatchProvider) support this feature too. Below is an example of getting ether's balance from multiple accounts using web3.js. ### Web3.js install instructions Run `npm i web3` to install web3.js. The code in this guide is compatible with `web3.js V4`. 
```js JavaScript const { Web3 } = require("web3"); const NODE_URL = "YOUR_CHAINSTACK_ENDPOINT"; const web3 = new Web3(NODE_URL); const addresses = [ "0x1f9090aaE28b8a3dCeaDf281B0F12828e676c326", "0x2bB42C655DdDe64EE508a0Bf29f9D1c6150Bee5F", ]; async function getBalances() { const startTime = Date.now(); // Create a batch request object const batch = new web3.BatchRequest(); // Array to hold promises for each request const promises = []; // Loop through each address and add a getBalance request to the batch addresses.forEach((address, index) => { const request = { jsonrpc: "2.0", id: index + 1, method: "eth_getBalance", params: [address, "latest"], }; // Add request to the batch and store the promise const requestPromise = batch.add(request); promises.push(requestPromise); }); // Send the batch request and wait for all responses const responses = await batch.execute(); // Process responses responses.forEach((response, index) => { if (response.error) { console.error(response.error); } else { const balance = response.result; const timeFromStart = Date.now() - startTime; console.log( `${addresses[index]} has a balance of ${Number( web3.utils.fromWei(balance, "ether") ).toFixed(3)} ETH retrieved in: ${timeFromStart / 1000} seconds.` ); } }); } getBalances(); ``` The `getBalances` function creates a new `BatchRequest` object using `web3.BatchRequest()`. The function then loops through each address in the `addresses` array and builds a raw JSON-RPC `eth_getBalance` request object for it, adding each request to the batch using `batch.add()`. Finally, the function sends the batch using `batch.execute()`. When executed, all requests in the batch are sent to the node in a single round trip, and the returned promises resolve as the responses are received. ## Multicall contract A multicall contract is a smart contract that takes in the function call objects as parameters and executes them together.
A developer can use the multicall contract as a proxy to call other contracts on Ethereum. The implementation of a multicall contract is, in fact, very simple: it leverages Solidity’s [call function](https://solidity-by-example.org/call/) to broadcast contract calls. This is a sample implementation of multicall’s aggregate function: ```solidity solidity function aggregate(Call[] memory calls) public returns (uint256 blockNumber, bytes[] memory returnData) { blockNumber = block.number; returnData = new bytes[](calls.length); for(uint256 i = 0; i < calls.length; i++) { (bool success, bytes memory ret) = calls[i].target.call(calls[i].callData); require(success); returnData[i] = ret; } } ``` In summary, this function takes an array of `Call` structs, calls each one, and returns an array of the results along with the block number in which the function was called. It is designed to be used as a general-purpose aggregator for calling other contracts on the Ethereum blockchain. ## How to implement Anyone can deploy their own multicall contract. In this article, we leverage [MakerDAO’s multicall contract](https://github.com/makerdao/multicall), deployed on the Ethereum mainnet at [0xeefBa1e63905eF1D7ACbA5a8513c70307C1cE441](https://etherscan.io/address/0xeefBa1e63905eF1D7ACbA5a8513c70307C1cE441).
Below is an example calling the smart contract with MakerDAO’s helper library [multicall.js](https://github.com/makerdao/multicall.js?files=1); it essentially does the same thing as the previous example: ```js JavaScript const multicall = require("@makerdao/multicall") const config = { rpcUrl: "YOUR_CHAINSTACK_ENDPOINT", multicallAddress: "0xeefba1e63905ef1d7acba5a8513c70307c1ce441" }; const addressArr = [ "0x2B6ee955a98cE90114BeaF8762819d94C107aCc7", "0x2bB42C655DdDe64EE508a0Bf29f9D1c6150Bee5F" ]; async function main() { const startTime = Date.now(); console.log("Started..."); const calls = []; // Retrieve the Ether balance of each Ethereum address in addressArr using the multicall library. for (let i = 0; i < addressArr.length; i++) { const callObj = { call: [ 'getEthBalance(address)(uint256)', addressArr[i] ], returns: [ [`ETH_BALANCE ${addressArr[i]}`, val => val / 10 ** 18] ] }; calls.push(callObj); } const result = await multicall.aggregate(calls, config); console.log(result); const timeFromStart = Date.now() - startTime; console.log(`Result received in ${timeFromStart / 1000} seconds`); } main(); ``` The `main` function iterates through each address in the `addressArr` array and creates a call object for each address. These call objects use the multicall library to retrieve the ether balance for each address. Once all of the call objects have been created and pushed to the `calls` array, the multicall library's `aggregate` function is called with the array of call objects and the configuration object. This function aggregates the results of all of the calls into a single object, which is stored in the `result` variable. Finally, the code logs the "result" to the console and calculates the time it took to receive the "result", which is also logged to the console. You will need to [install the multicall.js library](https://github.com/makerdao/multicall.js?files=1#installation) to run this code. 
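One detail worth highlighting in the call objects above is the `val => val / 10 ** 18` transform: `getEthBalance` returns a raw wei value, and the transform converts it to ETH. A standalone sketch of that conversion (the helper name and sample value are illustrative):

```javascript
// Convert a raw wei amount, as returned by getEthBalance, into ETH.
// Number() is fine for display; prefer BigInt math when exact precision matters.
function weiToEth(wei) {
  return Number(wei) / 10 ** 18;
}

// Hypothetical raw balance: 1.5 ETH expressed in wei.
console.log(weiToEth("1500000000000000000")); // 1.5
```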
## Performance comparison In this section, we compare the performance of 3 different approaches: * Sending multiple HTTP requests in parallel * Sending a batch HTTP request * Using a multicall contract We will test based on two common use cases: 1. Getting account balance 2. Calling a smart contract ## Getting account balance for 30 distinct addresses The testing script for batch requests and multicall contract is already included in the previous sections. Below is the code for sending multiple HTTP requests in parallel: ```js JavaScript const { Web3 } = require('web3'); const web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT'); const addressArr = [ "0x2B6ee955a98cE90114BeaF8762819d94C107aCc7", "0x2bB42C655DdDe64EE508a0Bf29f9D1c6150Bee5F" ] async function main() { const startTime = Date.now() console.log("started") for (let i = 0; i < addressArr.length; i++) { web3.eth.getBalance(addressArr[i]).then(function(result) { const timeFromStart = Date.now() - startTime console.log("Result received in: " + timeFromStart / 1000 + " seconds") }) } } main(); ``` ### Result | | Parallel single requests | Batch request | Multicall | | ------- | ------------------------ | ------------- | --------- | | Round 1 | 1.789 | 1.49 | 1.447 | | Round 2 | 1.896 | 1.159 | 1.54 | | Round 3 | 2.337 | 1.113 | 2.132 | | Round 4 | 2.942 | 1.224 | 1.609 | | Round 5 | 1.638 | 1.602 | 2.012 | The test was conducted between a server in Europe and a client in Singapore. A total of 15 measurements were averaged; the averages rank the approaches as batch request > multicall > parallel single requests. Compared with sending single requests in parallel, batch request reduces 38% of the total request time, and multicall reduces 18% of the total request time. ## Getting the owners of BAYC tokens Below are the testing scripts using web3.js for making smart contract calls.
The tests are based on an ERC-721 standard method [`ownerOf`](https://docs.openzeppelin.com/contracts/2.x/api/token/erc721) from BAYC’s smart contract. Sending multiple HTTP requests in parallel: ```js JavaScript const { Web3 } = require('web3'); const web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT'); const abi = [{ "inputs": [{ "internalType": "uint256", "name": "tokenId", "type": "uint256" }], "name": "ownerOf", "outputs": [{ "internalType": "address", "name": "", "type": "address" }], "stateMutability": "view", "type": "function" }] const contract = new web3.eth.Contract(abi, "0xBC4CA0EdA7647A8aB7C2061c2E118A18a936f13D"); async function main() { const startTime = Date.now() console.log("started") for (let i = 0; i < 30; i++) { contract.methods.ownerOf(i).call().then(function(result) { console.log(result) const timeFromStart = Date.now() - startTime console.log("result received in: " + timeFromStart / 1000 + " seconds") }) } } main(); ``` Sending batch request: ```js JavaScript const { Web3 } = require('web3'); const web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT'); const abi = [ { inputs: [{ internalType: "uint256", name: "tokenId", type: "uint256" }], name: "ownerOf", outputs: [{ internalType: "address", name: "", type: "address" }], stateMutability: "view", type: "function", }, ]; const contract = new web3.eth.Contract( abi, "0xBC4CA0EdA7647A8aB7C2061c2E118A18a936f13D" ); async function main() { const startTime = Date.now(); const batch = new web3.BatchRequest(); console.log("started"); // Array to hold promises for each request const promises = []; for (let i = 0; i < 30; i++) { const request = { jsonrpc: "2.0", id: i + 1, method: "eth_call", params: [ { to: contract.options.address, data: contract.methods.ownerOf(i).encodeABI(), }, "latest", ], }; // Add request to the batch and store the promise const requestPromise = batch.add(request); promises.push(requestPromise); } // Send the batch request and wait for all responses const responses = await
batch.execute(); // Process responses responses.forEach((response, index) => { if (response.error) { console.error(response.error); } else { const ownerAddress = web3.eth.abi.decodeParameter( "address", response.result ); const timeFromStart = Date.now() - startTime; console.log( `${index} token owner is ${ownerAddress} received in: ${ timeFromStart / 1000 } seconds` ); } }); } main(); ``` Multicall contract: ```js JavaScript const multicall = require("@makerdao/multicall") const config = { rpcUrl: "YOUR_CHAINSTACK_ENDPOINT", multicallAddress: "0xeefba1e63905ef1d7acba5a8513c70307c1ce441" }; async function main() { const startTime = Date.now() console.log("started") const calls = [] for (let i = 0; i < 30; i++) { const callObj = { target: "0xBC4CA0EdA7647A8aB7C2061c2E118A18a936f13D", call: ['ownerOf(uint256)(address)', i], returns: [ ['OWNER_ADDR ' + i] ] } calls.push(callObj) } const result = await multicall.aggregate(calls, config); console.log(result.results); const timeFromStart = Date.now() - startTime console.log("Result received in: " + timeFromStart / 1000 + " seconds") } main(); ``` ### Result | | Parallel single requests | Batch request | Multicall | | ------- | ------------------------ | ------------- | --------- | | Round 1 | 1.693 | 1.931 | 1.878 | | Round 2 | 1.717 | 1.592 | 1.195 | | Round 3 | 1.712 | 1.617 | 2.183 | | Round 4 | 2.103 | 1.589 | 1.3 | | Round 5 | 2.785 | 1.416 | 1.429 | The same test was conducted for read contract calls. The result shows that both batch requests and multicall contracts save around 20% of total request time compared with sending single requests. ## Common questions As an RPC provider, Chainstack counts each RPC call as a request. After a server receives an HTTP batch request, it “unpacks” the request and processes the calls separately. So from the server’s point of view, 1 batch request of 100 calls consumes 100 requests instead of 1.
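The counting rules can be summed up in a small, illustrative helper (not a real API, just the arithmetic):

```javascript
// Requests consumed on the provider side for N calls, per approach,
// following the consumption rules described above.
function requestsConsumed(approach, calls) {
  switch (approach) {
    case "parallel": return calls;  // one HTTP request per call
    case "batch": return calls;     // unpacked server-side: still one per call
    case "multicall": return 1;     // a single eth_call to the multicall contract
    default: throw new Error("unknown approach");
  }
}

console.log(requestsConsumed("batch", 100));     // 100
console.log(requestsConsumed("multicall", 100)); // 1
```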
Check the [Understanding your request consumption](https://support.chainstack.com/hc/en-us/articles/4412534652313-Understanding-your-request-consumption-over-HTTP-and-WebSocket) page on Chainstack support docs for more details. A multicall contract call, by contrast, counts as a single request even though it is a very heavy call. There are practical limits, though: the BAYC testing script stops working at 1,040 calls in a single multicall. ## Which approach works better for me Even though tests show that batch requests and multicall contracts improve performance significantly, they do have their own limitations. Requests in a batch request are executed in order, which means that if a new block is received during execution, the subsequent results are likely to be different. If you want to use a multicall contract, you should probably deploy your own contract for production just to ensure its availability. Both batch requests and multicall contracts return multiple results in a single response, which requires much more computational resources. They can easily trigger “request timeout” errors and “response size too big” errors, which makes them not suitable for complex calls. ### See also Developer Advocate @ Chainstack BUIDLs on Ethereum, zkEVMs, The Graph protocol, and IPFS [](https://twitter.com/wuzhongzhu) [](https://www.linkedin.com/in/wuzhong-zhu-44563589/) [](https://github.com/wuzhong-zhu) # Hybrid hosting Source: https://docs.chainstack.com/docs/hybrid-hosting Chainstack can be used as a solution 100% hosted in your data center or with any selected components deployed in the location of your choice. Ideal for clients prioritizing full control over their nodes, keys, and infrastructure, while seeking to avoid the complexity of developing proprietary orchestration tools.
### Example A software company utilizes Chainstack's interface (either through a graphical user interface or application programming interface) to coordinate and oversee blockchain nodes, networks, and services that operate exclusively within their custom-built infrastructure. ### Contact us for hybrid hosting Start today by sending [us a message](https://support.chainstack.com/). ## Chainstack-managed hosting vs hybrid hosting With Chainstack-managed hosting, you get the complete managed blockchain services experience: monitoring, logging, backups, deployment of fully synced nodes, and request metrics. With hybrid hosting, you get the nodes running in your own infrastructure. | Features | Chainstack-managed hosting | Hybrid hosting | | --------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | Monitoring | Complete node monitoring: node availability and node health. | Partial node monitoring: node availability is monitored. | | Analytics and logging | Full node logging and resource consumption logging. For Chainstack-managed hosting nodes, request metrics are collected: number of requests, a breakdown of method calls, and server response data. For dedicated Chainstack-managed hosting nodes, request metrics, CPU, memory, and storage usage data are collected. | Analytics solution setup on request. 
| | Backups | Daily automated backups that enable the provision of fully synced nodes in a matter of hours | You can use your own backup solution available on the market. | | Bolt | Dedicated nodes are deployed as synced and near-instantly available for use. | A node must be synced from scratch, which takes weeks or months. To reduce the sync time to hours, a manual snapshot transfer is also available on the [Enterprise subscription](https://chainstack.com/pricing/). | | Billing | You pay for the subscription and the node usage. | You pay for the subscription and node management fee. | | Cloud providers and regions | It is possible to use the default Chainstack cloud providers to run a node on. | You are not limited to the Chainstack supported cloud providers. Use your own cloud infrastructure. | | Benefits | Easy to deploy and manage nodes. Low operation costs. Near-instant node availability. | Blockchain is completely running in your cloud, which is useful in various scenarios, including compliance related concerns. For any application running on top of blockchain, the latency will benefit from the choice to colocate the application on the same server as the node. | # Implementing JWT validation in Golang for Chainstack marketplace integration Source: https://docs.chainstack.com/docs/implementing-jwt-validation-in-golang-for-chainstack-marketplace-integration **TLDR** * Demonstrates how to validate JWTs in Golang using Ed25519 public keys in PEM format – required for Chainstack Marketplace apps. * Explains parsing of EdDSA-signed JWT headers/payload with required audience matching and expiration checks. * Walks through environment variable setup with `.env`, retrieving the public key, decoding the token, and returning the claims for a successful validation. * Ensures robust security by verifying the token’s legitimacy and ensuring it is intended specifically for your application. 
# Introduction This guide is designed for developers who are integrating applications into the Chainstack Marketplace using Golang. The tutorial offers practical insights into working with JWTs that you'll obtain from Chainstack's validation endpoint. Within the context of a Golang application, you'll learn how to validate these tokens using public keys in PEM format. Notably, Chainstack employs the EdDSA (Edwards-curve Digital Signature Algorithm) for cryptographic operations, and this guide will get into how to validate JWTs to grant access to your users securely. ## What is JWT? JSON web token (JWT) is a compact, URL-safe means of representing claims between two parties. These claims are encoded as a JSON object and can be digitally signed or integrity-protected with a message authentication code (MAC) and/or encrypted. ## Why validate JWT? Validating a JWT is crucial for ensuring its integrity and authenticity. Failure to do so can lead to security risks like unauthorized access and data breaches. In the Chainstack Marketplace, users will use their Chainstack API key to obtain a JWT that is valid for one hour and includes an "audience" claim, which specifies the intended recipient of the token. ## What will you learn? This tutorial will guide you through the process of validating JWTs in a Golang application. We'll use the `golang-jwt/jwt` library for parsing and validating JWTs and the `golang.org/x/crypto/ed25519` library for cryptographic operations. By the end of this tutorial, you'll be able to: * Parse and decode Ed25519 public keys in PEM format. * Validate JWTs using Ed25519 public keys. * Verify the "audience" claim to ensure the JWT is intended for your application. Whether you're developing a new Golang application requiring JWT validation or integrating this feature into an existing project, this tutorial has you covered. 
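Before any validation code, it can help to see what a JWT physically is. The sketch below — plain JavaScript rather than Go, purely for illustration — builds a hypothetical token and base64url-decodes its payload segment; note that decoding proves nothing about authenticity, which is exactly why signature validation matters:

```javascript
// A JWT is three base64url segments: header.payload.signature.
// Decoding the payload reveals the claims without verifying anything.
function decodeJwtPayload(token) {
  const payloadSegment = token.split(".")[1];
  const json = Buffer.from(payloadSegment, "base64url").toString("utf8");
  return JSON.parse(json);
}

// Build a hypothetical, unsigned token just for demonstration.
const header = Buffer.from(JSON.stringify({ alg: "EdDSA", typ: "JWT" })).toString("base64url");
const payload = Buffer.from(JSON.stringify({ aud: "your-app-slug", iss: "chainstack" })).toString("base64url");
const token = `${header}.${payload}.signature-goes-here`;

console.log(decodeJwtPayload(token)); // { aud: 'your-app-slug', iss: 'chainstack' }
```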
## Prerequisites Before diving into the code, make sure you have the following installed: * Go 1.16 or higher * A text editor or IDE of your choice (e.g., Visual Studio Code) * Terminal or command prompt for running shell commands You should also have a basic understanding of: * Go programming language * JSON web tokens (JWT) * Public key infrastructure (PKI) ## Setting up the project Initialize a new Go project. Open your terminal and run the following command to create a new directory for your project and navigate into it: ```shell Shell mkdir go-jwt-validation && cd go-jwt-validation ``` Initialize Go module. Initialize a new Go module by running: ```shell Shell go mod init jwt-validation ``` Install required libraries. Install the necessary Go libraries by running: ```shell Shell go get github.com/golang-jwt/jwt go get golang.org/x/crypto/ed25519 go get github.com/joho/godotenv ``` Create a `.env` file. Create a `.env` file in the root directory of your project and add the public key you received from Chainstack and other environment variables if needed. ### Ensure the public key is in the privacy-enhanced mail (PEM) format PEM is a widely used encoding format for cryptographic objects such as keys and certificates. It is a base64 encoding of the binary distinguished encoding rules (DER) format with additional header and footer lines. In the code, the public key is stored in PEM format between the lines `-----BEGIN PUBLIC KEY-----` and `-----END PUBLIC KEY-----`. ```shell Shell JWT_PUBLIC_KEY="-----BEGIN PUBLIC KEY-----PUBLIC_KEY_HERE-----END PUBLIC KEY-----" ``` ### Why use a `.env` file? Using a `.env` file lets you separate your configuration variables from your code. This is beneficial for several reasons: * **Security** — sensitive information like keys should not be hard-coded into your application. * **Portability** — it's easier to manage and change configuration when it's separated from the code. 
* **Environment specificity** — you can have different `.env` files for different environments like development, testing, and production. ## The Chainstack Marketplace authentication flow Before diving into the code, it's important to understand the authentication flow in the Chainstack Marketplace. The user who has purchased the integration with your application will authorize requests using their Chainstack API key. Note that this API key is not the JWT token you need to validate. To validate the API key, you must call the [Retrieve Application Token](/reference/chainstack-platform-api-retrieve-token) API. Here's an example API call to retrieve the validated JWT: ```bash cURL curl --location --request POST '' \ --header 'Authorization: Bearer CHAINSTACK_API_KEY' ``` This API call will return a validated JWT with a validity period of 60 minutes. ## Claims in a Chainstack Marketplace JWT The JWT from Chainstack will contain several claims that provide information about the token and its intended usage. Below is a table detailing these claims: | Column | Data type | Description | | ------ | --------- | --------------------------------------------------------------------------------------------------- | | sub | string | Unique identifier of the Chainstack account that has installed the application. | | iss | string | The issuer of the token. In this case, it will be "chainstack". | | exp | number | Expiration time of the token in UTC timestamp seconds. Calculated as current time + 1 hour. | | nbf | number | The "not-before" time of the token in UTC timestamp seconds. Calculated as current time - 1 minute. | | iat | number | Issued-at time of the token in UTC timestamp seconds. | | aud | string | The agreed slug of the application. | | scope | string\[] | One or more scopes that should be included in the token payload. | The code we’ll show validates the JWT based on its timestamps and the expected audience.
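Before the Go implementation, the timestamp and audience checks just described can be sketched language-neutrally (JavaScript here for brevity; the function and claim values are hypothetical, not part of any library):

```javascript
// Check the time-based and audience claims of an already-decoded JWT payload.
// nowSeconds is the current UTC time in seconds, matching exp/nbf/iat units.
function checkClaims(claims, expectedAudience, nowSeconds) {
  if (claims.exp !== undefined && nowSeconds >= claims.exp) {
    return { valid: false, reason: "token expired" };
  }
  if (claims.nbf !== undefined && nowSeconds < claims.nbf) {
    return { valid: false, reason: "token not yet valid" };
  }
  if (claims.aud !== expectedAudience) {
    return { valid: false, reason: "invalid audience" };
  }
  return { valid: true };
}

// Hypothetical claims: issued now, valid from a minute ago, expires in one hour.
const now = 1700000000;
const claims = { aud: "your-app-slug", iat: now, nbf: now - 60, exp: now + 3600 };
console.log(checkClaims(claims, "your-app-slug", now)); // { valid: true }
```

The signature check — the part that actually proves authenticity — is what the Go code handles with the Ed25519 public key.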
## The code Where you initialized the project, create a new file named `validation.go` and paste the following code: ```go validation.go package main import ( "crypto/x509" "encoding/pem" "fmt" "os" "strings" // Added this package for string manipulation "github.com/joho/godotenv" "github.com/golang-jwt/jwt" "golang.org/x/crypto/ed25519" ) // loadEnvVars loads environment variables from .env file func loadEnvVars() error { err := godotenv.Load() if err != nil { return fmt.Errorf("error loading .env file: %w", err) } return nil } // getPublicKey retrieves the public key from an environment variable func getPublicKey() (ed25519.PublicKey, error) { publicPEM := os.Getenv("JWT_PUBLIC_KEY") if publicPEM == "" { return nil, fmt.Errorf("Environment variable for public key is not set") } // Convert single-line PEM to multi-line PEM if needed publicPEM = strings.Replace(publicPEM, "-----BEGIN PUBLIC KEY-----", "-----BEGIN PUBLIC KEY-----\n", 1) publicPEM = strings.Replace(publicPEM, "-----END PUBLIC KEY-----", "\n-----END PUBLIC KEY-----", 1) block, _ := pem.Decode([]byte(publicPEM)) if block == nil { return nil, fmt.Errorf("Failed to parse PEM block containing the public key") } pubKey, err := x509.ParsePKIXPublicKey(block.Bytes) if err != nil { return nil, fmt.Errorf("Failed to parse public key: %w", err) } ed25519PubKey, ok := pubKey.(ed25519.PublicKey) if !ok { return nil, fmt.Errorf("Failed to convert public key to Ed25519 public key") } return ed25519PubKey, nil } // validateJWT validates the JWT token func validateJWT(ed25519PubKey ed25519.PublicKey, jwtToken, expectedAudience string) error { token, err := jwt.Parse(jwtToken, func(token *jwt.Token) (interface{}, error) { return ed25519PubKey, nil }) if err != nil { return fmt.Errorf("Failed to parse JWT token: %w", err) } // Print the JWT headers fmt.Println("JWT Headers:", token.Header) if claims, ok := token.Claims.(jwt.MapClaims); ok && token.Valid { audience, audienceOk := claims["aud"].(string) if audienceOk && 
audience == expectedAudience { fmt.Println("Decoded Payload:", claims) fmt.Println("Signature is valid.") return nil } else { return fmt.Errorf("Invalid Token: Invalid audience") } } else { return fmt.Errorf("Invalid Token: Token is not valid") } } func main() { err := loadEnvVars() if err != nil { fmt.Printf("Error occurred: %v\n", err) return } ed25519PubKey, err := getPublicKey() if err != nil { fmt.Printf("Error occurred: %v\n", err) return } // Add the JWT to validate jwtToken := "JWT_TO_VALIDATE" expectedAudience := "YOUR_COOL_APP" err = validateJWT(ed25519PubKey, jwtToken, expectedAudience) if err != nil { fmt.Printf("Error occurred: %v\n", err) return } } ``` ## Understanding the code ### Import statements The code starts by importing the necessary packages: ```go validation.go import ( "crypto/x509" "encoding/pem" "fmt" "os" "strings" "github.com/joho/godotenv" "github.com/golang-jwt/jwt" "golang.org/x/crypto/ed25519" ) ``` * `crypto/x509` and `encoding/pem` — these are standard Go libraries used for parsing the PEM-encoded public key. * `fmt` — standard Go package for formatted I/O operations. * `os` — standard Go package for interacting with the operating system, used here for reading environment variables. * `strings` — standard Go package for string manipulation. * `github.com/joho/godotenv`— this library loads environment variables from a `.env` file. * `github.com/golang-jwt/jwt` — this library is used for parsing and validating JWTs. * `golang.org/x/crypto/ed25519` — this library provides the cryptographic operations for the Ed25519 algorithm. ### Functions and their roles #### `loadEnvVars()` This function loads environment variables from a `.env` file into the program. It uses the `godotenv.Load()` method to read the `.env` file and populate the environment variables. If an error occurs, it returns an error wrapped with additional context. 
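A detail of the Go code above worth isolating is the single-line-to-multi-line PEM normalization: a PEM key stored on one line in `.env` needs its header and footer line breaks restored before a parser will accept it. A JavaScript sketch of the same string manipulation (the key body is a placeholder):

```javascript
// Restore the newline after the PEM header and before the footer,
// mirroring the strings.Replace calls in the Go code above.
function normalizePem(pem) {
  return pem
    .replace("-----BEGIN PUBLIC KEY-----", "-----BEGIN PUBLIC KEY-----\n")
    .replace("-----END PUBLIC KEY-----", "\n-----END PUBLIC KEY-----");
}

const oneLine = "-----BEGIN PUBLIC KEY-----BASE64_KEY_BODY-----END PUBLIC KEY-----";
console.log(normalizePem(oneLine));
```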
#### `getPublicKey()` This function retrieves the public key stored in the `.env` file as an environment variable. It performs several steps: Reads the `JWT_PUBLIC_KEY` environment variable. Checks if the variable is empty and returns an error if it is. Converts the single-line PEM to multi-line PEM format if needed. Decodes the PEM block to get the public key bytes. Parses the public key bytes to get an `ed25519.PublicKey` object. When validating the JWT, we are using the Ed25519 public key to validate a signature that was generated using the EdDSA algorithm with the Ed25519 parameters. #### `validateJWT()` This function validates the JWT token. It takes the Ed25519 public key, the JWT token string, and the expected audience as parameters. It performs the following steps: Parse the JWT. Utilizes the `jwt.Parse` function from the `github.com/golang-jwt/jwt` library to parse the JWT. This function not only decodes the token but also validates it against a series of standard claims: * `exp` — expiration time of the token. If the token is expired, the function will return an error. * `nbf` — not-before time, indicating the earliest time the token is valid. * `iat` — issued-at time, indicating when the token was created. The function uses the provided Ed25519 public key for cryptographic validation of the token's signature. Check audience. Explicitly checks the `aud` claim in the token to ensure it matches the expected audience. This is an additional validation layer on top of the default checks performed by `jwt.Parse`. This claim ensures the user is allowed to call your app specifically. Output. If the token is valid, it prints the decoded payload and a validation success message. If the token is invalid for any reason (e.g., expired, wrong audience, etc.), an error message is returned. #### `main()` This is the entry point of the program. It orchestrates the other functions: 1. Calls `loadEnvVars()` to load the environment variables. 2. 
Calls `getPublicKey()` to retrieve the public key. 3. Calls `validateJWT()` to validate the JWT. ### How to run the code 1. **Validate a JWT**. To validate a JWT, you'll need an authorized API key. Use the [appropriate endpoint](/reference/chainstack-platform-api-retrieve-token) to validate the API key and obtain a JWT. Note that in a production setting, you'll either need to require users to validate and send the JWT first or implement a flow in your app that validates the API key dynamically. 2. **Add JWT and audience to the script**. Once you've validated the JWT, add it to the corresponding variable inside the `main` function, along with your expected audience. ```go go jwtToken := "VALIDATED_JWT" expectedAudience := "YOUR_EXPECTED_AUDIENCE" ``` 3. **Run the program directly**. Open your terminal, navigate to the directory containing `validation.go`, and run: ```shell Shell go run validation.go ``` This will compile and run your code in one step. If everything is set up correctly, you should see the decoded payload and a validation success message. ### Typical responses 1. **Success**. If the JWT is valid and the audience matches, you'll see output similar to the following: ``` JWT Headers: map[alg:EdDSA typ:JWT] Decoded Payload: map[aud:EXPECTED_AUDIENCE exp:TIMESTAMP iat:TIMESTAMP iss:ISSUER nbf:TIMESTAMP scope:[com.APP.api.Paid] sub:SUBJECT] Signature is valid. ``` The actual values will vary based on the JWT and your application. At this point, you have a flow to validate JWTs and give access to users! 2. **Failure**. If the JWT is expired, you'll see an error message like: ``` Error occurred: Failed to parse JWT token: Token is expired ``` ## Conclusion In this comprehensive guide, we've explored the essential steps for integrating applications into the Chainstack Marketplace using Golang. We got into the intricacies of JWT validation, a critical aspect of ensuring secure and authorized access to your application.
From setting up your Go project and installing the necessary libraries to understanding the Chainstack Marketplace's authentication flow and claims, we've covered it all. We also explored using the Ed25519 public key in conjunction with the EdDSA algorithm for cryptographic validation. This is particularly important given Chainstack's use of the EdDSA algorithm for its JWTs.

By following this guide, you should now be equipped with the knowledge and code snippets needed to validate JWTs effectively in your Golang applications. Remember, security is not just a feature but a necessity. Validating JWTs is a fundamental step in safeguarding your application and its users. So, go ahead and implement these best practices in your application to ensure it's as secure as it can be.

### About the author

Developer Advocate @ Chainstack
BUIDLs on EVM, The Graph protocol, and Starknet
Helping people understand Web3 and blockchain development
[](https://github.com/soos3d) [](https://twitter.com/web3Dav3) [](https://www.linkedin.com/in/davide-zambiasi/)

# Introducing Bun: The future of JavaScript runtimes

Source: https://docs.chainstack.com/docs/introducing-bun-the-future-of-javascript-runtimes

**TLDR**

* Bun is a fast, all-in-one JavaScript runtime with native server and hot reload features.
* You can fetch Ethereum data using Bun’s built-in fetch API and a Chainstack node – no extra packages needed.
* Quickly set up a server with Bun.serve() and handle requests to retrieve an address’s ETH balance.
* Bun’s .env support, TypeScript/JSX readiness, and zero dependencies make it ideal for a streamlined dev workflow.

## Main article

JavaScript's landscape just got more exciting with Bun's debut. Hitting the scene with version 1.0, Bun is a speedy toolkit that simplifies running, building, testing, and debugging JavaScript and TypeScript projects. It's a fresh take on the cluttered world of JS tooling since node.js came around.
If you're a dev seeking a snappier tool or a sleek node.js alternative, Bun's got you covered. With its wide file support, built-in web APIs, and impressive speed, Bun is set to shake up JavaScript runtimes.

Dive into this guide as we explore Bun and craft a simple DApp using its APIs. And here's the kicker: we'll spin up a server to get blockchain data with zero dependencies.

## Bun: A comprehensive JavaScript toolkit

Before starting the project, let’s briefly review what makes Bun such an interesting tool.

* All-in-one toolkit. Bun is a unified toolkit designed for running, building, testing, and debugging JavaScript and TypeScript applications, eliminating the need for multiple tools and dependencies.
* Server capabilities:
  * `Bun.serve()`. Easily spin up an HTTP or WebSocket server using familiar web-standard APIs like `Request` and `Response`.
  * Performance. Bun can serve up to 4x more requests per second than node.js and 5x more messages per second than ws on node.js, as per Bun documentation.
  * TLS configuration. Secure your server with built-in TLS options.
* Native .env support. Bun natively reads `.env` files, removing the need for third-party packages like `dotenv` and `cross-env`.
* Hot reloading:
  * Built-in watch mode. Use `bun --hot` to enable hot reloading, which automatically reloads your application when files change.
  * Efficient reloading. Unlike tools like `nodemon`, Bun reloads your code without terminating the old process, preserving HTTP and WebSocket connections and maintaining state.
* Additional features:
  * Native web APIs. Built-in support for web standard APIs available in browsers, such as `fetch`, `Request`, `Response`, `WebSocket`, and `ReadableStream`.
  * TypeScript and JSX support. Run JavaScript, TypeScript, and JSX/TSX files directly without any dependencies.
  * Module compatibility.
Supports both CommonJS and ES modules without any configuration hassles.
  * Plugins. Highly customizable with a plugin API inspired by `esbuild`, allowing for custom loading logic and support for additional file types.
  * Optimized APIs. Bun offers highly optimized, standard-library APIs designed for speed and ease of use, outperforming many node.js counterparts.

## Project overview: Ethereum balance checker with Bun

At its core, the project aims to provide a creative example of how you can use the new Bun JavaScript runtime to work with DApps, highlighting the fact that you can make a functioning tool with zero extra dependencies.

In this guide, we'll walk you through creating a DApp designed to fetch the balance of an Ethereum address. Leveraging Bun, we'll set up a server that awaits user requests containing an Ethereum address. The application will then use a Chainstack Ethereum RPC node to get the balance. The retrieved balance will then be decoded and presented to the user as a structured JSON object looking like the following:

```Json JSON
{
  "address": "0xae2Fc483527B8EF99EB5D9B44875F005ba1FaE13",
  "balance": "162.140649034802429952",
  "unit": "Ether"
}
```

## Prerequisites

* Bun installed on your machine. Bun is available on [macOS, Linux, and Windows](https://bun.sh/docs/installation).
* A Chainstack account to deploy an Ethereum node. [Sign up with Chainstack](https://console.chainstack.com/user/account/create). [Deploy a node](/docs/manage-your-networks#join-a-public-network). [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials).

## Getting started with the project

Once you've installed Bun and secured an Ethereum endpoint from Chainstack, it's time to set up your project. Create a new directory and initiate it with the following:

```shell Shell
bun init
```

This process is straightforward, especially if you're familiar with `npm init`.
Here's how I configured mine:

```shell Shell
package name (address_check):
entry point (index.ts): server.js

Your package.json is now created in the directory. You'll also find:
 + server.js
 + .gitignore
 + jsconfig.json (helpful for editor auto-complete)
 + README.md

To start your project, simply run:
  bun run server.js
```

With that, your Bun project is set up, complete with a `.gitignore`, README, and main file, which in this case is named `server.js`.

## Setting up the `.env` file

Next up, we're going to set up a `.env` file. This is where we'll store our Chainstack endpoint. Just create a new `.env` file and add this inside:

```bash Bash
CHAINSTACK_NODE_URL='YOUR_CHAINSTACK_ENDPOINT'
```

This way, our Chainstack endpoint stays safe. With Bun, we can directly get environment variables from its environment; no extra packages are needed.

## The code

Now it’s time for the code; in the `server.js` file already created for us by Bun, paste the following:

```javascript server.js
const PORT = 5555;
const ETHEREUM_ADDRESS_REGEX = /^0x[a-fA-F0-9]{40}$/;
const CHAINSTACK_NODE_URL = Bun.env.CHAINSTACK_NODE_URL;
const JSON_HEADERS = { "Content-Type": "application/json" };

function isValidEthereumAddress(address) {
  return ETHEREUM_ADDRESS_REGEX.test(address);
}

async function fetchFromEthereumNode(address) {
  console.log(`Calling Ethereum node with address: ${address}\n`);
  const response = await fetch(CHAINSTACK_NODE_URL, {
    method: "POST",
    headers: {
      "accept": "application/json",
      "content-type": "application/json",
    },
    body: JSON.stringify({
      id: 1,
      jsonrpc: "2.0",
      method: "eth_getBalance",
      params: [address, "latest"],
    }),
  });
  const responseBody = await response.json();
  console.log(
    `Received response from Ethereum node: ${JSON.stringify(responseBody)}\n`
  );
  if (!response.ok) {
    throw new Error("Error fetching balance");
  }
  if (responseBody.error) {
    throw new Error(
      responseBody.error.message || "Error in Ethereum node response"
    );
  }
  return responseBody;
}

function
convertWeiToEther(weiValue) {
  const divisor = BigInt("1000000000000000000");
  const wholeEthers = weiValue / divisor;
  const remainderWei = weiValue % divisor;
  const remainderEther = remainderWei.toString().padStart(18, "0");
  return `${wholeEthers}.${remainderEther}`;
}

async function getEthereumBalance(address) {
  const responseBody = await fetchFromEthereumNode(address);
  // Convert the hex result directly to BigInt; going through parseInt
  // would lose precision for balances above Number.MAX_SAFE_INTEGER.
  const weiValue = BigInt(responseBody.result);
  return convertWeiToEther(weiValue);
}

function logAndReturnResponse(status, content) {
  console.log(
    `Sending response back to user: ${status} ${JSON.stringify(content)} \n`
  );
  return new Response(JSON.stringify(content), {
    status: status,
    headers: JSON_HEADERS,
  });
}

Bun.serve({
  port: PORT,
  async fetch(request) {
    console.log(`Received request: ${request.method} ${request.url}\n`);
    const urlObject = new URL(request.url);
    const pathname = urlObject.pathname;
    try {
      if (request.method === "GET" && pathname.startsWith("/getBalance/")) {
        const address = pathname.split("/getBalance/")[1];
        if (!isValidEthereumAddress(address)) {
          return logAndReturnResponse(400, {
            error: "Invalid Ethereum address format",
          });
        }
        const balance = await getEthereumBalance(address);
        return logAndReturnResponse(200, {
          address: address,
          balance: balance,
          unit: "Ether",
        });
      }
      return logAndReturnResponse(404, { error: "Endpoint does not exist" });
    } catch (error) {
      console.error(`Error occurred: ${error.message}`);
      return logAndReturnResponse(500, { error: error.message });
    }
  },
});

console.log(`Bun server running on port ${PORT}...`);
```

Let us further explain the code.

### 1. Setting up constants

The beginning sets up some constants, like the port number, which you can adapt to your use case, and the Chainstack RPC URL picked up from the environment variables.
```jsx server.js
const PORT = 5555;
const ETHEREUM_ADDRESS_REGEX = /^0x[a-fA-F0-9]{40}$/;
const CHAINSTACK_NODE_URL = Bun.env.CHAINSTACK_NODE_URL;
const JSON_HEADERS = { "Content-Type": "application/json" };
```

* `PORT` — the port number on which our server will listen.
* `ETHEREUM_ADDRESS_REGEX` — a regular expression to validate Ethereum addresses.
* `CHAINSTACK_NODE_URL` — the URL of the Ethereum node we'll be querying. This is fetched from an environment variable. Note that the environment variable is taken directly from the Bun environment using `Bun.env`.
* `JSON_HEADERS` — standard JSON headers used in HTTP responses.

### 2. Utility functions

#### `isValidEthereumAddress(address)`

This function checks if a given string matches the Ethereum address format, and the server will return an error if the pattern doesn’t match; it is good practice and enhances UX.

#### `convertWeiToEther(weiValue)`

The smallest unit in Ethereum is named `Wei`. This function converts a value in wei to its equivalent in ether, considering that 1 ether = 10^18 wei, and it is used to return a human-readable value to the user.

### 3. Fetching data from Ethereum node

#### `fetchFromEthereumNode(address)`

This function does the following:

* Logs the Ethereum address it's about to query.
* Sends a POST request to the Ethereum node to fetch the balance of the address using [eth\_getBalance](/reference/ethereum_node_api/accounts_info/eth_getBalance).
* Checks and logs the response from the Ethereum node.
* Throws errors if the response is not successful or if there's an error in the Ethereum node's response.
* Returns the response body if everything is okay.

### 4. Getting Ethereum balance

#### `getEthereumBalance(address)`

This function orchestrates the process of fetching the balance:

* Calls `fetchFromEthereumNode(address)` to get the balance in wei.
* Converts the wei value to ether.
* Returns the balance in ether.

### 5.
Logging and sending responses

#### `logAndReturnResponse(status, content)`

This utility function logs the response status and content and returns the response to the client.

### 6. Server logic

The `Bun.serve` function sets up the server:

* It first logs any incoming request.
* If the request is a GET request to the `"/getBalance/"` endpoint, it processes it the following way:
  * Validates the Ethereum address
  * Fetches the balance using `getEthereumBalance(address)`
  * Sends back the balance in ether
* If the endpoint doesn't match, it sends a 404 error.
* Any errors during the process are caught, and a 500 error is returned.

Finally, a log statement indicates that the server is running and listening on the specified port. So, as you can see, we run everything within Bun. `Bun.serve` spins up a server with no extra dependencies.

## Running the server with hot reloading

Your server setup is complete, and it's time to get it up and running. For an enhanced development experience, start the server with the `--hot` flag. This activates Bun's hot reloading feature, ensuring that any modifications you make to your modules or files are instantly reflected on the server without manual restarts.

When using `Bun.serve()` for HTTP server tasks, Bun has you covered. It smartly identifies any changes and refreshes your fetch handler, all without rebooting the entire Bun process. This results in almost instantaneous hot reloads, optimizing your development flow.

Start the server with the following command:

```shell Shell
bun --hot server.js
```

Upon execution, you should see:

```shell Shell
[0.59ms] ".env"
Bun server running on port 5555...
```

With your server active, you're all set to test your DApp.
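Incidentally, the routing decision inside the `fetch` handler can be exercised without a running server. The sketch below mirrors it with plain objects instead of `Response` objects; the `route` helper and the `stub` node call are ours, for illustration, not part of the article's code.

```javascript
// A Bun-independent sketch of the server's routing decision. It returns
// plain {status, body} objects so the logic can be tested in isolation.
const ETH_ADDR = /^0x[a-fA-F0-9]{40}$/;

function route(method, pathname, getBalance) {
  if (method === "GET" && pathname.startsWith("/getBalance/")) {
    const address = pathname.split("/getBalance/")[1];
    if (!ETH_ADDR.test(address)) {
      return { status: 400, body: { error: "Invalid Ethereum address format" } };
    }
    return { status: 200, body: { address, balance: getBalance(address), unit: "Ether" } };
  }
  return { status: 404, body: { error: "Endpoint does not exist" } };
}

// Stub out the node call to see the three outcomes without a network.
const stub = () => "1.000000000000000000";
console.log(route("GET", "/getBalance/0x" + "a".repeat(40), stub).status); // 200
console.log(route("GET", "/getBalance/0x123", stub).status);               // 400
console.log(route("POST", "/other", stub).status);                         // 404
```

Swapping the stub for the real `getEthereumBalance` gives back the behavior of the full server.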
You can use tools like [Postman](https://www.postman.com/), or the following curl request:

```bash cURL
curl --location 'localhost:5555/getBalance/0xae2Fc483527B8EF99EB5D9B44875F005ba1FaE13'
```

This prompts the server to retrieve the balance from the Ethereum node, returning:

```Json JSON
{
  "address": "0xae2Fc483527B8EF99EB5D9B44875F005ba1FaE13",
  "balance": "148.991988684657950720",
  "unit": "Ether"
}
```

While the console will log the following:

```shell Shell
Received request: GET http://localhost:5555/getBalance/0xae2Fc483527B8EF99EB5D9B44875F005ba1FaE13

Calling Ethereum node with address: 0xae2Fc483527B8EF99EB5D9B44875F005ba1FaE13

Received response from Ethereum node: {"jsonrpc":"2.0","id":1,"result":"0x813ade050b80ebb48"}

Sending response back to user: 200 {"address":"0xae2Fc483527B8EF99EB5D9B44875F005ba1FaE13","balance":"148.991988684657950720","unit":"Ether"}
```

You've just crafted a streamlined API to fetch Ethereum balances using Bun without additional dependencies.

## Conclusion

In the ever-evolving landscape of JavaScript, Bun emerges as a promising toolkit that simplifies and streamlines the development process. Through this guide, we've witnessed the power and efficiency of Bun, crafting a DApp to fetch Ethereum balances with minimal fuss and zero extra dependencies. The ease with which we can set up servers, integrate with blockchain nodes, and benefit from hot reloading showcases Bun's potential to become a staple in the developer's toolkit.

As we move forward, we must keep an eye on tools like Bun that prioritize developer experience and performance. Whether you're a seasoned developer or just starting, embracing such tools can significantly enhance your productivity and the quality of your projects.
# Introduction to smart contract auditing using Foundry

Source: https://docs.chainstack.com/docs/introduction-to-smart-contract-manual-auditing-with-foundry-and-slither

**TLDR**

* Smart contract auditing is vital due to the immutable nature of on-chain code – once deployed, bugs and vulnerabilities can’t be easily patched.
* Manual auditing (plus tools like Slither and Foundry) helps discover issues such as reentrancy, overflow/underflow, or weak pseudo-randomness before they cause real damage.
* Following best practices—like checks-effects-interactions, mutex locks, proper random number generation, and robust testing—strengthens contract security.
* Thoroughly documenting discovered vulnerabilities and their fixes fosters trust and ongoing improvement in the auditing process.

## Main article

Smart contract auditing is the process of reviewing and evaluating the code of a smart contract to identify potential security vulnerabilities, bugs, and other issues that may impact the contract's functionality. There are two main types of auditing: manual auditing and automated auditing. Manual auditing involves reviewing the code line-by-line and using tools like Slither to identify potential issues. Automated auditing involves using software tools to scan the code and identify potential vulnerabilities. The need to audit smart contracts is critical, as smart contracts are immutable and can cause significant harm if they contain security vulnerabilities.

In this project, we will provide an overview of smart contract auditing, with a focus on manual auditing techniques. We will also cover common attack vectors, such as reentrancy, replay attacks, and overflows, and provide code examples and snippets to demonstrate how to use tools like Slither in auditing contracts, as well as how to test and report on identified issues.
The purpose of smart contract auditing is to guarantee that the contract works as it should and that the code is safe from malicious attacks and unforeseen outcomes.

### Why audit smart contracts?

Auditing smart contracts is critical, as smart contracts are self-executing and immutable once deployed on the blockchain. This means that once a smart contract is deployed, its code and behavior cannot be changed, making it important to identify and resolve any security vulnerabilities before deployment.

Smart contracts are often used to manage and transfer valuable assets, such as cryptocurrencies, on the blockchain. If a smart contract contains security vulnerabilities, it can be exploited by malicious actors, leading to the loss of these assets. For example, a smart contract that has a vulnerability that allows someone to steal funds from the contract can result in a significant financial loss for the contract's users.

Additionally, smart contracts are designed to be autonomous and self-executing, meaning they can perform actions automatically without the need for human intervention. This is both a strength and a weakness of smart contracts, as a vulnerability in the code can cause the contract to behave in unintended ways, such as sending funds to the wrong address or executing code in an infinite loop.

In short, auditing smart contracts is important because it helps identify and resolve potential security vulnerabilities and bugs, ensuring that the contract behaves as intended and provides the necessary security for users and their assets.

## Auditing smart contracts manually with Slither and Foundry

### What is Slither?

Slither is a tool for making smart contracts more secure. It was created by **Trail Of Bits** and was first released to the public in 2018. Slither is a static analyzer written in Python that looks for potential problems in the code of a smart contract.
Slither has different types of checks built into it to help find different kinds of security issues. It also provides information about the details of the smart contract and has an API that makes it easy to add custom checks. This tool helps developers find and fix problems, understand the code better, and create custom checks as needed.

Slither works by analyzing the code of a smart contract and looking for specific patterns and code snippets that are known to be vulnerable. Once Slither has identified potential vulnerabilities in the code, it generates a report that highlights the issues and provides recommendations for how to fix them.

### Install Slither

Slither requires [solc](https://docs.soliditylang.org/en/latest/installing-solidity.html), the Solidity compiler, and [Python 3.8+](https://www.python.org/downloads/). I also recommend creating a new Python virtual environment for this project:

```shell Shell
python3 -m venv audits
```

Install Slither using pip by running the following command:

```shell Shell
pip3 install slither-analyzer
```

To learn more about other installation methods, check out the [Slither repository](https://github.com/crytic/slither#how-to-install). Next, install the [Slither extension](https://marketplace.visualstudio.com/items?itemName=trailofbits.slither-vscode) in VS Code.

### What is Foundry?

Foundry is a toolkit for building applications on the Ethereum blockchain. It's written in a programming language called Rust and is designed to be fast, flexible, and easy to use. Foundry is made up of several different tools that work together to make building and testing Ethereum applications easier. Some of the tools include:

* Forge — a testing framework for Ethereum applications, similar to other testing frameworks like Truffle, Hardhat, and DappTools.
* Cast — a tool that helps you interact with smart contracts on the Ethereum blockchain. You can use it to send transactions and get information about the blockchain.
* Anvil — a local Ethereum node, similar to Ganache and Hardhat Network, that you can use to test your applications.
* Chisel — a Solidity REPL (Read-Eval-Print-Loop) that lets you test and run Solidity code in a fast, efficient, and verbose way.

Each of these tools is designed to make different parts of the Ethereum development process easier and more efficient, and when used together, they provide a comprehensive toolkit for building and testing Ethereum applications. Learn more about Foundry from our [workshop with Learn Web3 DAO](https://github.com/chainstacklabs/learnweb3dao-foundry-workshop).

### Install Foundry

To get started with Foundry, you need to install a tool called `foundryup`. Here's how:

1. Open your terminal and run this command (for Linux and macOS):

   ```bash cURL
   curl -L https://foundry.paradigm.xyz | bash
   ```

2. Once you've got the installation script, open a new terminal or make sure your PATH is up-to-date. Then, run the following command:

   ```shell Shell
   foundryup
   ```

That's it! You're now ready to start using Foundry. If you require additional information or assistance, please visit the Foundry [repository](https://github.com/foundry-rs/foundry#installation) or check out the [Foundry book](https://book.getfoundry.sh/getting-started/installation).

## Auditing smart contracts

Before we dive into the smart contract auditing process, let's take a moment to familiarize ourselves with the steps involved. The auditing journey typically unfolds as follows:

1. **Test execution**. We kick off the process by running a series of tests on the smart contract code. This helps us spot any potential hiccups that might be lurking in the code.
2. **Specification and documentation review**. Next, we immerse ourselves in the specifications and documentation of the smart contract. This deep dive gives us a comprehensive understanding of the contract's inner workings.
3. **Fast tool utilization**.
Here, we employ rapid-fire tools like Slither to swiftly pinpoint potential vulnerabilities and security issues in the code.
4. **Manual analysis**. Post the automated tests and tool usage, we conduct a meticulous manual analysis of the code. This step helps us catch any issues that might have slipped through the automated checks.
5. **Discussion**. We then engage in a detailed discussion about the identified issues. This dialogue ensures a thorough understanding of the problem and helps us chart out the most effective course of action.
6. **Report compilation**. The final step involves crafting a comprehensive report that documents all the identified issues, along with recommendations for rectifying the problems. This report serves as a valuable reference for future updates or audits of the smart contract.

Let's jump into auditing a code base! Remember, we'll be working on the project using Foundry and Slither. [Find the repository on GitHub](https://github.com/chainstacklabs/smart-contracts-audit-foundry-slither).

First, clone the repository:

```shell Shell
$ git clone https://github.com/chainstacklabs/smart-contracts-audit-foundry-slither.git
$ cd smart-contracts-audit-foundry-slither
$ forge install
$ forge build
```

In the `audit/src` directory, there are three contracts written in Solidity. We will be using these contracts for this tutorial. You can do static analysis with Slither in two ways. You can either use it in the terminal or as an [extension](https://marketplace.visualstudio.com/items?itemName=trailofbits.slither-vscode) in VS Code. We'll explain both options so it's easy to understand.

In your terminal, run the `slither` command:

```shell Shell
slither .
```

The command `slither .` analyzes all Solidity files within the current directory, as denoted by the `.` symbol. Slither scans these files, generating a comprehensive report that outlines potential vulnerabilities, bugs, and areas of concern regarding code quality within the smart contracts.
You will find the results categorized into three groups: high vulnerabilities (red), medium vulnerabilities (yellow), and low vulnerabilities (green). You can conveniently get the same outcome by using the Slither extension in VS Code. Simply click the Slither icon and proceed to run it.

Example of Slither report

Slither found two high-level risks:

* `Lottery.endLottery() (src/OverUnderFlowVul.sol#25-33) uses a weak PRNG`
* `Reentrancy in ReentrancyExample.withdraw(uint256) (src/ReentrancyExample.sol#11-16)`

High-level risks example

If you examine closely, you'll see that Slither missed a major problem in the `OverUnderFlowVul.sol` contract and `ReplayVul.sol`. That's why it's important to also manually go through the code one line at a time. Please note that the following examples are for educational purposes only, and modern versions of Solidity might already include checks to avoid some of the issues explained here.

### Overflow and underflow vulnerabilities

Let’s go over `OverUnderFlowVul.sol`, which is a lottery contract, first.
```solidity solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

contract Lottery {
    address public owner;
    uint public jackpot;
    mapping(address => uint) public balances;
    address[] public players;
    bool public isEnded;

    constructor() {
        owner = msg.sender;
        jackpot = 0;
        isEnded = false;
    }

    function buyTicket() public payable {
        require(!isEnded, "Lottery has ended");
        require(msg.value == 1 ether, "Please send 1 ether to buy a ticket");
        players.push(msg.sender);
        balances[msg.sender] += msg.value;
        jackpot += msg.value;
    }

    function endLottery() public {
        require(msg.sender == owner, "Only owner can end the lottery");
        require(!isEnded, "Lottery has already ended");
        isEnded = true;
        uint winnerIndex = uint(blockhash(block.number - 1)) % players.length;
        address winner = players[winnerIndex];
        balances[winner] += jackpot;
        jackpot = 0;
    }
}
```

After thoroughly reviewing the contract, it has been discovered that there are potential vulnerabilities with both overflow and underflow.

* An overflow vulnerability exists in the jackpot variable. If the sum of all ticket purchases exceeds the maximum value that can be stored in a `uint` variable, which is 2^256-1, the jackpot variable will reset to zero. This can result in the winner receiving an incomplete prize and cause unexpected behavior in other areas of the contract. 2^256-1 is a very big number, but it’s better to cover every possibility.
* In the `balances[msg.sender] += msg.value;` line, the balance is also susceptible to overflow.
* An underflow vulnerability exists in the balance mapping. If a player withdraws more funds than they have in their balance, the balance will underflow and wrap around to the maximum value of a `uint` variable. This allows them to withdraw a large amount of funds they do not own and could be exploited by malicious actors to steal funds from the contract.
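The wraparound behavior behind these findings is easy to simulate off-chain. The sketch below is ours (not from the audit repository): it uses JavaScript BigInt to mimic 256-bit unsigned arithmetic as it behaves without checks (Solidity pre-0.8 or `unchecked` blocks), along with the `require(a + b > a)`-style guard pattern.

```javascript
// Simulating Solidity's 256-bit unsigned arithmetic with BigInt,
// as it behaves when overflow checks are absent (pre-0.8 / unchecked).
const UINT256_MAX = 2n ** 256n - 1n;
const MOD = 2n ** 256n;

// Unchecked addition: the sum silently wraps around modulo 2^256.
const addWrapping = (a, b) => (a + b) % MOD;

// The require(jackpot + msg.value > jackpot) pattern: if the wrapped sum
// is not strictly larger than the original value, an overflow occurred.
const overflowed = (a, b) => !(addWrapping(a, b) > a);

console.log(overflowed(1000n, 1n));       // false -- normal addition
console.log(overflowed(UINT256_MAX, 1n)); // true  -- wraps around to 0
```

Solidity 0.8+ makes checked arithmetic the default, which is why such explicit guards mostly matter for older compilers and `unchecked` code paths.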
*To prevent these vulnerabilities, the contract could add additional checks to ensure that the jackpot and balance variables do not overflow or underflow. For example, the contract could limit the maximum value of the jackpot. The contract could also check that a player has sufficient funds before allowing them to withdraw any amount.*

### Modifying and writing PoC for the `OverUnderFlowVul.sol` contract

Here is a modified version of the lottery contract in the `OverUnderFlowVul.sol` file that includes checks to prevent overflow and underflow vulnerabilities:

```solidity solidity
// SPDX-License-Identifier: MIT
pragma solidity 0.8.0;

contract Lottery {
    address public owner;
    uint256 public jackpot;
    mapping(address => uint256) public balances;
    address[] public players;
    bool public isEnded;

    constructor() {
        owner = msg.sender;
        jackpot = 0;
        isEnded = false;
    }

    function buyTicket() public payable {
        require(!isEnded, "Lottery has ended");
        require(msg.value == 1 ether, "Please send 1 ether to buy a ticket");
        players.push(msg.sender);
        require(balances[msg.sender] + msg.value > balances[msg.sender], "Balance overflow");
        balances[msg.sender] += msg.value;
        require(jackpot + msg.value > jackpot, "Jackpot overflow");
        jackpot += msg.value;
    }

    function endLottery() public {
        require(msg.sender == owner, "Only owner can end the lottery");
        require(!isEnded, "Lottery has already ended");
        isEnded = true;
        uint256 winnerIndex = uint256(blockhash(block.number - 1)) % players.length;
        address winner = players[winnerIndex];
        require(jackpot > 0, "No jackpot to award");
        balances[winner] += jackpot;
        require(balances[winner] >= jackpot, "Balance underflow");
        jackpot = 0;
    }
}
```

The following are the contract modifications:

* Two checks for the jackpot and balance overflow have been added to the `buyTicket` function to ensure that the jackpot and balance do not exceed the maximum value of a `uint256` variable.
The line `require(jackpot + msg.value > jackpot, "Jackpot overflow");` ensures that the new value of `jackpot` (after adding `msg.value`) is indeed greater than the current `jackpot` value. If it's not, this means an overflow has occurred, and the function will revert due to the `require` statement. The balance check follows the same principle.
* A wraparound check has been added to the `endLottery` function to ensure the winner's balance does not wrap around when the jackpot amount is added. The line `require(balances[winner] >= jackpot, "Balance underflow");` ensures that the new balance of the `winner` (after adding the `jackpot`) is indeed greater than or equal to the `jackpot` value. If it's not, the addition wrapped around the maximum `uint256` value, and the function will revert due to the `require` statement.

### Weak pseudo-random number generator warning

Now that we covered those vulnerabilities, let’s talk about the big one: the use of a weak pseudo-random number generator (PRNG) in the `endLottery()` function, as highlighted by Slither.

The line `uint256 winnerIndex = uint256(blockhash(block.number - 1)) % players.length;` is trying to generate a pseudo-random number to select a winner from the `players` array. However, Slither is warning you that this method of generating random numbers is not secure.

The `blockhash` function returns the hash of the given block number, and in this case, it's the hash of the previous block (`block.number - 1`). The issue is that block hash, block number, and other similar variables are public on the blockchain. This means that validators (or anyone else) who can see these variables could potentially manipulate the outcome to their advantage. In the context of a lottery, this could mean that a validator might be able to influence the result to ensure they win, which would be a major security vulnerability.
Therefore, it's generally recommended to use a more secure method for generating random numbers in a smart contract, which generally includes a service like [Chainlink VRF](/docs/smart-contracts-glossary#chainlink-vrf). Learn more about Chainlink VRF from our [Chainlink VRF Tutorial with Foundry](https://chainstack.com/using-chainlinks-vrf-with-foundry/).

The following is a **PoC** contract factory that deploys and interacts with the modified contract:

```solidity solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

import "./OverUnderFlowVul.sol";

contract LotteryPOC {
    // The contract declared in OverUnderFlowVul.sol is named Lottery.
    Lottery public lottery;

    constructor() {
        lottery = new Lottery();
    }

    function buyTickets(uint256 numTickets) public payable {
        for (uint256 i = 0; i < numTickets; i++) {
            lottery.buyTicket{value: 1 ether}();
        }
    }

    function endLottery() public {
        lottery.endLottery();
    }

    function withdraw() public {
        uint256 balance = lottery.balances(msg.sender);
        require(balance > 0, "No funds to withdraw");
        // Note: `balances` is a public mapping on the Lottery contract, so
        // its auto-generated getter is read-only from here; resetting the
        // balance would require a dedicated function on Lottery itself.
        payable(msg.sender).transfer(balance);
    }
}
```

The contract is a proof of concept that deploys an instance of the lottery contract. The contract provides three methods to the user: `buyTickets`, `endLottery`, and `withdraw`. The `buyTickets` method allows the user to buy a specified number of tickets, and the `endLottery` method allows the owner of the lottery contract to end the lottery and select a winner. Finally, the `withdraw` method allows players to withdraw their winnings.

These changes were made to prevent vulnerabilities related to overflow and underflow in the lottery contract. By making these changes, the lottery contract is now more secure and fair for all players.

### Almost-replay vulnerability

Now, let's go over `ReplayVul.sol`, which is a simple NFT marketplace contract.
```solidity Solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

contract SimpleNFTMarketplace {
    mapping(address => uint256) public balances;
    mapping(uint256 => address) public tokenOwners;

    function buyToken(uint256 _tokenId, uint256 _price) public {
        require(tokenOwners[_tokenId] != address(0), "Token does not exist");
        require(balances[msg.sender] >= _price, "Insufficient balance");

        balances[msg.sender] -= _price;
        balances[tokenOwners[_tokenId]] += _price;
        tokenOwners[_tokenId] = msg.sender;
    }
}
```

This contract has a vulnerability that allows a user to manipulate the ownership of a token by calling the `buyToken` function. Specifically, if an attacker has already obtained ownership of a token, they can call `buyToken` with the same `_tokenId` parameter and a very low `_price` parameter. Since the attacker already owns the token, the `tokenOwners[_tokenId] != address(0)` check passes, and the `require(balances[msg.sender] >= _price)` check also passes, because the attacker can set the price to a very low value.

As a result, the attacker's balance is decreased by the `_price` amount, while the previous owner's balance is increased by the same amount, and the ownership of the token is transferred to the attacker. The attacker can repeat this multiple times to keep acquiring ownership of the same token at a very low cost.

This vulnerability is not a replay attack in the traditional sense, as it doesn't involve replaying transactions on different networks or with different nonces. Instead, it's a form of attack where a token owner can "buy" their own token at any price, potentially draining the contract's balance.

One way to mitigate this is to introduce a nonce parameter in the `buyToken` function that must be incremented every time the function is called. This would ensure that each transaction is unique, but it doesn't directly address the issue of a token owner buying their own token.
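To make the passing checks concrete, here is a hypothetical Python model of the `buyToken` bookkeeping using plain dictionaries; the names and balances are illustrative, not part of the contract:

```python
balances = {"attacker": 100, "seller": 50}
token_owners = {1: "attacker"}  # the attacker already owns token 1

def buy_token(sender: str, token_id: int, price: int) -> None:
    # Mirrors the two require() checks in the vulnerable contract
    assert token_owners.get(token_id) is not None, "Token does not exist"
    assert balances[sender] >= price, "Insufficient balance"
    previous_owner = token_owners[token_id]
    balances[sender] -= price
    balances[previous_owner] += price
    token_owners[token_id] = sender

# The owner "buys" their own token for 0: both checks pass,
# and nothing stops them from doing this over and over.
buy_token("attacker", 1, 0)
assert token_owners[1] == "attacker"
assert balances["attacker"] == 100  # a self-transfer costs the owner nothing
```

An owner check in `buy_token` would reject exactly this call, which is what the fixed contract does.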
To address this issue, a more direct solution is to add a `require` statement to the `buyToken` function that checks that `msg.sender` is not the current owner of the token. This prevents the token owner from buying their own token, thus mitigating the vulnerability. As always, it's crucial to thoroughly test and audit your contract before deploying it to a live network.

Here's the updated contract with the owner check implemented:

```solidity Solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

contract SimpleNFTMarketplace {
    mapping(address => uint256) public balances;
    mapping(uint256 => address) public tokenOwners;

    function buyToken(uint256 _tokenId, uint256 _price) public {
        require(tokenOwners[_tokenId] != address(0), "Token does not exist");
        require(tokenOwners[_tokenId] != msg.sender, "Token already owned by buyer");
        require(balances[msg.sender] >= _price, "Insufficient balance");

        balances[msg.sender] -= _price;
        balances[tokenOwners[_tokenId]] += _price;
        tokenOwners[_tokenId] = msg.sender;
    }
}
```

In this updated contract, the line `require(tokenOwners[_tokenId] != msg.sender, "Token already owned by buyer");` ensures that the buyer is not already the owner of the token. If they are, the function reverts due to the `require` statement. This prevents a token owner from buying their own token, which addresses the vulnerability in the original contract.

### Reentrancy attacks

Now, let's go over the reentrancy issue spotted by Slither in the `ReentrancyExample.sol` contract.
The vulnerability identified by Slither was `Reentrancy in ReentrancyExample.withdraw(uint256) (src/ReentrancyExample.sol#11-16)`.

```solidity Solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

contract ReentrancyExample {
    mapping(address => uint) balances;

    function deposit() public payable {
        balances[msg.sender] += msg.value;
    }

    function withdraw(uint amount) public {
        require(balances[msg.sender] >= amount, "Insufficient balance.");
        (bool success, ) = msg.sender.call{value: amount}("");
        require(success, "Transfer failed.");
        balances[msg.sender] -= amount;
    }

    function getBalance() public view returns (uint) {
        return balances[msg.sender];
    }
}
```

The Slither report indicates that there is a reentrancy vulnerability in the `withdraw` function of the `ReentrancyExample` contract. Remember, reentrancy is a type of attack where an attacker exploits a contract's code to execute a function multiple times before the previous execution has completed. In this contract, an attacker could call the `withdraw` function repeatedly before the `balances[msg.sender] -= amount;` line executes, allowing them to repeatedly withdraw funds from their balance and drain the contract's balance.

Here's the potential attack scenario:

1. An attacker deposits some ether into your contract, thereby establishing a non-zero balance.
2. The attacker calls the `withdraw` function, and your contract starts executing it.
3. During the `withdraw` call, before the balance is deducted, the attacker's contract fallback function is triggered by the `call` function.
4. Within the fallback function, the attacker again calls the `withdraw` function.
5. Since the contract's state has not yet been updated (the balance deduction happens after the `msg.sender.call`), the contract still sees the balance as not withdrawn, so it processes the nested `withdraw` call and sends the funds to the attacker.
This can be repeated until the contract's balance is drained.

To fix this vulnerability, the contract should follow the "checks-effects-interactions" pattern: first check all of the preconditions for executing a function, then update the contract's state, and only then interact with external contracts or send ether. A mutex (reentrancy lock) adds a further layer of protection.

Note that this contract is an example only, and other improvements can still be made. For example, the `withdraw` function could be `external` instead of `public`, and the mappings could be private. Even with this example, Slither will still give a reentrancy warning despite the countermeasures; static analysis tools such as Slither analyze code in a systematic, rule-based way, looking for specific patterns that may suggest potential vulnerabilities.

Here is an updated version of the contract that uses the "checks-effects-interactions" pattern:

```solidity Solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

contract ReentrancyExample {
    mapping(address => uint) balances;
    mapping(address => bool) locked;

    function deposit() public payable {
        balances[msg.sender] += msg.value;
    }

    function withdraw(uint amount) public {
        require(balances[msg.sender] >= amount, "Insufficient balance.");
        require(!locked[msg.sender], "Reentrancy detected.");
        locked[msg.sender] = true;
        balances[msg.sender] -= amount;
        (bool success, ) = msg.sender.call{value: amount}("");
        require(success, "Transfer failed.");
        locked[msg.sender] = false;
    }

    function getBalance() public view returns (uint) {
        return balances[msg.sender];
    }
}
```

In this version of the contract, we have added a new `locked` mapping to keep track of which accounts are currently executing the `withdraw` function. Before updating the contract's state or interacting with external contracts, we check if the account is already locked. If it is, we revert the transaction to prevent reentrancy.
If it is not locked, we set the `locked` flag to `true`, update the balance, perform the transfer, and then set the flag back to `false` to release the lock. To demonstrate the vulnerability, an attacker could create a contract that repeatedly calls the `withdraw` function of the `ReentrancyExample` contract before the previous call has completed.

Be careful and consider all the edge cases when using this kind of locking mechanism. In the contract above, a failed external call makes `require(success, "Transfer failed.");` revert the whole transaction, so the lock state is rolled back along with everything else. The permanent-lock risk appears if you handle the failure without reverting, for example, by ignoring the `success` flag or by catching the error in a try-catch block; in that case, make sure every code path, including the failure branch, releases the lock.

Here is a simple PoC contract that demonstrates the attack:

```solidity Solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

import "./ReentrancyVul.sol";

contract ReentrancyAttack {
    ReentrancyExample public target;
    uint public count;

    constructor(address _target) {
        target = ReentrancyExample(_target);
    }

    function attack() public payable {
        // Fund the attacker's balance first so the withdraw check passes.
        target.deposit{value: msg.value}();
        count++;
        if (count < 10) {
            target.withdraw(1 ether);
        }
    }

    receive() external payable {
        // Re-enter withdraw, bounding the recursion with the counter.
        count++;
        if (count < 10) {
            target.withdraw(1 ether);
        }
    }
}
```

This contract sets the `target` to the address of the `ReentrancyExample` contract. The `attack` function funds the attacker's balance and triggers the first `withdraw` call. The `receive` function is a fallback that is called when the contract receives ether; it calls `withdraw` again before the previous call has completed, allowing the attacker to repeatedly withdraw funds from the `ReentrancyExample` contract.
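Off-chain, the same control flow can be modeled to see why the ordering matters. This is a hypothetical Python sketch, not the contract itself: the vault pays out (the "interaction") before updating its books (the "effect"), and the attacker's callback plays the role of the Solidity fallback:

```python
class VulnerableVault:
    def __init__(self):
        self.balances = {}
        self.total = 0  # ether actually held by the vault

    def deposit(self, who, amount):
        self.balances[who] = self.balances.get(who, 0) + amount
        self.total += amount

    def withdraw(self, who, amount, on_receive):
        # Check...
        assert self.balances[who] >= amount, "Insufficient balance."
        # ...interaction BEFORE effect: the "external call" happens here
        self.total -= amount
        on_receive(amount)              # attacker's fallback runs now
        # ...the effect happens too late
        self.balances[who] -= amount

vault = VulnerableVault()
vault.deposit("victim", 9)
vault.deposit("attacker", 1)

stolen = []
def fallback(amount):
    stolen.append(amount)
    if vault.total > 0:                 # re-enter until the vault is empty
        vault.withdraw("attacker", 1, fallback)

vault.withdraw("attacker", 1, fallback)
assert sum(stolen) == 10                # deposited 1, pulled out 10
```

The checks-effects-interactions fix is simply moving `self.balances[who] -= amount` before the callback: the nested call then fails the balance check and the drain stops.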
To protect against this attack, we can deploy the updated `ReentrancyExample` contract with the mutex protection described earlier, or use the [reentrancy guard contract from OpenZeppelin](https://github.com/OpenZeppelin/openzeppelin-contracts/blob/master/contracts/utils/ReentrancyGuard.sol).

## Conclusion

In conclusion, smart contract auditing is crucial to ensure the safety of assets managed by smart contracts on the blockchain. Auditing can be done manually or with automated tools; in this article, the focus was on manual auditing with tools like Slither and Foundry. The need for smart contract auditing is driven by the immutable nature of smart contracts, which makes it difficult to fix security vulnerabilities once they are deployed. The auditing process involves examining the code line by line and identifying potential security issues, which are then documented and reported on.

I hope this article has provided valuable insights into the importance of smart contract auditing and how to do it effectively. Thank you for reading.

### About the author

Junior blockchain developer
Enjoy working with React, Next.js, Solidity, Jamstack
[](https://github.com/natachigram) [](https://twitter.com/natachijs) [](https://www.linkedin.com/in/natachijs/)

# Introduction

Source: https://docs.chainstack.com/docs/ipfs-storage-introduction

IPFS Storage is the truly decentralized storage solution implemented on the Chainstack platform. It is a combination of [IPFS](https://ipfs.tech/) and [Storj](https://www.storj.io/).

### This is a closed beta

If you want to participate, contact *[support@chainstack.com](mailto:support@chainstack.com)*.

## Why IPFS + Storj?

IPFS is a peer-to-peer file-sharing protocol that is [content-addressed](https://docs.ipfs.tech/concepts/content-addressing/): each file is uniquely identified based on its cryptographic hash.
InterPlanetary File System (IPFS) is a distributed protocol that enables peer-to-peer sharing and storage of files on a global scale without depending on a centralized server. IPFS implements content-addressed storage that identifies files with a hash of their content, ensuring a unique identifier for each file. This method prevents file duplication and enhances network efficiency. IPFS also supports faster and more efficient file distribution, as files are cached and distributed across multiple nodes, which reduces latency and improves download speeds.

Storj is a decentralized cloud object storage network. Unlike other providers, Chainstack uses Storj as a backend storage solution instead of keeping users' files on local disks. Storj encrypts each object and then splits it into 80 pieces, each of which is distributed to a different node across the Storj network. Retrieving a file requires only 29 of those 80 pieces.

While IPFS is focused on the efficient distribution of content through content-addressing, Storj provides out-of-the-box redundant storage of files across the global network. The two technologies complement each other: by using IPFS to distribute content and Storj for reliable storage, developers benefit from the strengths of both systems, resulting in a robust, truly decentralized file storage solution.

## What is pinning?

In IPFS, files can be stored with or without pinning. A *file that is not pinned* is not automatically distributed between nodes in the network; instead, it is stored on the node that added it. It will be hosted there only as long as other nodes in the network interact with it. If the file stays idle, it can eventually be removed from the node. On the other hand, *pinned* files are marked to be preserved and available all the time, until the owner decides to delete them.

Each file uploaded to the IPFS Storage implementation is *automatically pinned*, so the file is available until you delete it.

## What is CID?
A content identifier (CID) is a distinct identifier used in IPFS to represent files or folders uniquely. It is generated by a hash function applied to the content of a file or other data object. CIDs enable the retrieval of content from any node in the IPFS network, facilitate versioning of objects, and are crucial in building distributed applications that can share and access data in a peer-to-peer fashion, independent of traditional addressing.

## What are buckets, folders, and catalogs?

A bucket is a top-level virtual collection of files or folders that can be stored and managed as a single entity. You can use buckets to organize and group related data objects, making them easier to manage and share. Buckets are useful for organizing large datasets: by grouping related files or data objects into buckets, users can manage and distribute these datasets more efficiently and effectively. Dedicated private gateways in IPFS Storage are associated with buckets, not folders.

A catalog is a second-level virtual collection of data objects that you can create within buckets. Catalogs are immutable as they get pinned in the IPFS network and receive their own CID. Catalogs can contain other folders, files, and catalogs.

A folder is a second-level virtual collection of data objects that you can create within buckets. Folders are mutable, they don't get pinned in the IPFS network, and have no CID (instead, each file within a folder gets its own CID). Folders can contain other folders, files, and catalogs.

## What are the different types of gateways?

In IPFS, gateways are used to browse and fetch files and data objects stored on the IPFS network without special software or knowledge of the underlying protocols. With IPFS Storage, Chainstack provides two types of gateways:

* Public gateways — are publicly accessible and can be used by anyone to retrieve and serve content from the IPFS network.
This can cause downtime or connectivity issues which can make it difficult or even impossible to access content. Additionally, a public gateway has rate and speed limitations. * Dedicated gateways — provide more reliable and secure access to content. IPFS Storage dedicated gateways are not rate- or speed-limited which makes them a perfect solution for storing sensitive and/or large-scale datasets that must be accessible all the time. IPFS Storage dedicated gateways can be created in one of the following modes: * Private — a dedicated private gateway must be assigned to a single bucket created within an organization. A dedicated private gateway only serves for content that exists in the bucket it’s assigned to. * Open — a dedicated open gateway serves for any content uploaded by an organization. # Kaia methods Source: https://docs.chainstack.com/docs/kaia-methods | Method | Availability | Comment | | -------------------------------------------- | --------------------------------------------- | ------- | | eth\_accounts | | | | eth\_blockNumber | | | | eth\_call | | | | eth\_chainId | | | | eth\_estimateGas | | | | eth\_feeHistory | | | | eth\_gasPrice | | | | eth\_getAccount | | | | eth\_getBalance | | | | eth\_getBlockByHash | | | | eth\_getBlockByNumber | | | | eth\_getBlockReceipts | | | | eth\_getBlockTransactionCountByHash | | | | eth\_getBlockTransactionCountByNumber | | | | eth\_getCode | | | | eth\_getFilterChanges | | | | eth\_getFilterLogs | | | | eth\_getLogs | | | | eth\_getProof | | | | eth\_getStorageAt | | | | eth\_getTransactionByBlockHashAndIndex | | | | eth\_getTransactionByBlockNumberAndIndex | | | | eth\_getTransactionByHash | | | | eth\_getTransactionCount | | | | eth\_getTransactionReceipt | | | | eth\_getUncleCountByBlockHash | | | | eth\_getUncleCountByBlockNumber | | | | eth\_maxPriorityFeePerGas | | | | eth\_newBlockFilter | | | | eth\_newFilter | | | | eth\_newPendingTransactionFilter | | | | eth\_signTransaction | | | | eth\_subscribe | | 
| | eth\_syncing | | | | eth\_uninstallFilter | | | | eth\_unsubscribe | | | | eth\_sendRawTransaction | | | | net\_listening | | | | net\_peerCount | | | | net\_version | | | | web3\_clientVersion | | | | web3\_sha3 | | | | kaia\_accountCreated | | | | kaia\_accounts | | | | kaia\_blockNumber | | | | kaia\_call | | | | kaia\_chainId | | | | kaia\_clientVersion | | | | kaia\_decodeAccountKey | | | | kaia\_encodeAccountKey | | | | kaia\_estimateComputationCost | | | | kaia\_estimateGas | | | | kaia\_feeHistory | | | | kaia\_gasPrice | | | | kaia\_getAccount | | | | kaia\_getAccountKey | | | | kaia\_getBalance | | | | kaia\_getBlockByHash | | | | kaia\_getBlockByNumber | | | | kaia\_getBlockTransactionCountByHash | | | | kaia\_getBlockTransactionCountByNumber | | | | kaia\_getBlockWithConsensusInfoByHash | | | | kaia\_getBlockWithConsensusInfoByNumber | | | | kaia\_getBlockWithConsensusInfoByNumberRange | | | | kaia\_getChainConfig | | | | kaia\_getCode | | | | kaia\_getCommittee | | | | kaia\_getCommitteeSize | | | | kaia\_getCouncil | | | | kaia\_getCouncilSize | | | | kaia\_getFilterChanges | | | | kaia\_getHeaderByHash | | | | kaia\_getHeaderByNumber | | | | kaia\_getProof | | | | kaia\_getRewards | | | | kaia\_getStakingInfo | | | | kaia\_getStorageAt | | | | kaia\_getTransactionByBlockHashAndIndex | | | | kaia\_getTransactionByBlockNumberAndIndex | | | | kaia\_getTransactionByHash | | | | kaia\_getTransactionBySenderTxHash | | | | kaia\_getTransactionCount | | | | kaia\_getTransactionReceipt | | | | kaia\_getTransactionReceiptBySenderTxHash | | | | kaia\_isContractAccount | | | | kaia\_isSenderTxHashIndexingEnabled | | | | kaia\_lowerBoundGasPrice | | | | kaia\_maxPriorityFeePerGas | | | | kaia\_newBlockFilter | | | | kaia\_newFilter | | | | kaia\_newPendingTransactionFilter | | | | kaia\_nodeAddress | | | | kaia\_pendingTransactions | | | | kaia\_protocolVersion | | | | kaia\_recoverFromMessage | | | | kaia\_sendRawTransaction | | | | kaia\_signTransaction | | 
| | kaia\_subscribe | | | | kaia\_syncing | | | | kaia\_uninstallFilter | | | | kaia\_unsubscribe | | | | kaia\_upperBoundGasPrice | | | | kaia\_sha3 | | | | klay\_accountCreated | | | | klay\_accounts | | | | klay\_blockNumber | | | | klay\_call | | | | klay\_chainId | | | | klay\_clientVersion | | | | klay\_decodeAccountKey | | | | klay\_encodeAccountKey | | | | klay\_estimateComputationCost | | | | klay\_estimateGas | | | | klay\_feeHistory | | | | klay\_gasPrice | | | | klay\_getAccount | | | | klay\_getAccountKey | | | | klay\_getBalance | | | | klay\_getBlockByHash | | | | klay\_getBlockByNumber | | | | klay\_getBlockTransactionCountByHash | | | | klay\_getBlockTransactionCountByNumber | | | | klay\_getBlockWithConsensusInfoByHash | | | | klay\_getBlockWithConsensusInfoByNumber | | | | klay\_getBlockWithConsensusInfoByNumberRange | | | | klay\_getChainConfig | | | | klay\_getCode | | | | klay\_getCommittee | | | | klay\_getCommitteeSize | | | | klay\_getCouncil | | | | klay\_getCouncilSize | | | | klay\_getFilterChanges | | | | klay\_getHeaderByHash | | | | klay\_getHeaderByNumber | | | | klay\_getProof | | | | klay\_getRewards | | | | klay\_getStakingInfo | | | | klay\_getStorageAt | | | | klay\_getTransactionByBlockHashAndIndex | | | | klay\_getTransactionByBlockNumberAndIndex | | | | klay\_getTransactionByHash | | | | klay\_getTransactionBySenderTxHash | | | | klay\_getTransactionCount | | | | klay\_getTransactionReceipt | | | | klay\_getTransactionReceiptBySenderTxHash | | | | klay\_isContractAccount | | | | klay\_isSenderTxHashIndexingEnabled | | | | klay\_lowerBoundGasPrice | | | | klay\_maxPriorityFeePerGas | | | | klay\_newBlockFilter | | | | klay\_newFilter | | | | klay\_newPendingTransactionFilter | | | | klay\_nodeAddress | | | | klay\_pendingTransactions | | | | klay\_protocolVersion | | | | klay\_recoverFromMessage | | | | klay\_sendRawTransaction | | | | klay\_signTransaction | | | | klay\_subscribe | | | | klay\_syncing | | | | 
klay\_uninstallFilter | | | | klay\_unsubscribe | | | | klay\_upperBoundGasPrice | | | | klay\_sha3 | | | | txpool\_content | | | | txpool\_inspect | | | | txpool\_contentFrom | | | | txpool\_status | | | | debug\_getBadBlocks | | | | debug\_storageRangeAt | | | | debug\_getTrieFlushInterval | | | | debug\_traceBlock | | | | debug\_traceBlockByHash | | | | debug\_traceBlockByNumber | | | | debug\_traceCall | | | | debug\_traceTransaction | | | | admin\_addPeer | | | | admin\_addTrustedPeer | | | | admin\_datadir | | | | admin\_exportChain | | | | admin\_importChain | | | | admin\_nodeInfo | | | | admin\_peerEvents | | | | admin\_peers | | | | admin\_removePeer | | | | admin\_removeTrustedPeer | | | | admin\_startHTTP | | | | admin\_startWS | | | | admin\_stopHTTP | | | | admin\_stopWS | | | # Kaia (ex. Klaytn): Contract Sizzle 100 Source: https://docs.chainstack.com/docs/klaytn-contract-sizzle-100 **TLDR** * This tutorial shows how to scan recent blocks on Klaytn Mainnet to find the top 3 most interacted-with contracts. * It uses web3.py, checks if addresses are contracts by retrieving bytecode, and caches results for efficiency. * A heap is used to store and retrieve the top 3 busiest contracts, and the script supports multithreading. * Can be adapted for analytics, alerts, or bot flows leveraging real-time contract interaction data. ## Main article Klaytn is an EVM-compatible protocol with a few modifications. Check the Klaytn docs for a full break-down on the compatibility: [Ethereum Compatibility](https://docs.klaytn.foundation/docs/learn/transactions/ethereum/). In this tutorial, we'll build a quick Python project called Contract Sizzle 100. Contract Sizzle 100 prints the top 3 hottest contracts on the Klaytn Mainnet (aka Cypress Network) over the past 100 blocks. 
The way it works is very simple: the script ingests each new incoming block from the Klaytn Mainnet, extracts all contract interactions, counts them, and prints the top 3 contracts with the most interactions.

## Prerequisites

* [Chainstack account](https://console.chainstack.com/) to deploy a Klaytn Mainnet node
* [web3.py](https://web3py.readthedocs.io/)

## Step-by-step

### Get a Klaytn node

Log in to your [Chainstack account](https://console.chainstack.com/) and deploy a node.

### Create the script

A few details on the implementation. How do you actually identify the contract addresses? One easy way is to:

1. Retrieve each block from the network with all transactions in the block.
2. Extract the `to:` address.
3. Check if the extracted address is a contract by doing an `eth_getCode` call to the address.

This process in Python, however, can be slow, so let's optimize it a bit:

* After doing an `eth_getCode` call to an extracted address, cache the result so that if we encounter this address again, we don't call `eth_getCode` for it, as we already know whether it is a contract address.
* Multithread the script by using our tutorial [Mastering multithreading in Python for Web3 requests: A comprehensive guide](/docs/mastering-multithreading-in-python-for-web3-requests-a-comprehensive-guide).
Here's the final script:

```python Python
from web3 import Web3
from collections import defaultdict
import heapq
from concurrent.futures import ThreadPoolExecutor

# Connect to a Klaytn node
w3 = Web3(Web3.HTTPProvider('CHAINSTACK_NODE'))

# Data structure to hold the count of interactions per contract
contract_interactions = defaultdict(int)

# Priority queue to maintain top 3 contracts
top_contracts = []

# Cache for storing contract check results
is_contract_cache = {}

def is_contract(address):
    if address not in is_contract_cache:
        code = w3.eth.get_code(address)
        # get_code returns bytes; an externally owned account has empty code
        is_contract_cache[address] = len(code) > 0
    return is_contract_cache[address]

def process_block(block_number):
    print(f"Processing block {block_number}")
    block = w3.eth.get_block(block_number, full_transactions=True)
    for tx in block.transactions:
        if tx.to and is_contract(tx.to):
            contract_interactions[tx.to] += 1

# Main loop to process 100 blocks
def main():
    latest_block = w3.eth.block_number
    print("Starting to process blocks...")
    with ThreadPoolExecutor(max_workers=10) as executor:
        executor.map(process_block, range(latest_block - 100, latest_block))

    # Identify top 3 contracts
    for contract, interactions in contract_interactions.items():
        heapq.heappush(top_contracts, (interactions, contract))
        if len(top_contracts) > 3:
            heapq.heappop(top_contracts)

    # Print top 3 contracts
    print("Top contracts:", top_contracts)
    while top_contracts:
        interactions, contract = heapq.heappop(top_contracts)
        print(f'Contract {contract} had {interactions} interactions')

if __name__ == '__main__':
    try:
        main()
    except Exception as e:
        print(f"An error occurred: {e}")
```

where

* CHAINSTACK\_NODE — your Klaytn node deployed with Chainstack
* `max_workers=10` — set to however many parallel threads you feel is reasonable. Make sure you don't hit the limits: [Limits](/docs/limits).

## Conclusion

This tutorial guided you through creating a basic setup to live-track the hottest contracts on the Klaytn network.
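As a side note on the caching design: the hand-rolled `is_contract_cache` dictionary could also be expressed with `functools.lru_cache`, which memoizes a function by its arguments. A standalone sketch of the idea with a stubbed `get_code` (the stub and the addresses are made up; a real script would query the node):

```python
from functools import lru_cache

calls = []

def get_code(address):
    # Stub standing in for w3.eth.get_code(); records each "node" hit.
    calls.append(address)
    return b"\x60\x80" if address == "0xContract" else b""

@lru_cache(maxsize=None)
def is_contract(address):
    # Empty bytecode means an externally owned account, not a contract.
    return len(get_code(address)) > 0

assert is_contract("0xContract") is True
assert is_contract("0xContract") is True   # served from the cache
assert len(calls) == 1                     # get_code was hit only once
assert is_contract("0xWallet") is False
```

The explicit dictionary in the script above does the same job while keeping the caching behavior visible, which can be easier to reason about in multithreaded code.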
There are many fun ways you can use it as a basis to build upon — from passing the data to a Twitter bot account to setting up your own alerts or bot flow.

### About the author

Director of Developer Experience @ Chainstack
Talk to me all things Web3
20 years in technology | 8+ years in Web3 full time
Trusted advisor helping developers navigate the complexities of blockchain infrastructure
[](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake)

# Kaia (ex. Klaytn) tooling

Source: https://docs.chainstack.com/docs/klaytn-tooling

## MetaMask

On [node access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**.

## caver-js

[caver-js](https://docs.klaytn.foundation/docs/references/sdk/caver-js/) is a JavaScript library enabling developers to interact with a Klaytn node over an HTTP connection. It supports all [Klaytn APIs](https://docs.klaytn.foundation/docs/references/sdk/caver-js/api/).

Install [caver-js](https://docs.klaytn.foundation/docs/references/sdk/caver-js/get-started/#installation-).

Connect over HTTP:

```javascript Javascript
const Caver = require("caver-js");
const caver = new Caver("YOUR_CHAINSTACK_ENDPOINT");

async function getData() {
    const blockNumber = await caver.rpc.klay.getBlockNumber();
    console.log(blockNumber);
}

getData();
```

## web3.js

Build DApps using [web3.js](https://github.com/web3/web3.js) and Klaytn nodes deployed with Chainstack.

Install [web3.js](https://web3js.readthedocs.io/).

Connect over HTTP.

### HTTP

Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number:

```javascript Javascript
const {Web3} = require('web3');

const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT'));

web3.eth.getBlockNumber().then(console.log);
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password.
## web3.py

Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Klaytn nodes deployed with Chainstack.

Install [web3.py](https://web3py.readthedocs.io/).

Connect over HTTP. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket).

### HTTP

Use the `HTTPProvider` to connect to your node endpoint and get the latest block number.

```python Key Protected
from web3 import Web3

web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT'))
print(web3.eth.block_number)
```

```python Password Protected
from web3 import Web3

web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s' % ("USERNAME", "PASSWORD", "HOSTNAME")))
print(web3.eth.block_number)
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password
* HOSTNAME — your node HTTPS endpoint hostname
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).

## Ethers.js Extension for Kaia

See [Ethers.js Extension for Kaia](https://docs.kaia.io/references/sdk/ethers-ext/getting-started/).

## Hardhat

Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your Klaytn nodes.

Install [Hardhat](https://hardhat.org/) and create a project.

Create a new environment in `hardhat.config.js`:

```javascript Javascript
require("@nomiclabs/hardhat-waffle");
...
module.exports = {
  solidity: "0.7.3",
  networks: {
    chainstack: {
      url: "YOUR_CHAINSTACK_ENDPOINT",
      accounts: ["YOUR_PRIVATE_KEY"]
    },
  }
};
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials).
* YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract

Run `npx hardhat run scripts/deploy.js --network chainstack` and Hardhat will deploy using Chainstack.

See also [Forking EVM-compatible mainnet with Hardhat](https://support.chainstack.com/hc/en-us/articles/900004242406).

## Remix IDE

To make Remix IDE interact with the network through a Chainstack node:

1. Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Interacting through MetaMask](#metamask).
2. In Remix IDE, navigate to the **Deploy** tab. Select **Injected Provider - MetaMask** in **Environment**.

This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node.

## web3.php

Build DApps using [web3.php](https://github.com/web3p/web3.php) and Klaytn nodes deployed with Chainstack.

Install [web3.php](https://github.com/web3p/web3.php).

Connect over HTTP:

```php Php
<?php
require_once "vendor/autoload.php";

use Web3\Web3;

$web3 = new Web3("YOUR_CHAINSTACK_ENDPOINT");
?>
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password.

Use [JSON-RPC methods](https://eth.wiki/json-rpc/API) to interact with the node.

Example to get the latest block number:

```php Php
<?php
require_once "vendor/autoload.php";

use Web3\Web3;

$web3 = new Web3("YOUR_CHAINSTACK_ENDPOINT");
$eth = $web3->eth;

$eth->blockNumber(function ($err, $data) {
    print "$data \n";
});
?>
```

## Foundry

Install [Foundry](https://github.com/gakonst/foundry/).

Use `--rpc-url` to run the operation through your Chainstack node.

### Forge

Use `forge` to develop, test, and deploy your smart contracts.
To deploy a contract:

```shell Shell
forge create CONTRACT_NAME --contracts CONTRACT_PATH --private-key YOUR_PRIVATE_KEY --rpc-url YOUR_CHAINSTACK_ENDPOINT
```

where

* CONTRACT\_NAME — name of the contract in the Solidity source code
* CONTRACT\_PATH — path to your smart contract
* YOUR\_PRIVATE\_KEY — the private key to your funded account that you will use to deploy the contract
* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password

### Cast

Use `cast` to interact with the network and the deployed contracts.

To get the latest block number:

```shell Shell
cast block-number --rpc-url YOUR_CHAINSTACK_ENDPOINT
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password

# Throughput guidelines

Source: https://docs.chainstack.com/docs/limits

## Rate limits

### Solana-specific

* Solana Mainnet:
  * Developer plan: 5 requests per second (RPS)
  * Growth plan: 50 requests per second (RPS)
* Solana Devnet:
  * Developer plan: 25 requests per second (RPS)
  * Growth plan: 250 requests per second (RPS)

### Arbitrum-specific

* Arbitrum Mainnet: `debug_traceBlockByNumber` 20 RPS on all plans

### All other protocols

* Developer plan: 25 requests per second (RPS)
* Growth plan: 250 requests per second (RPS)
* Pro plan: 400 requests per second (RPS)
* Business plan: 600 requests per second (RPS)
* Enterprise plan: unlimited

To upgrade a subscription plan, do the following:

1. In the left navigation bar, click **Billing**.
2. Next to **Plan**, click **Change**.
3. Select a new subscription plan and click **Next**.
4. Check the details of your new subscription plan and click **Confirm**.

Your subscription plan changes immediately.

### Dedicated nodes

You can also order [dedicated nodes](/docs/dedicated-node) and pay for the node resource consumption only instead of per-request billing.

## EVM range limits

For `eth_newFilter` requests, Developer subscription plan users are capped at 10,000 blocks per request.
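A common client-side pattern for working under per-request block caps is to split a long scan into cap-sized windows and issue one request per window. A minimal sketch; the block numbers and the 10,000-block cap are illustrative:

```python
def chunk_range(start_block, end_block, cap):
    """Split [start_block, end_block] into inclusive windows of at most `cap` blocks."""
    windows = []
    frm = start_block
    while frm <= end_block:
        to = min(frm + cap - 1, end_block)
        windows.append((frm, to))
        frm = to + 1
    return windows

# e.g. scanning 25,000 blocks under a 10,000-block cap -> 3 requests
windows = chunk_range(1_000_000, 1_024_999, 10_000)
assert windows == [
    (1_000_000, 1_009_999),
    (1_010_000, 1_019_999),
    (1_020_000, 1_024_999),
]
```

Each `(from, to)` pair then becomes one filter or log request against the node.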
For `eth_getLogs`, the caps are:

* Developer plan — 100 blocks
* Growth plan — 10,000 blocks
* Pro plan — 10,000 blocks
* Business plan — 10,000 blocks
* Enterprise — 10,000 blocks. Customization available on request.

Learn more about `eth_getLogs` limits by reading [Understanding eth\_getLogs limitations](/docs/understanding-eth-getlogs-limitations).

For users on the Developer subscription plan, Chainstack applies a specific range limit for certain requests. This limit is designed to optimize the performance and resource allocation for our users on this plan.

## Custom tracers on EVMs

Custom JavaScript tracers are available as customized solutions on the [Enterprise plan](https://chainstack.com/pricing/) on [dedicated nodes](/docs/dedicated-node).

## Ethereum `eth_simulateV1` supports only full node

Running [eth\_simulateV1 | Ethereum](/reference/ethereum_node_api/execute_transactions/eth_simulateV1) will yield only a full node response—i.e. the data from the latest 128 blocks. Archive data is not supported for this call and the node will respond with `missing trie node`.

## Fantom method limits

The following limits are applied on all subscription plans:

* `debug_traceBlockByNumber`: 5 RPS
* `debug_traceBlockByHash`: 5 RPS

## Solana method limits

The following limits are applied:

* `getBlocks`: 500,000 blocks range. This is the [Solana architecture limit](https://solana.com/docs/rpc/http/getblocks).
* `getBlocksWithLimit`: 500,000 blocks range. This is the [Solana architecture limit](https://solana.com/docs/rpc/http/getblockswithlimit).
* `getBlock`:
  * Chainstack Global Network Worldwide `global1` RPS: 100
  * Chainstack Cloud London `lon1` RPS: 100
  * Chainstack Cloud New York City `nyc1` RPS: 100
* `getBlockTime`:
  * Chainstack Global Network Worldwide `global1` RPS: 100
  * Chainstack Cloud London `lon1` RPS: 100
  * Chainstack Cloud New York City `nyc1` RPS: 100
* `getProgramAccounts`:
  * Chainstack Global Network Worldwide `global1` RPS: 3
  * Chainstack Cloud London `lon1` RPS: 10
  * Chainstack Cloud New York City `nyc1` RPS: 3
* `getConfirmedBlock`:
  * Chainstack Global Network Worldwide `global1` RPS: 30
  * Chainstack Cloud London `lon1` RPS: 30
  * Chainstack Cloud New York City `nyc1` RPS: 30
* `getSupply`:
  * Chainstack Global Network Worldwide `global1` RPS: 2
  * Chainstack Cloud London `lon1` RPS: 2
  * Chainstack Cloud New York City `nyc1` RPS: 2
* `getLargestAccounts`:
  * Chainstack Global Network Worldwide `global1` RPS: 6
  * Chainstack Cloud London `lon1` RPS: 6
  * Chainstack Cloud New York City `nyc1` RPS: 6
* `getTokenAccountsByOwner`:
  * Chainstack Global Network Worldwide `global1` RPS: 100
  * Chainstack Cloud London `lon1` RPS: 100
  * Chainstack Cloud New York City `nyc1` RPS: 100
* `getTokenSupply`:
  * Chainstack Global Network Worldwide `global1` RPS: 80
  * Chainstack Cloud London `lon1` RPS: 80
  * Chainstack Cloud New York City `nyc1` RPS: 80

## Solana method availability

The following methods are available only on the paid plans:

* `getProgramAccounts`
* `getLargestAccounts`
* `getSupply`
* `getTokenAccountsByOwner`

## Solana archive methods availability

While most methods are supported on Solana [global nodes](/docs/global-elastic-node), only the following methods can fetch archive data:

* `getSignaturesForAddress`
* `getTransaction`
* `getBlock`
* `getBlocks`
* `getBlockHeight`
* `getBlockTime`
* `getBlocksWithLimit`

# List your app on marketplace

Source: https://docs.chainstack.com/docs/list-your-app-on-marketplace

While we encourage app developers to submit a request to have their products listed on the
Chainstack Marketplace, we cannot accept apps that do not meet our minimum requirements. This guide is designed to familiarize you with these requirements and provide instructions on submitting a request.

## Listing requirements

We recommend that you read and consider the following minimum requirements before submitting a request.

## Security requirements

All Chainstack Marketplace apps should do the following:

* Encrypt sensitive customer data.
* Not require the user to provide their Chainstack password.
* Delete all Chainstack user data within 30 days of receiving a request from the user, or within 30 days of the end of the user's legal relationship with Chainstack.
* Store client ID and client secret keys securely.
* Encrypt data transferred over the public internet using HTTPS, with a valid TLS certificate.

## Integration requirements

Ensure that your app meets the following necessary technical integration requirements:

* Authorization — one of the following methods:
  * OAuth 2.0 integration — OAuth 2.0 is a protocol that lets external apps request authorization to private details in a user's account without accessing their password.
  * JSON web token integration — JWT is a digitally signed string of encoded information that contains user identity and authorization data, and it's commonly used for secure authentication in web applications. Learn more about JWT in the [Mastering JSON web tokens](/docs/tutorial-mastering-jwt-how-to-implement-secure-user-authentication) article.
* Webhooks — for privacy-compliance purposes, your app must be set up to use webhooks. OAuth 2.0 authorization is required to access webhooks.
* Scopes — your app must request only the scopes it requires. Before you apply, we recommend that you check that your scopes are not requesting unnecessary access.
* Notifications to customers — support for handling plan upgrades or downgrades and for notifying customers about them.
## Support requirements

We require the following information to ensure that customers are provided with ongoing quality support:

* A **one-pager** for the Chainstack support team, summarizing the key features of your app.
* An **SLA** for our support team (community, business processes from the app side, etc.).
* The app **support team contact details** and the **estimated response time**.
* **FAQs** of the app.
* An up-to-date list of any **known issues** of the app.

## Submit your request

Once you have read and are sure that you can meet the minimum requirements, you can submit the [Chainstack Marketplace request form](https://chainstack.com/marketplace/#submit-app). You must complete and accurately fill out the following fields of the form:

* **Name** — your full name.
* **Email** — your contact email address.
* **Product name** — the name of the product as it will be displayed on the Chainstack Marketplace.
* **Developer name and website URL** — the name of the product developer and a link to the website.
* **Description** — enter an informative, well-written description of your app.
* **Screenshots** — some screenshots of your app that we can include on the Chainstack Marketplace.
* **User documentation** — a URL to your up-to-date user documentation.
* **Category** — choose from Application, Service, or Developer tool.
* **Relevant protocols** — select all protocols from the list that are relevant to your app.
* **Integration instructions** — a URL to a step-by-step guide for integrating your app.
* **Terms of Service and Privacy Policy URLs** — links to your up-to-date terms of service and privacy policy.
* **Pricing plan** — details of at least one pricing plan for your app.
* **Support details** — the contact details of your main support channels and the estimated response time.

## What happens next?

When you complete the form, the Chainstack team will review your submission.
You will then receive an email notifying you if your request has been accepted or not. If accepted, you will be provided with details on how to proceed.

# Make your dApp more reliable with Chainstack

Source: https://docs.chainstack.com/docs/make-your-dapp-more-reliable-with-chainstack

**TLDR**

* Explains how Chainstack’s global node feature can boost your DApp’s reliability by balancing traffic automatically based on user location.
* Demonstrates a JavaScript load balancer script using multiple Chainstack endpoints, distributing requests across different regions to avoid single-point failures.
* Shows examples with both web3.js and ethers.js, detailing how to fail over to the next endpoint if one fails.
* Concludes that both global nodes and custom load-balancing approaches help ensure your blockchain app can handle high traffic and unexpected downtimes.

## Main article

Every developer wants the most reliable DApp. In this guide, we'll explore how to use multiple Chainstack nodes using load balancer logic to make your DApp more performant and reliable. Think of it as a well-coordinated team where the workload is evenly distributed, ensuring efficiency and eliminating any single point of failure.

Whether you're a seasoned developer or a newcomer to the blockchain scene, this step-by-step guide will provide practical knowledge to enhance your applications. So, let's get started and dive into the world of efficient blockchain application management.

This guide will show you how using a Chainstack global node can make your DApp more reliable and also how to use multiple Chainstack RPC nodes to create a load balancer in JavaScript.

## What is a load balancer

In simplest terms, a load balancer is a technology that distributes network or application traffic across multiple servers or nodes, in our case. Imagine you're at a busy intersection; a load balancer is the one directing traffic, ensuring that no single server gets overwhelmed with too many requests.
This way, it helps to optimize resource use, maximize throughput, minimize response time, and avoid system overload. In blockchain applications, a load balancer can help distribute incoming requests evenly across multiple nodes, ensuring your application runs smoothly and efficiently. So, it's a pretty handy tool in your blockchain toolkit.

## Global Nodes

Chainstack provides [Global Nodes](/docs/global-elastic-node), but how can they help you make your DApp more reliable, and how do they differ from [Trader Nodes](/docs/trader-node)?

[Trader Nodes](/docs/trader-node) represent traditional endpoints where users can select from various available locations. While this provides considerable flexibility, it also comes with certain limitations. For instance, there is less redundancy if the node encounters issues, and users may experience varying performance levels. This variation can occur if, for example, an application sends requests to the node from the client, which the user's location can influence.

On the other hand, global nodes function as load-balanced nodes that direct requests to the nearest available location for a specific protocol based on the caller's location. This design ensures efficient service access for users worldwide by routing requests optimally.

The main advantages of global nodes are the following:

* **Enhanced load balancing** — global nodes include a large load balancer that can switch nodes if one fails or lags by more than 40 blocks, thus ensuring uninterrupted service.
* **Reduced latency** — by distributing traffic to the nearest endpoint, the global node reduces latency, leading to faster transactions and improved user experience.
* **Global reach** — anyone from any location can access global nodes. These nodes direct users to the endpoints nearest to their location, maximizing service availability and responsiveness.
* **High availability** — global nodes are designed to be 99.95% available.
This ensures that your DApp continues to run with minimal interruptions.
* **Instant deployment** — unlike trader nodes, which take 3-6 minutes to deploy, the global node is ready in seconds. This leads to significant time savings.

Check modes and protocols available for [global nodes](/docs/global-elastic-node).

Opting for a global node is generally the preferred choice. However, what if you require a protocol or mode that isn't currently supported? For such cases, we can BUIDL a simple load-balancing script using JavaScript.

## JavaScript load balancer project

This project will be a simple implementation using node.js, with examples using `web3.js` and `ethers.js`. Therefore, it's essential to ensure a well-configured development environment before proceeding. Go over our node.js setup guide for Web3 projects, [Web3 node.js: From zero to a full-fledged project](/docs/web3-nodejs-from-zero-to-a-full-fledged-project), if you are starting from zero.

### Prerequisites

For this project, make sure you have the following:

* node.js V18
* web3.js
* ethers.js

Install the web3.js library with `npm i web3`. Install the ethers.js library with `npm i ethers`.

### The logic of the project

In this simple project, we are trying to create a basic load balancer; in this case, we'll use multiple Chainstack trader nodes and alternate them between requests to spread the load between them. It's important to remember that this is only a proof of concept and will require further optimization for deployment in a production environment.

Let's deploy three Ethereum trader nodes in three locations to set things up. Learn how to [deploy a node with Chainstack](/docs/manage-your-node). This configuration guarantees global coverage, and the use of various hosting providers adds an extra layer of redundancy. This is the power of the geo-distributed infrastructure provided by Chainstack.
Remember that deploying multiple nodes is available starting from a [paid plan](https://chainstack.com/pricing/).

### Coding the load balancer

Now that you have access to three RPC nodes, it's time to store them in a `.env` file located in your project's root directory. If you haven't already, install the `dotenv` package using the command `npm i dotenv`. This approach helps us manage sensitive information, preventing accidental pushes to a version control platform.

```Text .env
VIRGINIA_RPC="YOUR_VIRGINIA_CHAINSTACK_ENDPOINT"
LONDON_RPC="YOUR_LONDON_CHAINSTACK_ENDPOINT"
SINGAPORE_RPC="YOUR_SINGAPORE_CHAINSTACK_ENDPOINT"
```

After configuring the endpoints, create a new `index.js` file and insert the following code. For illustrative purposes, we're employing a fairly straightforward use case. The script executes the `eth_getBlockByNumber` method every 10 seconds to fetch details of the latest block. Notably, each request is served by a different endpoint.

### Web3.js example

```javascript web3.js
const {Web3} = require("web3");
require("dotenv").config();

// Initialize RPCs from environment variables
const RPC_NODES = {
  web3Virginia: process.env.VIRGINIA_RPC,
  web3London: process.env.LONDON_RPC,
  web3Singapore: process.env.SINGAPORE_RPC,
};

// Create Web3 instances for each RPC
const web3Instances = {};
for (const [key, url] of Object.entries(RPC_NODES)) {
  web3Instances[key] = new Web3(url);
}

// Array of keys to cycle through
const keys = Object.keys(web3Instances);

// Counter to keep track of the current Web3 instance
let counter = 0;

async function getBlock(blockNumber) {
  // Select the current Web3 instance
  const key = keys[counter];
  const web3 = web3Instances[key];
  console.log(`Using ${key} RPC`);

  // Start the timer
  console.time("getBlock");

  try {
    // Try to get the latest block
    const block = await web3.eth.getBlock(blockNumber, false);

    // Extract some fields to keep the response cleaner
    const blockSummary = {
      blockNumber: block.number,
      blockHash: block.hash,
      parentHash: block.parentHash,
      size: block.size,
    };
    console.log(blockSummary);
  } catch (error) {
    // Log the error
    console.error(`Error fetching block: ${error.message}`);

    // Increment the counter and reset it if it's larger than the array length
    counter = (counter + 1) % keys.length;

    // Retry the request on the next Web3 instance
    console.log("Retrying request on next RPC...");
    return getBlock(blockNumber);
  }

  // End the timer and log the time
  console.timeEnd("getBlock");

  // Increment the counter and reset it if it's larger than the array length
  counter = (counter + 1) % keys.length;
}

// Call getBlock every 10 seconds
console.log("Running load balanced script...");
setInterval(() => getBlock("latest"), 10000);
```

### Ethers.js example

```javascript ethers.js
const ethers = require('ethers');
require('dotenv').config();

// Initialize RPC nodes from environment variables
const RPC_NODES = {
  ethersVirginia: process.env.VIRGINIA_RPC,
  ethersLondon: process.env.LONDON_RPC,
  ethersSingapore: process.env.SINGAPORE_RPC,
};

// Create ethers providers for each RPC URL
const providers = {};
for (const [key, url] of Object.entries(RPC_NODES)) {
  providers[key] = new ethers.JsonRpcProvider(url);
}

// Keys to cycle through providers
const keys = Object.keys(providers);
let counter = 0; // Counter to keep track of the current provider

// Function to get the latest block using the current provider
const getBlock = async () => {
  const key = keys[counter];
  const provider = providers[key];
  console.log(`Using ${key} RPC`);
  console.time("getBlock"); // Start timing the operation

  try {
    // Fetch the latest block information
    const blockByNumber = await provider.send("eth_getBlockByNumber", ["latest", false]);

    // Extract some fields to keep the response cleaner
    const blockSummary = {
      blockNumber: blockByNumber.number,
      blockHash: blockByNumber.hash,
      parentHash: blockByNumber.parentHash,
      size: blockByNumber.size,
    };
    console.log(blockSummary);
  } catch (error) {
    // If there's an error, log it and move to the next provider
    console.error(`Error fetching block: ${error.message}`);
    counter = (counter + 1) % keys.length; // Increment and wrap the counter if necessary
    console.log("Retrying request on next RPC...");
    return getBlock(); // Retry with the next provider
  }

  console.timeEnd("getBlock"); // End timing the operation
  counter = (counter + 1) % keys.length; // Move to the next provider for the next call
};

// Start the process and call getBlock every 10 seconds
console.log("Running load-balanced script with ethers.js...");
setInterval(getBlock, 10000);
```

### Code breakdown

Let's go over what we are doing in the code step by step.

#### Initialize the Web3 instances

```javascript Javascript
// Initialize RPCs from environment variables
const RPC_NODES = {
  web3Virginia: process.env.VIRGINIA_RPC,
  web3London: process.env.LONDON_RPC,
  web3Singapore: process.env.SINGAPORE_RPC,
};

// Create Web3 instances for each RPC
const web3Instances = {};
for (const [key, url] of Object.entries(RPC_NODES)) {
  web3Instances[key] = new Web3(url);
}

// Array of keys to cycle through
const keys = Object.keys(web3Instances);

// Counter to keep track of the current Web3 instance
let counter = 0;
```

* **Initialize RPC URLs from environment variables**. The script sets up an object named `RPC_NODES` that maps the variables of the Web3 instances to their respective RPC URLs, which are fetched from environment variables. This is mostly done so we can print in the console which variable we are using each time we send a request.
* **Create Web3 instances for each RPC**. The script creates a new Web3 instance for each RPC URL and stores them in the `web3Instances` object. This object's keys are the instances' names, and the values are the instances themselves. This way, the instances are all created at once when the script is started; this logic also keeps the code more maintainable as you only need to edit the `RPC_NODES` object if you want to add or remove endpoints.
* **Set up an array of keys and a counter**. The `keys` array contains the keys of the `web3Instances` object, and the `counter` variable keeps track of the current Web3 instance.

#### Call the function to get the latest block details

```javascript Javascript
async function getBlock(blockNumber) {
  // Select the current Web3 instance
  const key = keys[counter];
  const web3 = web3Instances[key];
  console.log(`Using ${key} RPC`);

  // Start the timer
  console.time("getBlock");

  try {
    // Try to get the latest block
    const block = await web3.eth.getBlock(blockNumber, false);

    // Extract some fields to keep the response cleaner
    const blockSummary = {
      blockNumber: block.number,
      blockHash: block.hash,
      parentHash: block.parentHash,
      size: block.size,
    };
    console.log(blockSummary);
  } catch (error) {
    // Log the error
    console.error(`Error fetching block: ${error.message}`);

    // Increment the counter and reset it if it's larger than the array length
    counter = (counter + 1) % keys.length;

    // Retry the request on the next Web3 instance
    console.log("Retrying request on next RPC...");
    return getBlock(blockNumber);
  }

  // End the timer and log the time
  console.timeEnd("getBlock");

  // Increment the counter and reset it if it's larger than the array length
  counter = (counter + 1) % keys.length;
}
```

This asynchronous function fetches the latest block from the Ethereum blockchain using one of the Web3 instances. It selects the Web3 instance based on the counter, starts a timer, fetches the block, logs some information about the block, stops the timer and logs the elapsed time, and finally increments the counter (or resets it if it's larger than the number of Web3 instances).

This function also incorporates an error handling mechanism. Specifically, if an endpoint becomes unavailable, the script will not halt. Instead, it will promptly switch to the next RPC to continue making requests, increasing reliability.
The following part then calls the function every 10 seconds:

```javascript Javascript
// Call getBlock every 10 seconds
console.log("Running load balanced script...");
setInterval(() => getBlock("latest"), 10000);
```

### Run the code

Now to run the code, use the `node index.js` command in your terminal; the script will start and call the function every 10 seconds. Here is an example of what the response looks like:

```
Running load balanced script...
Using web3Virginia RPC
{
  blockNumber: 17629512,
  blockHash: '0xcf1535e6f7b84ba51e8ebb9bbf09be7f5caf99f6fc3ac063a5563be02d93f32f',
  parentHash: '0x7f812fd3d738b91c053a058d8bbc73fba839143f924f63371285545b35b7b460',
  size: 60657
}
getBlock: 573.178ms
Using web3London RPC
{
  blockNumber: 17629513,
  blockHash: '0x1243ec3a24465f01758cdf0bb40f02a64964832a47972fcb26fef488293392a7',
  parentHash: '0xcf1535e6f7b84ba51e8ebb9bbf09be7f5caf99f6fc3ac063a5563be02d93f32f',
  size: 24745
}
getBlock: 650.523ms
Using web3Singapore RPC
{
  blockNumber: 17629514,
  blockHash: '0x3a44d78b3c02e209ad671d6fa113b8e6ff4bfeafe8ee416573f4af668ac1fbed',
  parentHash: '0x1243ec3a24465f01758cdf0bb40f02a64964832a47972fcb26fef488293392a7',
  size: 50705
}
getBlock: 1.768s
```

Note how each request uses a different endpoint, and the execution times accurately reflect their respective locations. The Virginia endpoint, the closest to my location, provides a quicker response, while Singapore, the furthest, takes a bit longer.

## Conclusion

In this guide, we explored the robustness that Chainstack's global nodes can bring to your DApp. Additionally, we went into the creation of a load-balanced script using web3.js. This script not only distributes the load across various endpoints in different regions but also ensures redundancy, thereby enhancing the reliability and performance of your application.
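One thing worth hardening before production use: the retry logic above recurses indefinitely if every endpoint is down. A variation that caps retries at one pass over the endpoint list fails fast instead — a sketch, with providers modeled as plain async functions and the `makeBalancer` name being illustrative, not part of web3.js or ethers.js:

```javascript
// Round-robin balancer with a retry cap: each provider is tried at most
// once per call, so an outage of all endpoints produces a clear error
// instead of infinite recursion.
function makeBalancer(providers) {
  let counter = 0;
  return async function callWithFailover(...args) {
    let lastError;
    for (let attempt = 0; attempt < providers.length; attempt++) {
      const provider = providers[counter];
      // Advance the counter first so the next call starts on a fresh endpoint
      counter = (counter + 1) % providers.length;
      try {
        return await provider(...args);
      } catch (error) {
        lastError = error;
      }
    }
    throw new Error(`All ${providers.length} endpoints failed: ${lastError.message}`);
  };
}
```

In the tutorial's setting, each provider function would wrap one node's `getBlock` call; the balancer then rotates endpoints per request and falls over automatically on failure.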
### About the author

Developer Advocate @ Chainstack

BUIDLs on EVM, The Graph protocol, and Starknet

Helping people understand Web3 and blockchain development

[GitHub](https://github.com/soos3d) | [Twitter](https://twitter.com/web3Dav3) | [LinkedIn](https://www.linkedin.com/in/davide-zambiasi/)

# Manage your account

Source: https://docs.chainstack.com/docs/manage-your-account

## Create an account

[Sign up](https://console.chainstack.com/user/account/create) with Chainstack in a couple of steps.

### Email verification

Make sure you verify your email at the last step of the account creation process. Otherwise, your access to Chainstack will be restricted.

For subscription plan details, see [Pricing](https://chainstack.com/pricing/).

## Change your password

Navigate to your account **Settings** > **Personal**. In **Security**, click **Change password**. Provide your current password. Enter and repeat your new password. Click **Save**.

## Enable two-factor authentication

Two-factor authentication (2FA) adds an extra layer of security to your Chainstack account. Once enabled, you will be prompted to enter a code generated by your mobile device each time you log in. You can use apps like Authy, Google Authenticator, or Microsoft Authenticator.

To enable 2FA:

Go to the personal settings page. In **Security**, click **Enable two-factor authentication**. Provide your current password. Click **Next**. Download, print, or copy your 2FA recovery codes. These are the backup codes that you will need to restore your 2FA access if you lose it. Scan the QR code with your authenticator app or enter the code manually. Enter the code provided by the authenticator app in the 2FA wizard. Click **Enable 2FA**.

The 2FA recovery codes are shown once and cannot be retrieved. Make sure you keep the recovery codes secure.

## Disable two-factor authentication

Two-factor authentication (2FA) adds an extra layer of security to your Chainstack account.
Disabling 2FA will keep your account protected only with your password.

To disable 2FA:

Go to the personal settings page. In **Security**, click **Disable two-factor authentication**. Provide your current password. Click **Disable 2FA**.

## Recover your account protected with two-factor authentication

When you enable two-factor authentication (2FA) for your Chainstack account, you are provided with a list of recovery codes. These are the backup codes that you must use to restore your 2FA access if you lose it. The default name of the file that you might have downloaded with the recovery codes is `recoveryCodes.txt`.

To recover your account:

Provide your username and password at the [login](https://console.chainstack.com/) page. Click **Having problems? Enter your recovery code**. Provide your recovery code. Click **Verify**.

Each recovery code can only be used once.

# Manage your billing

Source: https://docs.chainstack.com/docs/manage-your-billing

Chainstack natively supports crypto payments.

### 150+ cryptocurrencies

Chainstack supports topping up your balance with 150+ cryptocurrencies.

Your crypto balance is the first to be charged for subscription, support, and node usage costs. If the balance is insufficient, your credit card will be charged for the outstanding amount.

The billing page is available for users with the [Admin role](/docs/manage-your-organization#users-and-their-roles). Here you can:

* Top up your balance.
* Edit your credit card details.
* Check for failed payments and settle them.
* View your current metered usage by category and its cost.
* Manage your organization subscription plan and support level.

## Manage the pay-as-you-go setting

With the pay-as-you-go setting enabled in your [Billing](https://console.chainstack.com/user/settings/billing), your Chainstack services will stay operational on reaching your plan's quota limit. Going over the limit will trigger the pay-as-you-go charges.
With the pay-as-you-go setting disabled, you introduce a hard limit to your plan. On reaching the quota, your Chainstack services will stop and you won't be charged on going over the limit.

In both cases, you will receive at least two email notifications—on using up 80% of your plan's quota and on using up 100%.

## Top up your balance with crypto

You can top up your balance with your Coinbase account or directly with cryptocurrency. If there's a token that you'd like to top up with, it's most likely supported. See the [150+ supported tokens and coins](https://nowpayments.io/supported-coins) with NOWPayments.

### Top up with a Coinbase account or wallet

To top up the balance with your Coinbase account:

1. Go to [Billing](https://console.chainstack.com/user/settings/billing).
2. Click **Top up**.
3. Click **Crypto**.
4. Pick an amount to top up with.
5. Click **Coinbase Commerce**.
6. Proceed to pay with a Coinbase account or any of your non-custodial wallets.

### Top up with NOWPayments

To top up the balance with NOWPayments:

1. Go to [Billing](https://console.chainstack.com/user/settings/billing).
2. Click **Top up**.
3. Click **Crypto**.
4. Click **NOWPayments**.
5. Pick an amount to top up with.
6. Pick a token to pay with.
7. Proceed to pay.

### Top up with a non-custodial wallet

You can top up the balance using your preferred non-custodial wallet. For example, [MetaMask](https://metamask.io/), [Coinbase Wallet](https://www.coinbase.com/wallet), and others.

To top up the balance with a non-custodial wallet:

1. Go to [Billing](https://console.chainstack.com/user/settings/billing).
2. Click **Top up with crypto**.
3. Select a predefined amount or type in your amount.
4. Click **Next**.
5. Select the cryptocurrency.
6. Send the cryptocurrency to the respective generated address.

## Edit your credit card details

If you need to edit your credit card details, do the following:

1. Go to [Billing](https://console.chainstack.com/user/settings/billing).
2. Next to **Credit card**, click **Edit**.
3. Enter new details.
4. Click **Update**.
The billing page will reload automatically and show your new credit card details.

## Settle failed payments

If both the crypto balance and credit card deduction tries were unsuccessful, a payment is considered failed. Depending on your plan level, your Chainstack services either get suspended on the first failed payment attempt or remain operational for three more days; once suspended, your API calls stop processing. Your Chainstack endpoints will remain the same and will be operational again when the failed payments are settled. If the failed payments are not settled within 14 days after the first failed attempt, your Chainstack endpoints will be deleted.

There are the following ways to settle failed payments:

* Add a new credit card. If its balance is sufficient, failed payments will be deducted automatically.
* Retry your payment manually and deduct the due amount either from your balance or credit card.

To retry your payment manually:

1. Make sure that a payment method of your choice has sufficient funds.
2. Go to [Billing](https://console.chainstack.com/user/settings/billing) > **Balance**.
3. Next to the failed payment, click **Retry with** and select one of the following:
   * **Credit card** — to deduct the amount due from a credit card linked to your account.
   * **Balance** — to deduct the amount due from your crypto balance. See how to [top it up](/docs/manage-your-billing#top-up-your-balance-with-crypto).

The amount due will be deducted from your crypto or credit card balance and the payment status will change to **Paid**.

## Manage your organization subscription plan and support level

You can manage two major options of your organization billing:

* Your subscription plan
* Your support level

Chainstack offers four types of subscription plans which differ depending on your Web3 objectives. The final price of each subscription plan is made up of a base charge for the features included and the number of API requests made to your deployed nodes.
[Each subscription plan also has a default support level](https://chainstack.com/pricing/) included in it. You can change your subscription plan any time, while the support level can be changed only for paid subscriptions. Switching between subscription plans and support levels occurs independently from each other.

### Changing your subscription plan

To change your subscription plan:

1. In the left navigation bar, click **Billing**.
2. Next to **Plan**, click **Change**.
3. Select a new subscription plan and click **Next**.
4. Check the details of your new subscription plan and click **Confirm**.

Your subscription plan changes immediately. Funds for the unused period of your previous subscription will be automatically returned to your balance. You will also be billed for any uninvoiced metered usage during the used period of your previous subscription.

### Rate limits for Developer plans

A free Developer plan has a rate limit of 30 requests per second. However, this does not affect paid subscription plans.

### Changing your support level

To change your support level:

1. In the left navigation bar, click **Billing**.
2. Next to **Support level**, click **Change**.
3. Select a new support level and click **Next**.
4. Check the details of your new support level and click **Confirm**.

Your support level changes immediately. Funds that you haven’t spent during your previous upgrade option will be automatically used as a part of your new support level charge if it’s Professional or returned to your balance if you have switched to Standard.

## Chainstack Subgraphs billing details

You can view your Chainstack Subgraphs usage details in the **Usage** section. The **Subgraphs requests** table provides the number of requests included in your plan, the number that you have used, and the number that you have used in excess of your allowance. The **Total subgraph hours** shows the total number of hours that your subgraphs have been active.
For information about how to change your Chainstack subscription plan to increase your allowance of included requests, see [Manage your organization subscription plan and support level](#manage-your-organization-subscription-plan-and-support-level).

## Chainstack Marketplace billing details

The Chainstack Marketplace is a collection of applications, including add-ons, plugins, and tools designed to extend the functionality of your nodes and infrastructure. Each application may have its own pricing model—such as free, subscription-based, or one-time payment—which is billed separately and in addition to your main Chainstack subscription plan. If a paid application is removed before the end of its billing cycle, a partial refund will be issued for the unused portion of the subscription.

# Manage your networks

Source: https://docs.chainstack.com/docs/manage-your-networks

1. Select a public chain project and click **Get started** or **Join network**. If you don't have a project yet, [create one](/docs/manage-your-project#create-a-project).
2. Select a network and a node you want to deploy.

## View network status

You can view the node and network status in the **Status** column of your nodes list or your networks list. A network can be in one of the following states:

| Network status | Description                                 |
| -------------- | ------------------------------------------- |
| Running        | The node is running and everything is okay. |
| Failed         | The node is `Failed` or in `Error`.         |

## Delete a network

To be able to delete a network, you must first delete all nodes associated with the network. See [Delete a node](/docs/manage-your-node#delete-a-node).

To delete a network:

1. Click your project.
2. Click a network in the project.
3. Click **Edit** > **Delete**.
# Manage your nodes

Source: https://docs.chainstack.com/docs/manage-your-node

## Add a node to a network

On the [Nodes](https://console.chainstack.com/nodes) page, add a node. The status will change from **Pending** to **Running** once deployed.

## View node access and credentials

To view the access information:

1. Click your project.
2. Click your network.
3. Click the node name.

This will give you the access and credentials to your nodes.

Examples of your HTTPS node connection endpoints:

```sh Key-protected
https://nd-123-456-789.p2pify.com/3c6e0b8a9c15224a8228b9a98ca1531d
```

```sh Password-protected
https://user-name:pass-word-pass-word-pass-word@nd-123-456-789.p2pify.com
```

Examples of your WSS node connection endpoints:

```sh Key-protected
wss://ws-nd-123-456-789.p2pify.com/3c6e0b8a9c15224a8228b9a98ca1531d
```

```sh Password-protected
wss://user-name:pass-word-pass-word-pass-word@ws-nd-123-456-789.p2pify.com
```

## View node requests metrics

You can view the data on requests made to your public chain project node.

To view the requests data:

1. Click your project.
2. Click your network.
3. Click the node name.

This will show you the data on the requests made to your node:

* Requests made — a chart of the total requests over HTTP and WebSocket for the selected period.
* Method calls — a breakdown of the method calls to the node over the selected period. Only available for HTTP requests.
* Response codes — a breakdown of the HTTP response status codes over the selected period. Only available for HTTP requests.

The data granularity is 1 minute. The available timeframes are 1 hour, 6 hours, 12 hours, 24 hours, and 7 days. You can also see your total organization's [statistics](/docs/see-statistics).

## Switch the period

To switch the period of the displayed requests data:

1. Navigate to the **Metrics** section.
2. On your right, select the period.

## Download the aggregate data

To download the aggregate requests data:

In the **Requests made** section, click the hamburger button.
Click the format to download the aggregate data.

## Check dedicated nodes resources allocation

You can view the resources dynamically allocated to each of your dedicated nodes.

To view the resources allocated to your nodes:

1. Click your project.
2. Click your network.
3. Click the node name.
4. Under **Resources**, hover over **Dynamic**.

## Delete a node

To be able to delete a project or a network, you must first delete all nodes associated with the network.

To delete a node:

1. Click your project.
2. Click a network in the project.
3. Select a peer node to delete.
4. Click **Edit** > **Delete**.

# Manage your organization

Source: https://docs.chainstack.com/docs/manage-your-organization

## Explaining organizations, users, and members

### Organization

An organization is the hierarchical ancestor of all your projects and nodes at Chainstack. When you first sign up with Chainstack, an organization is automatically created for you and you become the Owner user of this organization.

### Users and their roles

A user is an account invited to become a part of an organization. Each user in an organization can be assigned one of the following roles:

* Admin — the role can access and change the organization's billing settings, invite users, create and delete projects, deploy and delete nodes and networks, and change user roles.
* Editor — the role can create and delete projects, deploy and delete nodes and networks.
* Viewer — the role can view existing projects, nodes, and networks.

### Is Owner a separate role?

Owner is a special tag for the user who created the organization. Owner users have the same rights as Admin users, but their role cannot be changed.

### Members

A member is an organization invited to become a part of a consortium project.

A member can:

* Deploy nodes and networks in the project they are invited to.
* Access the project's nodes and network details.

A member cannot:

* Delete nodes and networks of other members.
* Access the private details of the networks and nodes deployed by other members, such as the default private keys, identities, and endpoints.

## Invite a user to the organization

To invite a user to your organization:

1. Click **Settings** > **Organization**.
2. Click **Invite user**.
3. Provide the user's email address.
4. Select the role: **Admin**, **Editor**, or **Viewer**.
5. Click **Invite**.

This will send an email invitation to join the organization.

## Change a user role in the organization

Once the invited user joins the organization, you can change the user's role.

To change the role:

1. Click **Settings** > **Organization**.
2. Next to the user, click the edit icon.
3. Select the new role: **Admin**, **Editor**, or **Viewer**.
4. Click **Update**.

This will change the role of the user. The user whose role is changed will need to log in again for the changes to take effect.

## Delete a user from the organization

To delete a user from the organization:

1. Click **Settings** > **Organization**.
2. Next to the user, click the trash bin icon.
3. Click **Yes, I'm sure** to confirm the action.

The user will be instantly deleted from your organization.

# Manage your projects

Source: https://docs.chainstack.com/docs/manage-your-project

## Create a project

You must create a project to be able to deploy a node or a network, or to join a network.

To create a project:

1. Log in to your [Chainstack account](https://console.chainstack.com/).
2. Click **Create project**.
3. Provide **Project name**.
4. Provide **Description** (optional).
5. Click **Create**.

## Add a network to a project

A project can have more than one network. To add a network to a project, see [Join a public network](/docs/manage-your-networks#join-a-public-network).

## Edit a project

To edit a project:

1. Click your project.
2. Click **Edit**.
3. Edit your **Project name** and/or **Description**.
4. Click **Save** to apply changes.

## Delete a project

To be able to delete a project, you must first delete all nodes associated with the project.
See [Delete a node](/docs/manage-your-node#delete-a-node).

To delete a project:

1. Click your project.
2. Click **Edit** > **Delete**.
3. Click **Yes, I'm sure** to confirm the action.

# Manage your subgraphs

Source: https://docs.chainstack.com/docs/manage-your-subgraphs

**TLDR:**

* Easily view existing subgraphs in the Chainstack console, filtering by protocol and checking status and details like owner and region.
* Monitor deployment status (sync in progress, up-to-date, or failed) and progress in real time using logs with error, warning, and debug filters.
* Delete any subgraph directly from its details page to clean up your workspace.

## View your subgraph

In [Subgraphs](https://console.chainstack.com/subgraphs), you can view a list of all existing subgraphs and can filter the view using the **Protocol** drop-down list. Click a subgraph to open the subgraph details, which include such information as the **Owner**, **Creation date**, **Region**, and **Protocol**. You can view the **Metrics** to check the number of requests made to the subgraph, view the subgraph **Deployment command**, and the **Subgraph Query** URLs.

## Monitor your subgraph status

The status of a subgraph is displayed in the **Status** column of the [Subgraphs](https://console.chainstack.com/subgraphs) list. The status will be one of the following:

* **Not deployed** — the subgraph has been created in the console but not yet deployed.
* **Deployed. Sync in progress** — the subgraph is in the process of being deployed and is syncing data.
* **Up-to-date** — the subgraph has been successfully deployed.
* **Failed** — the subgraph deployment was unsuccessful.

When you click a subgraph and open the subgraph details, you will see more information about the progress of the subgraph at the top of the page. If the subgraph status is **Deployed. Sync in progress**, you can view the percentage completion and the quantity of data that has synced.
You can click **Logs** for real-time status information about the progress of the sync. By default, the **Info** filter is selected, but you can also filter the logs by **Error**, **Warning**, and **Debug**, and can click **Load older logs** to change the time frame of the logs displayed.

## Delete a subgraph

To delete a subgraph, on the subgraph details page, click **Delete** and confirm by clicking **Yes, I am sure**. Your subgraph is deleted and you return to the [Subgraphs](https://console.chainstack.com/subgraphs) section of the console.

# Mastering custom JavaScript tracing for Ethereum Virtual Machine

Source: https://docs.chainstack.com/docs/mastering-custom-javascript-tracing-for-ethereum-virtual-machine

**TLDR**

* Provides an overview of custom JavaScript tracers on Ethereum (Geth or Erigon) for advanced debugging and selective data collection.
* Demonstrates using JS code with debug\_traceTransaction (and others) to log or transform EVM execution details.
* Explains how to flatten tracer scripts into JSON-RPC payloads, including a Node.js example to automate string conversion.
* Highlights key methods (step, enter, exit, fault, result) for capturing and returning custom trace data.

## Main article

### Available on customized dedicated nodes only

Custom JavaScript tracers are available as customized solutions on the [Enterprise plan](https://chainstack.com/pricing/) on [dedicated nodes](/docs/dedicated-node).

If you've landed on this article, it's likely that you're already acquainted with the concept of tracing. If not, we recommend starting with our [Deep dive into Ethereum debug\_trace APIs](https://chainstack.com/deep-dive-into-ethereum-trace-apis/) for a comprehensive understanding of tracing and the prevalent EVM tracing methods.

Tracing, in the context of Ethereum, refers to the process where Ethereum clients execute or re-execute a transaction or block, gathering crucial information produced during the execution.
This process serves as an invaluable tool for debugging, performance analysis, and a multitude of other applications. Ethereum provides built-in tracing functionality through its JSON-RPC API. There are three types of tracing supported by Ethereum: basic tracing, built-in tracers, and custom tracing.

You can also find details about [built-in tracers](/reference/ethereum-debug-trace-rpc-methods#pre-built-javascript-based-tracers) and [custom tracing](/reference/ethereum_node_api/debug_and_trace/custom_js_tracer) in the Chainstack documentation.

**Basic tracing** and **built-in tracers** serve as powerful debugging tools. They generate all the crucial data for most general purposes. Nevertheless, some situations demand more specific details to align with particular requirements, and sometimes the standard tracers output too much data when only something specific is needed. This is where custom JavaScript tracing comes into play, allowing for enhanced customization of the tracing output. This article will discuss custom JS tracing, which is one of the custom tracing methods that Ethereum supports.

## What is custom JavaScript tracing?

Custom JavaScript tracing enables developers to create custom tracing scripts that can be executed on the Ethereum node. It is available on both Geth and Erigon. For an in-depth understanding of the RPC methods available on Geth and Erigon, we invite you to explore our comprehensive guide [Geth vs Erigon: Deep dive into RPC methods on Ethereum clients](/docs/geth-vs-erigon-deep-dive-into-rpc-methods-on-ethereum-clients).

You can use custom JavaScript tracing in conjunction with popular tracing methods such as `debug_traceCall`, `debug_traceTransaction`, `debug_traceBlockByHash`, and `debug_traceBlockByNumber`.

Please note that JS tracing is a feature that can be disabled, and not all nodes or RPC providers support it. The simplest way to test whether your endpoint supports JS tracing is to submit a sample request to it.

To use JS tracing the vanilla way, pass the JS code to the `tracer` parameter and send it over an HTTPS JSON-RPC request.
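As a quick way to run such a check from Python, a minimal do-nothing tracer can be wrapped into a JSON-RPC payload and sent to the node. This is a hedged sketch: the endpoint placeholder, the `probe_payload` helper name, and the sample transaction hash are illustrative, not part of any Chainstack API.

```python
import json

# Placeholder — replace with your own node URL.
ENDPOINT = "YOUR_CHAINSTACK_ENDPOINT"

# Minimal JS tracer: does nothing per step, it only proves
# that the node accepts a "tracer" string at all.
PROBE_TRACER = (
    "{data: [], fault: function(log) {}, "
    "step: function(log) {}, "
    "result: function() { return this.data; }}"
)

def probe_payload(tx_hash):
    """Build a debug_traceTransaction request carrying a JS tracer string."""
    return {
        "jsonrpc": "2.0",
        "method": "debug_traceTransaction",
        "params": [tx_hash, {"tracer": PROBE_TRACER}],
        "id": 1,
    }

payload = probe_payload(
    "0x2595b06198245b5b2c92b1316c5c5e92edac0a611250ae53f8961468a73a55a2"
)
print(json.dumps(payload)[:40])

# To probe a live endpoint, POST the payload, for example with requests:
#   resp = requests.post(ENDPOINT, json=payload).json()
# An "error" key in resp suggests JS tracing is disabled on that node.
```

If the response contains a result rather than an error, the endpoint accepts custom JS tracers.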
Below are some examples of using JS tracing. The following sample requests are taken from [the official Geth documentation](https://geth.ethereum.org/docs/developers/evm-tracing/custom-tracer). The code simply finds all the stack positions for all `CALL` opcodes using the `log.stack.peek` function.

### Example with the `debug_traceCall` method

In this case, the custom tracer object encompasses JavaScript code that will execute at every stage of the Ethereum Virtual Machine (EVM) during transaction execution. This specific tracer captures the first value from the EVM's stack each time a `CALL` opcode is detected, storing the collected values in the `data` array.

```bash cURL
curl --location 'YOUR_CHAINSTACK_ENDPOINT' \
--header 'Content-Type: application/json' \
--data '{
  "jsonrpc": "2.0",
  "method": "debug_traceCall",
  "params": [
    {
      "from": "0xdeadbeef29292929192939494959594933929292",
      "to": "0xde929f939d939d393f939393f93939f393929023",
      "gas": "0x7a120",
      "data": "0xf00d4b5d00000000000000000000000001291230982139282304923482304912923823920000000000000000000000001293123098123928310239129839291010293810"
    },
    "0x7765",
    {
      "tracer": "{data: [], fault: function(log) {}, step: function(log) { if(log.op.toString() == '\''CALL'\'') this.data.push(log.stack.peek(0)); }, result: function() { return this.data; }}"
    }
  ],
  "id": 1
}'
```

### Example with the `debug_traceTransaction` method

The custom tracer here checks each operation (step) in the transaction, and if the operation is a `CALL`, it pushes the top item on the stack (`log.stack.peek(0)`) into the `data` array. The result of the tracer is this `data` array.
```bash cURL
curl --location 'YOUR_CHAINSTACK_ENDPOINT' \
--header 'Content-Type: application/json' \
--data '{
  "jsonrpc": "2.0",
  "method": "debug_traceTransaction",
  "params": [
    "0x2595b06198245b5b2c92b1316c5c5e92edac0a611250ae53f8961468a73a55a2",
    {
      "tracer": "{data: [], fault: function(log) {}, step: function(log) { if(log.op.toString() == '\''CALL'\'') this.data.push(log.stack.peek(0)); }, result: function() { return this.data; }}"
    }
  ],
  "id": 1
}'
```

If running those examples returns an error, it means your node does not support custom JS tracing.

## Node.js example

In theory, it's possible to use custom JavaScript (JS) tracing solely through an HTTPS endpoint. However, this approach carries numerous restrictions. If your objective is to exclusively use HTTP requests, there are certain considerations you must take into account:

* When using a JSON object in request parameters, double-quoted strings are not supported. Code like `log.op.toString() == "CALL"` needs to be converted to single quotes, `log.op.toString() == 'CALL'`, before using it as a parameter.
* Simplify the code by flattening it and eliminating all comments.

The [official example](https://geth.ethereum.org/docs/developers/evm-tracing/javascript-tutorial) demonstrates the construction of tracer objects as strings.

```javascript Javascript
tracer = function (tx) {
  return debug.traceTransaction(tx, {
    tracer:
      '{' +
      'retVal: [],' +
      'step: function(log,db) {this.retVal.push(log.getPC() + ":" + log.op.toString())},' +
      'fault: function(log,db) {this.retVal.push("FAULT: " + JSON.stringify(log))},' +
      'result: function(ctx,db) {return this.retVal}' +
      '}'
  }); // return debug.traceTransaction ...
}; // tracer = function ...
```

This is not a scalable approach and is incompatible with integrated development environments (IDEs) like Visual Studio Code. It also requires reformatting the code into a lengthy string every time you want to make changes before submitting it to the server.
In the next section, we'll develop a simple JavaScript program that handles the process in a more elegant way. This code will take a JavaScript tracer object and automate its transformation into the requisite tracer string.

## Prerequisites

To follow along with this tutorial, ensure you have the following prerequisites:

* An [Ethereum node that supports the debug and trace APIs](/docs/debug-and-trace-apis), for example, a Chainstack node running Erigon.
* [Node.js](https://nodejs.org/en/download) and npm installed on your system.

### Step 1: Initialize

In this example, we will use Node.js as our primary platform. Create an empty project directory and initialize the project using the following command:

```shell Shell
npm init
```

### Step 2: Install dependencies

To communicate with a server over an HTTP connection, install the `node-fetch` package:

```shell Shell
npm install node-fetch@2
```

### Step 3: Define the tracing script

We will define a custom tracing script that outputs the execution trace of a smart contract.
First, create a new JS file called `trace.js`:

```shell Shell
echo $null >> trace.js
```

Paste the following code in `trace.js`. Note that the result of an `SLOAD` only appears on top of the stack on the step *after* the opcode executes, so the `afterSload` check must run at the start of `step`, before the `SLOAD` detection:

```javascript trace.js
const fetch = require("node-fetch");

const url = "YOUR_CHAINSTACK_ENDPOINT";

tracer = {
  retVal: [],
  afterSload: false,
  callStack: [],
  byte2Hex: function(byte) {
    if (byte < 0x10) return "0" + byte.toString(16);
    return byte.toString(16);
  },
  array2Hex: function(arr) {
    let retVal = "";
    for (let i = 0; i < arr.length; i++) retVal += this.byte2Hex(arr[i]);
    return retVal;
  },
  getAddr: function(log) {
    return this.array2Hex(log.contract.getAddress());
  },
  step: function(log, db) {
    // SLOAD result: the loaded value is on top of the stack on the
    // step following the SLOAD, so check this before detecting opcodes
    if (this.afterSload) {
      this.retVal.push("  Result: " + log.stack.peek(0).toString(16));
      this.afterSload = false;
    }
    let opcode = log.op.toNumber();
    // SLOAD
    if (opcode == 0x54) {
      this.retVal.push(log.getPC() + ": SLOAD " + this.getAddr(log) + ":" + log.stack.peek(0).toString(16));
      this.afterSload = true;
    }
    // SSTORE
    if (opcode == 0x55)
      this.retVal.push(log.getPC() + ": SSTORE " + this.getAddr(log) + ":" + log.stack.peek(0).toString(16) + " <- " + log.stack.peek(1).toString(16));
    // End of step
  },
  fault: function(log, db) {
    this.retVal.push("FAULT: " + JSON.stringify(log));
  },
  result: function(ctx, db) {
    return this.retVal;
  }
};

// Flatten the tracer's code into a single-quoted string
function getTracerString(tracer) {
  result = "{";
  for (property in tracer) {
    if (typeof tracer[property] == "function")
      result = result + property.toString() + ": " + tracer[property];
    else
      result = result + property.toString() + ": " + JSON.stringify(tracer[property]);
    result += ",";
  }
  result += "}";
  return result.replace(/"/g, "'");
}

async function main() {
  const response = await fetch(url, {
    method: 'POST',
    headers: {
      'Accept': 'application/json',
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      "method": "debug_traceTransaction",
      "params": ["0x2f1c5c2b44f771e942a8506148e256f94f1a464babc938ae0690c6e34cd79190", {"tracer": getTracerString(tracer)}],
      "id": 1,
      "jsonrpc": "2.0"
    }),
  });

  result = await response.json();
  console.log(JSON.stringify(result));
  console.log("***end***");
}

main();
```

To see the trace in action, add your Chainstack endpoint and run `node trace`. This example traces the contract creation code for USDT at transaction [0x2f1c5c2b44f771e942a8506148e256f94f1a464babc938ae0690c6e34cd79190](https://etherscan.io/tx/0x2f1c5c2b44f771e942a8506148e256f94f1a464babc938ae0690c6e34cd79190).

## Understanding the code

The `main` function runs a POST request with the `debug_traceTransaction` method on a transaction hash. The `getTracerString` function converts the `tracer` object to a string representation. This is necessary because the `tracer` object must be passed as a string in the JSON-RPC request.

The `tracer` object comes equipped with a suite of methods designed to process transaction details:

* `byte2Hex` and `array2Hex` — utility functions responsible for converting byte and array data into hexadecimal form.
* `getAddr` — extracts the contract's address from the log.
* `step` — invoked for every opcode (operation code) in the transaction; this method is crucial for tracking operations and documenting any modifications to storage. For clarity, `SLOAD` and `SSTORE` refer to the EVM opcodes for loading and storing data.
* `fault` — if an error or exception arises within the transaction, this method gets called and logs the error for record-keeping.
* `result` — at the end of execution, this method returns an array with all the logs collected during the process.

## How does JavaScript interact with Ethereum clients?

You may have observed that custom JavaScript tracing requires providing JavaScript code to the Ethereum client. But how does this work? The key lies in Ethereum clients' ability to execute JavaScript code, made possible through a JavaScript implementation written in the Go programming language.
Let's take Geth as an example: it uses [Goja](https://github.com/dop251/goja), which enables [JavaScript tracing on Geth](https://github.com/ethereum/go-ethereum/blob/9231770811cda0473a7fa4e2bccc95bf62aae634/eth/tracers/js/goja.go#L97).

### What is Goja?

Goja is an ECMAScript 5.1 (JavaScript) implementation in Go, offering the ability to run, manipulate, and test JavaScript within Go applications. It enables JavaScript scripting in a Go environment, interoperability with JavaScript code, and server-side rendering. When JavaScript tracing code is sent to Geth, Goja interprets it and executes it alongside the main program.

### Custom JS tracing syntax

To use JS tracing, you must define the following functions in your code:

* `result` (mandatory) — defines what is returned to the RPC caller and takes two arguments: `ctx` and `db`.
* `fault` (mandatory) — is invoked when an error occurs during tracing and takes two arguments: `log` and `db`. Information about the error can be obtained using the `log.getError()` method.
* `setup` — is invoked once at the beginning when the tracer is being constructed. It takes one argument, `config`, which is tracer-specific and allows users to pass in [options](https://geth.ethereum.org/docs/developers/evm-tracing/built-in-tracers#config) to the tracer.
* `step` — is called for each execution step during tracing and is the main execution function for JS tracing. It takes two arguments: `log` and `db`.
* `enter` and `exit` (must be used in combination with each other) — are called when stepping in and out of an internal call. They are specifically called for the `CALL` and `CREATE` variants, as well as for the transfer implied by a `SELFDESTRUCT`. The `enter` function takes a `callFrame` object as an argument, and the `exit` function takes a `frameResult` object as an argument.
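As an illustrative sketch (not from the article), the hooks above can be combined into a small call-counting tracer. Here it is held as a plain Python string, ready to be passed as the `tracer` parameter of a `debug_trace*` request; the tracer's counting logic and variable names are assumptions for illustration only.

```python
# Illustrative call-tracer skeleton: counts internal calls via the
# enter/exit hooks and returns the count from the mandatory result hook.
CALL_COUNT_TRACER = (
    "{"
    "calls: 0,"
    "enter: function(frame) { this.calls++; },"         # fires on CALL/CREATE variants
    "exit: function(frameResult) {},"                   # must be paired with enter
    "fault: function(log, db) {},"                      # mandatory error hook
    "result: function(ctx, db) { return this.calls; }"  # mandatory return hook
    "}"
)

# Sanity check: the hooks the node requires are all present in the string.
for hook in ("result:", "fault:", "enter:", "exit:"):
    assert hook in CALL_COUNT_TRACER
print("tracer string ready:", len(CALL_COUNT_TRACER), "chars")
```

The string form already uses single quotes only (none here at all), so it can go into a JSON-RPC payload without the quote conversion discussed earlier.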
To learn more about how these functions work, refer to [this official tutorial](https://geth.ethereum.org/docs/developers/evm-tracing/built-in-tracers#config).

## Conclusion

Custom JS tracing allows developers to create scripts that can be executed on the Ethereum node using popular methods like `debug_traceCall`, `debug_traceTransaction`, `debug_traceBlockByHash`, and `debug_traceBlockByNumber`. The article provides practical examples of how to use custom JS tracing, including running sample requests to endpoints and implementing a custom tracer script using Node.js, which serves as a more scalable approach compared to constructing tracer objects as strings.

This concludes the tutorial. I hope you found it useful. Thank you for reading. If you have any questions, feel free to reach out to me on Chainstack's [Telegram](https://t.me/chainstack) or [Discord](https://discord.gg/Cymtg2f7pX) or on my social media. Cheers!

### About the author

Developer Advocate @ Chainstack

BUIDLs on Ethereum, zkEVMs, The Graph protocol, and IPFS

[](https://twitter.com/wuzhongzhu) [](https://www.linkedin.com/in/wuzhong-zhu-44563589/) [](https://github.com/wuzhong-zhu)

# Mastering multithreading in Python for Web3 requests: Comprehensive guide

Source: https://docs.chainstack.com/docs/mastering-multithreading-in-python-for-web3-requests-a-comprehensive-guide

**TLDR**

* Demonstrates how to use Python multithreading (via `ThreadPoolExecutor` and `asyncio`) to speed up I/O-bound Ethereum RPC calls.
* Sequential requests can lead to heavy wait times; sending them concurrently leverages idle CPU cycles and improves efficiency.
* Shows how to handle exceptions cleanly, share state safely, and choose a suitable number of worker threads.
* While the GIL can limit CPU-bound threading, network requests benefit significantly from concurrency for faster performance.

## Main article

This guide is designed for developers using the web3.py library to interact with Ethereum.
We aim to unravel the mystique of multithreading and show you how to bring this powerful technique to bear on your Web3 requests.

But first, what is multithreading? Multithreading allows a program to run multiple flows of execution simultaneously, like sending one request while processing another. This is particularly useful when dealing with I/O tasks such as network requests, where the program spends a lot of time waiting. Multithreading can significantly speed up your program by allowing it to do other tasks during these waiting times, which also lets you use system resources efficiently.

By the end of this guide, you'll have a good idea about working with and without multithreading, handling exceptions, and following best practices to ensure your code is efficient and robust.

## Understanding multithreading

Multithreading is a technique that allows multiple parts of a single program to run concurrently. But what does that mean, exactly? Let's break it down.

In the simplest terms, a thread is the smallest unit of processing that can be performed in an OS (operating system). In the context of programming, a thread is a flow of execution of a program. A single-threaded program has just one flow and can perform one operation at a time. Conversely, multithreading enables a program to control multiple threads, essentially allowing multiple operations to run in parallel.

Multithreading is a technique to achieve asynchronous behavior in a program. This is where things get exciting. With multithreading, we can run different parts of our program at the same time, as if they were separate miniature programs. This is particularly handy when performing I/O operations, like network requests or file reads and writes, which can often take significant time to complete. Multithreading allows us to start multiple operations simultaneously instead of waiting for one operation to finish before the next.
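The difference between waiting serially and overlapping the waits can be seen with a toy sketch (not from this guide), where `time.sleep` stands in for a network call and the timings are approximate:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_io(n):
    """Stand-in for a network call: sleep briefly, then return the input."""
    time.sleep(0.2)
    return n

start = time.time()
sequential = [fake_io(n) for n in range(5)]       # waits run back to back (~1 s)
seq_elapsed = time.time() - start

start = time.time()
with ThreadPoolExecutor(max_workers=5) as pool:
    threaded = list(pool.map(fake_io, range(5)))  # waits overlap (~0.2 s)
thr_elapsed = time.time() - start

print(f"sequential: {seq_elapsed:.2f}s, threaded: {thr_elapsed:.2f}s")
```

Both runs produce the same results; only the total waiting time changes, which is exactly the property we will exploit for RPC requests below.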
This means we can fetch data from one part of a blockchain while sending a transaction request to another, all within the same program.

Multithreading is accessible in Python primarily because the language offers built-in support via the `threading` and `concurrent.futures` modules, among others. These modules provide high-level, easy-to-use APIs for creating and managing threads. They handle a lot of the complex details of thread management behind the scenes, making it easier for developers to leverage multithreading in their applications.

But why does this matter? Well, think about the benefits:

* **Efficiency**. By allowing multiple operations to run concurrently, we make better use of our system's resources. We can start processing a new request as soon as we've sent an old one rather than waiting for a response.
* **Performance**. Multithreading can significantly speed up our programs. This is particularly noticeable in programs that make a lot of I/O requests, like those interacting with a blockchain.
* **Responsiveness**. In user-facing applications, multithreading can make our program feel more responsive. We can continue to interact with the user interface while waiting for I/O operations to complete.

However, it's important to note that multithreading comes with its own set of challenges. Issues such as thread synchronization, deadlocks, and race conditions need to be handled carefully; we'll tackle these in later sections.

In the next sections, we'll dive into how to use multithreading in Python to improve the performance of your Chainstack RPC endpoint. You'll learn how to send multiple requests to your RPC node concurrently, which can significantly speed up tasks like fetching data from the blockchain or waiting for transactions to be mined. Let's get started!

## Setting up your environment

Before we dive into the code, let's set up our development environment.
This will ensure you have all the necessary tools and libraries installed to follow along with this guide. Here's what you'll need:

* **Python**. We'll be using Python for our examples because of its readability and the robust libraries it offers for working with Ethereum. We recommend using Python 3.8 or newer. You can download Python from the [official website](https://www.python.org/downloads/) if you don't have it installed already.
* **web3.py**. This is a Python library for interacting with Ethereum. It's a comprehensive tool that lets us connect to Ethereum nodes, make requests, and work with Ethereum data. You can install [web3.py](https://github.com/ethereum/web3.py) using pip, the Python package installer. Open a terminal and run the following command:

```shell Shell
pip install web3
```

* **An Ethereum archive node**. We'll interact with the Ethereum network for this guide. We'll need access to an Ethereum archive node since we'll query older states. To get an RPC endpoint, follow these steps:

  1. [Sign up with Chainstack](https://console.chainstack.com/user/account/create).
  2. [Deploy a node](/docs/manage-your-networks#join-a-public-network).
  3. [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials).

## Creating a simple Web3 script without multithreading

Let's start with a simple example of making Web3 requests without multithreading. We'll write a script fetching an Ethereum address's balance at various block numbers.
Here's the code:

```python Python
from web3 import Web3
import time

web3 = Web3(Web3.HTTPProvider("YOUR_CHAINSTACK_ENDPOINT"))

address = "0x1f9090aaE28b8a3dCeaDf281B0F12828e676c326"
start_block = web3.eth.block_number
end_block = start_block - 500

def get_balance_at_block(block_num):
    balance = web3.eth.get_balance(address, block_identifier=block_num)
    print(f"Balance at block {block_num}: {web3.from_wei(balance, 'ether')} ETH")

start = time.time()
for block_num in range(start_block, end_block, -1):
    get_balance_at_block(block_num)
print(f"Time taken: {time.time() - start}")
```

Now, let's walk through the code, line by line:

1. **Import necessary modules**. We start by importing the `Web3` module from the `web3` package. We also import the `time` module, which we'll use to measure the execution time of our script.
2. **Set up the Web3 provider**. We set up a connection to our Ethereum node using `Web3.HTTPProvider`. Replace `YOUR_CHAINSTACK_ENDPOINT` with the link to your Ethereum node.
3. **Define the Ethereum address and block range**. We specify the Ethereum address we're interested in and the range of block numbers we want to check. We use `web3.eth.block_number` to get the latest block number and subtract 500 to get the start of our range.
4. **Define the function for fetching the balance**. We define a function `get_balance_at_block` that takes a block number as input, fetches the balance of our specified address at that block number, and prints the balance in ether (converted from wei).
5. **Fetch the balance at each block number**. We loop over the range of block numbers from `start_block` to `end_block`, calling `get_balance_at_block` for each one. Since we're not using multithreading, these requests are made sequentially, meaning the script waits for each request to complete before moving on to the next one.
6. **Measure and print the execution time**.
We use `time.time()` to get the current time at the start and end of our script, subtract the two to get the total execution time, and print the result. This script provides a baseline for understanding how long it takes to make these Web3 requests without multithreading. The result is heavily dependent on your location relative to the node’s location. The next section modifies this script to use multithreading and compares the performance. ## Creating a simple Web3 script with multithreading Now let's take our previous example and modify it to use multithreading. This will allow us to make multiple Web3 requests concurrently, potentially speeding up our script. Here's the modified code: ```python Python import asyncio from concurrent.futures import ThreadPoolExecutor from web3 import Web3 import time web3 = Web3(Web3.HTTPProvider("YOUR_CHAINSTACK_ENDPOINT")) address = "0x1f9090aaE28b8a3dCeaDf281B0F12828e676c326" start_block = web3.eth.block_number end_block = start_block - 500 max_workers = 100 def get_balance_at_block(block_num): balance = web3.eth.get_balance(address, block_identifier=block_num) print(f"Balance at block {block_num}: {web3.from_wei(balance, 'ether')} ETH") async def main(): with ThreadPoolExecutor(max_workers=max_workers) as executor: tasks = [ loop.run_in_executor( executor, get_balance_at_block, block_num ) for block_num in range(start_block, end_block, -1) ] await asyncio.gather(*tasks) loop = asyncio.get_event_loop() start = time.time() loop.run_until_complete(main()) print(f"Time taken: {time.time() - start}") ``` Now let's go through the changes step by step: 1. **Additional imports**. We import `asyncio` and `ThreadPoolExecutor` from `concurrent.futures`. `asyncio` is a library for writing single-threaded concurrent code using coroutines, multiplexing I/O access over sockets and other resources, running network clients and servers, and other related primitives.
`ThreadPoolExecutor` is a class that creates a pool of worker threads and provides a simple way to offload tasks to them. 2. **Creating the `ThreadPoolExecutor`**. Inside the `main` function, we create a `ThreadPoolExecutor` with a maximum of 100 worker threads. These threads will be used to run our `get_balance_at_block` function concurrently. 3. **Creating tasks**. We create a list of tasks, where each task is a call to `get_balance_at_block` for a different block number. Each of these tasks is run in the executor, meaning it's run in a separate thread. This is done using the `loop.run_in_executor` method, which schedules the callable to be executed and returns a `Future` object representing the execution of the callable. 4. **Running tasks concurrently**. We use `asyncio.gather(*tasks)` to run these tasks concurrently. This function returns a Future aggregating the results of the given Future or coroutine objects, which completes when all of the given Futures are complete. 5. **Running the event loop**. Finally, we use `loop.run_until_complete(main())` to run the event loop until the `main()` function has completed. This starts the execution of the tasks in the executor and waits for them to complete. Using a `ThreadPoolExecutor` and `asyncio`, we can make multiple Web3 requests concurrently, potentially speeding up our script significantly compared to the sequential version. In the next section, we'll compare the performance of this multithreaded version with the sequential version and discuss some of the considerations and best practices when using multithreading in Python. ### Explain `ThreadPoolExecutor` and workers Now, let's dive deeper into `ThreadPoolExecutor` and the concept of worker threads, as this is the core concept behind the multithreaded approach. In Python's `concurrent.futures` module, a `ThreadPoolExecutor` is a class that creates a pool of worker threads and provides methods to submit tasks to this pool.
Once a task is submitted, a worker thread picks it up and executes it. When a worker thread finishes executing a task, it becomes available to pick up another task. The parameter `max_workers` defines the maximum number of worker threads the executor can use. This doesn't mean the executor will always use this many threads; it just won't use more than this. If you submit more tasks than `max_workers`, the executor will queue the extra tasks and execute them as worker threads become available. Choosing the right value for `max_workers` depends on the nature of the tasks and the resources available. * If the tasks are I/O-bound (for example, making network requests), like in this case, you can benefit from a relatively high number of worker threads, as these tasks spend much of their time waiting for I/O operations to complete. While one thread is waiting for its I/O operation, other threads can be executing. * However, if the tasks are CPU-bound, having more worker threads than the number of CPUs can lead to context switching overhead and won't usually provide any speedup due to Python's global interpreter lock (GIL). * You also need to consider the resources available. Each thread consumes some amount of memory, so having many worker threads could consume a lot of memory. In the example, we set `max_workers` to 100, which means the executor will use up to 100 threads to execute the `get_balance_at_block` function concurrently. I used this number because although my machine runs a 16-core CPU, the tasks are I/O-bound, so we can leverage the CPU's idle time while waiting for the server to respond. If the tasks were CPU-bound, we would want to cap the workers at 16. Also, after running multiple tests, this number of workers gives me the best balance between speed, resource consumption, and server response/stability.
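The cap that `max_workers` imposes is easy to observe in a minimal, node-free sketch that counts how many tasks are running at once. The task body and the numbers here are illustrative, not part of the guide's script:

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor

active = 0  # tasks currently running
peak = 0    # highest concurrency observed
lock = threading.Lock()

def task(i):
    global active, peak
    with lock:
        active += 1
        peak = max(peak, active)
    time.sleep(0.05)  # simulate waiting on I/O
    with lock:
        active -= 1
    return i

# 20 tasks are submitted, but only 4 worker threads exist;
# the extra tasks wait in the executor's internal queue
with ThreadPoolExecutor(max_workers=4) as executor:
    results = list(executor.map(task, range(20)))

print(f"Peak concurrency: {peak}")  # never exceeds max_workers
```

`executor.map` also preserves submission order in its results, even though the tasks finish in an arbitrary order.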
This provides a significant speedup compared to the sequential version, as while one thread is waiting for a response from the Ethereum node, other threads can send their requests or process received data. Be aware that the Ethereum node might also have limits on how many concurrent requests it can handle. If you make too many requests at once, it might slow down and start rejecting requests. However, Chainstack does not throttle requests, meaning you should not experience issues if you stay under 3,000 requests per second. ## Understanding the differences Now that we've seen both a sequential and a multithreaded approach to making Web3 requests, let's compare the two and understand the differences in performance and efficiency. * **Sequential approach**. In the sequential version of the script, we made one request at a time. After sending a request, we waited for the response before sending the next request. This is a straightforward approach, and it's easy to understand what's happening at each step. However, this approach doesn't make efficient use of our resources. While waiting for a response from the Ethereum node, our program isn't doing anything else. In my test, this approach took 88 seconds to complete. * **Multithreaded approach**. In the multithreaded version of the script, we used a `ThreadPoolExecutor` to create a pool of worker threads. We then used these threads to send multiple requests concurrently. While one thread is waiting for a response, other threads can send their requests or process received data. This approach can be more efficient because it allows us to do more work at the same time. In my test, the multithreaded approach took only 2 seconds, about 97% faster than the sequential approach. These results demonstrate the potential benefits of multithreading for making Web3 requests. By sending multiple requests concurrently, we can significantly speed up our script and make better use of our system's resources.
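The shape of this speedup can be reproduced without a node by simulating network latency with `time.sleep`; the timings are illustrative and will vary by machine:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_request(i):
    time.sleep(0.05)  # stand-in for a network round trip
    return i

# Sequential: one simulated request at a time
start = time.time()
sequential = [fake_request(i) for i in range(100)]
sequential_time = time.time() - start

# Multithreaded: all simulated requests in flight at once
start = time.time()
with ThreadPoolExecutor(max_workers=100) as executor:
    threaded = list(executor.map(fake_request, range(100)))
threaded_time = time.time() - start

print(f"Sequential: {sequential_time:.2f}s, threaded: {threaded_time:.2f}s")
```

Because the task is pure waiting, the threaded run takes roughly one sleep interval instead of one hundred, mirroring the 88-second vs 2-second gap observed against a real node.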
However, it's important to note that multithreading comes with its own set of challenges. Managing multiple threads can add complexity to our code, and we need to be careful to avoid issues like race conditions, where two threads try to modify a shared resource at the same time. Also, if we try to make too many requests at once, we might overwhelm the Ethereum node or run into rate limits. ## Organizing the response While running the code, you may observe that the results are printed in the order they are received, rather than in block order, which would seem more logical. With a small modification, we can sort the output by block number. ```python Python import asyncio from concurrent.futures import ThreadPoolExecutor from web3 import Web3 import time web3 = Web3(Web3.HTTPProvider("YOUR_CHAINSTACK_ENDPOINT")) address = "0x1f9090aaE28b8a3dCeaDf281B0F12828e676c326" start_block = web3.eth.block_number end_block = start_block - 500 max_workers = 100 def get_balance_at_block(block_num): try: balance = web3.eth.get_balance(address, block_identifier=block_num) return (block_num, web3.from_wei(balance, 'ether')) # Return a tuple except Exception as e: print(f"Error occurred while getting balance at block {block_num}: {e}") raise async def main(): with ThreadPoolExecutor(max_workers=max_workers) as executor: loop = asyncio.get_event_loop() futures = [ loop.run_in_executor( executor, get_balance_at_block, block_num ) for block_num in range(start_block, end_block, -1) ] results = [] for future in asyncio.as_completed(futures): try: # This will raise an exception if the thread raised an exception result = await future results.append(result) # Collect the results except Exception as e: print(f"Error occurred in thread: {e}") # Sort the results by block number and print them results.sort(key=lambda x: x[0]) for block_num, balance in results: print(f"Balance at block {block_num}: {balance} ETH") loop =
asyncio.get_event_loop() start = time.time() loop.run_until_complete(main()) print(f"Time taken: {time.time() - start}") ``` In this version, the `get_balance_at_block` function returns a tuple `(block_num, balance)`. These tuples are collected in the `results` list. After all futures are completed, the `results` list is sorted by block number (the first element of each tuple), and the results are printed in order. ## Handling errors and exceptions in multithreaded architecture As with any code, errors and exceptions can occur in a multithreaded architecture. Handling these errors properly is crucial for ensuring the robustness and reliability of your code. In a multithreaded context, error handling can be slightly more complex because errors can occur in multiple threads simultaneously. Here are some common errors and exceptions that you might encounter and how to handle them: 1. Rate limit errors. Since multithreading allows you to make more requests quickly, you might encounter a rate limit. Chainstack does not impose hard limits, but you might still get errors from sending too many requests at once, so it is recommended to stay below 3,000 RPS. Rate limit errors can be handled by catching the appropriate exception in a `try`/`except` block. You should also include logic to delay or reduce the rate of your requests if you encounter a rate limit error. 2. Thread errors. These are errors related to the management of threads, such as creating too many threads or problems with thread synchronization. If too many threads are created, you might run into system limits or performance issues due to the overhead of managing a large number of threads. 3. Synchronization errors. These occur when multiple threads try to modify a shared resource at the same time, leading to race conditions, inconsistent results, or even data corruption. While these errors are less likely to occur in the context of Web3 requests, they're an important consideration when working with multithreaded applications. 4.
Unhandled exceptions in threads. If an unhandled exception occurs in a thread, it can cause the thread to terminate unexpectedly. This might lead to resource leaks if the thread doesn't clean up its resources before terminating, and it can also lead to incomplete results if the thread doesn't finish its task. Let's discuss some strategies for handling each of these. **Handling thread errors**. You can handle thread errors by limiting the number of threads that your program creates and by catching and handling any exceptions that occur when creating or managing threads. For example: ```python Python import asyncio from concurrent.futures import ThreadPoolExecutor from web3 import Web3 import time web3 = Web3(Web3.HTTPProvider("YOUR_CHAINSTACK_ENDPOINT")) address = "0x1f9090aaE28b8a3dCeaDf281B0F12828e676c326" start_block = web3.eth.block_number end_block = start_block - 500 max_workers = 100 def get_balance_at_block(block_num): try: balance = web3.eth.get_balance(address, block_identifier=block_num) print(f"Balance at block {block_num}: {web3.from_wei(balance, 'ether')} ETH") except Exception as e: print(f"Error occurred while getting balance at block {block_num}: {e}") async def main(): with ThreadPoolExecutor(max_workers=max_workers) as executor: loop = asyncio.get_event_loop() futures = [ loop.run_in_executor( executor, get_balance_at_block, block_num ) for block_num in range(start_block, end_block, -1) ] for future in asyncio.as_completed(futures): try: # This will raise an exception if the thread raised an exception result = await future except Exception as e: print(f"Error occurred in thread: {e}") loop = asyncio.get_event_loop() start = time.time() loop.run_until_complete(main()) print(f"Time taken: {time.time() - start}") ``` **Handling synchronization errors**.
If your threads are sharing resources, you should use synchronization primitives like locks, semaphores, or condition variables to ensure that only one thread modifies the shared resource at a time. **Handling unhandled exceptions in threads**. To handle unhandled exceptions in threads, you can catch and handle exceptions within each thread, or you can use the `Future.result()` method, which re-raises any exception that occurred in the thread. If an exception occurs in a thread, it's stored in the `Future` object for that thread, and calling `Future.result()` will raise that exception in the main thread. This allows you to handle the exception in the main thread and decide how to proceed. ## Best practices for multithreaded Web3 requests When implementing multithreading, you can follow several best practices to ensure your application is efficient, robust, and easy to maintain. * **Choose an appropriate number of threads**. The optimal number of threads depends on a variety of factors, including the nature of the tasks (I/O-bound vs. CPU-bound), the specifications of the system (number of CPUs, memory), and the server's capacity to handle concurrent requests. Too many threads can lead to excessive context switching, memory usage, and potentially hitting rate limits on the Ethereum node. Too few threads might not fully utilize the available resources. Typically, for I/O-bound tasks like network requests, a higher number of threads can be beneficial. Start with a reasonable number and adjust based on performance observations and system characteristics. * **Handle exceptions properly**. Unhandled exceptions in a thread can cause the thread to terminate unexpectedly, which can lead to unpredictable behavior and resource leaks. Use try/except blocks to catch and handle exceptions in each thread. Also, consider using the `Future.result()` method, which re-raises any exception that occurred in the thread, allowing you to handle the exception in the main thread.
* **Manage thread lifecycles**. Be sure to clean up after your threads when they're done, especially for long-running applications. Using a context manager (the `with` keyword) with `ThreadPoolExecutor` automatically starts and stops the threads. * **Avoid shared state when possible**. Shared state can lead to race conditions and make your code harder to reason about. Whenever possible, design your threads to be independent of each other. * **Use appropriate synchronization primitives**. If you must use shared state, use locks, semaphores, or other synchronization primitives to ensure that threads don't interfere with each other. However, be aware that improper use of synchronization primitives can lead to deadlocks and other issues. * **Don't ignore the GIL**. Python's global interpreter lock can limit the performance benefits of multithreading for CPU-bound tasks. However, for I/O-bound tasks like making web requests, multithreading can still provide significant performance benefits. * **Respect the server's limits**. Be aware that there may be limits on the number of requests you can make per minute or day to a server. Making too many requests in a short period of time may lead to your requests being throttled or your IP being blocked. ## Conclusion Multithreading is a powerful technique that can help you make the most efficient use of your resources when making blockchain requests. By making multiple requests concurrently, you can significantly speed up your scripts and get more work done in the same amount of time. In this guide, we've explored how to set up your Python environment for multithreading, create a simple script using both sequential and multithreaded approaches, and handle errors and exceptions in multithreaded code. We've also discussed some best practices for writing multithreaded code, such as choosing an appropriate number of threads, managing thread lifecycles, and handling shared state properly.
However, it's important to remember that multithreading comes with its own set of challenges. Managing multiple threads can add complexity to your code, and you need to be careful to avoid issues like race conditions and resource leaks. Also, you need to be aware of the server's capacity and respect any rate limits that might be in place. ### About the author Developer Advocate @ Chainstack BUIDLs on EVM, The Graph protocol, and Starknet Helping people understand Web3 and blockchain development [](https://github.com/soos3d) [](https://twitter.com/web3Dav3) [](https://www.linkedin.com/in/davide-zambiasi/) # Mempool configurations Source: https://docs.chainstack.com/docs/mempool-configuration Mempool, also known as the memory pool, transaction pool or txpool, is a dynamic, in-memory space for transactions awaiting block inclusion. Rather than a single, unified pool, each node maintains its own transaction pool, collectively constituting the global mempool. This constantly changing dataset, containing thousands of pending transactions, represents millions of dollars in flux at any given moment. Mempool configurations vary wildly across protocols and node client configurations. The table here is a maintained reference of mempool configurations & specifics across all the protocols that Chainstack supports. The table structure: * Protocol — protocol name. * Protocol availability — details of the mempool availability on the protocol level. * Chainstack availability — how to access the protocol's mempool on Chainstack. * Client configuration — the default node client configuration for the mempool as deployed at Chainstack. * Example — a simple curl example. Remember to replace YOUR\_CHAINSTACK\_NODE with your Chainstack node endpoint for that particular protocol. Remember that whatever the default configuration, we can always [customize](/docs/features-availability-across-subscription-plans#node-customization) it for you on a [dedicated node](/docs/dedicated-node).
| Protocol | Protocol availability | Chainstack availability | Client configuration | Example | | --------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | -------------------------------------------------------------------------------------------------- | ----------------------------------------------------------- | ------------------------------------------------------ | | Ethereum | Available to everyone | [Archive](/docs/protocols-modes-and-types) with [Debug and trace APIs](/docs/debug-and-trace-apis) | 4096 pending transactions | [Ethereum txpool\_content](#ethereum-txpool_content) | | Polygon | Available to everyone | [Archive](/docs/protocols-modes-and-types) with [Debug and trace APIs](/docs/debug-and-trace-apis) | 4096 pending transactions | [Polygon txpool\_content](#polygon-txpool_content) | | BNB Smart Chain | Available to everyone | [Archive](/docs/protocols-modes-and-types) with [Debug and trace APIs](/docs/debug-and-trace-apis) | 4096 pending transactions | [BSC txpool\_content](#bsc-txpool_content) | | Base | [Private to Sequencer](https://github.com/base-org/node/issues/78). | N/A | N/A | N/A | | Avalanche | [Available only to validators](https://support.avax.network/en/articles/6158842-nodes-faq). | Not available on Chainstack | N/A | N/A | | TON | External messages are available in mempool | Available | | | | Arbitrum | No mempool. Only [Sequencer FIFO](https://docs.arbitrum.io/learn-more/faq#does-arbitrum-have-a-mempool). 
| N/A | N/A | N/A | | zkSync Era | Sequencer only | N/A | N/A | N/A | | Polygon zkEVM | Sequencer only | N/A | N/A | N/A | | Optimism | [Private to Sequencer](https://docs.optimism.io/stack/differences#mempool-rules) | N/A | N/A | N/A | | Oasis Sapphire | [Confidential mempool](https://oasisprotocol.org/blog/web3-mev-protection) | N/A | N/A | N/A | | NEAR | [No mempool](https://gov.near.org/t/transaction-fee-discussion/58) | N/A | N/A | N/A | | Aurora | [No mempool](https://gov.near.org/t/transaction-fee-discussion/58) | N/A | N/A | N/A | | Solana | [No mempool](https://docs.solana.com/integrations/retrying-transactions) | N/A | N/A | N/A | | Scroll | [Publicly available through Sequencer](https://docs.scroll.io/en/technology/chain/rollup/). | N/A | N/A | N/A | | Ronin | Available to everyone | Not available on Chainstack | N/A | N/A | | Aptos | [Available to everyone](https://aptos.dev/concepts/validator-nodes#mempool) but there is no [transaction stream](https://github.com/aptos-labs/aptos-core/issues/5200) | [Full or archive node deployment](/docs/protocols-modes-and-types) | N/A | N/A | | Gnosis Chain | Available to everyone | On a [dedicated node](/docs/dedicated-node). | Any configuration | [Gnosis txpool\_content](#gnosis-txpool_content) | | Cronos | No mempool. [Only Sequencer FIFO](https://docs.cronos.org/cronos-chain-protocol/cronos-general-faq#if-i-increase-the-gas-price-does-it-help-to-speed-up-my-transaction).
| N/A | N/A | N/A | | Filecoin | Available to everyone | [Full node deployment](/docs/protocols-modes-and-types) | 20,000 pending messages | [Filecoin MpoolPending](#filecoin-mpoolpending) | | Sonic | Available to everyone | On a [dedicated node](/docs/dedicated-node) | 4096 pending transactions | `txpool_*` | | Fantom | Available to everyone | On a [dedicated node](/docs/dedicated-node) | 4096 pending transactions | `txpool_*` | | TRON | Available to everyone | On a [dedicated node](/docs/dedicated-node) | 4096 pending transactions | `txpool_*` | | Starknet | Private to Sequencer | N/A | N/A | N/A | | Tezos | Available to everyone | [Full or archive node deployment](/docs/protocols-modes-and-types) | 240 blocks | [Tezos pending\_operations](#tezos-pending_operations) | | Bitcoin | Available to everyone | [Full node deployment](/docs/protocols-modes-and-types) | mempool size: 300 MB mempool tx expiry: 336 hours (14 days) | [Bitcoin getrawmempool](#bitcoin-getrawmempool) | ## Ethereum txpool\_content ```bash Shell curl -X POST \ -H "Content-Type: application/json" \ --data '{"jsonrpc":"2.0","method":"txpool_content","params":[],"id":1}' \ YOUR_CHAINSTACK_NODE ``` ## Polygon txpool\_content ```bash Shell curl -X POST \ -H "Content-Type: application/json" \ --data '{"jsonrpc":"2.0","method":"txpool_content","params":[],"id":1}' \ YOUR_CHAINSTACK_NODE ``` ## BSC txpool\_content ```bash Shell curl -X POST \ -H "Content-Type: application/json" \ --data '{"jsonrpc":"2.0","method":"txpool_content","params":[],"id":1}' \ YOUR_CHAINSTACK_NODE ``` ## Gnosis txpool\_content ```bash Shell curl -X POST \ -H "Content-Type: application/json" \ --data '{"jsonrpc":"2.0","method":"txpool_content","params":[],"id":1}' \ YOUR_CHAINSTACK_NODE ``` ## Filecoin MpoolPending ```bash Shell curl -X POST \ -H "Content-Type: application/json" \ --data '{ "jsonrpc": "2.0", "method": "Filecoin.MpoolPending", "params": [null], "id": 1 }' \ YOUR_CHAINSTACK_NODE ``` ## Fantom txpool\_content ```bash 
Shell curl -X POST \ -H "Content-Type: application/json" \ --data '{"jsonrpc":"2.0","method":"txpool_content","params":[],"id":1}' \ YOUR_CHAINSTACK_NODE ``` ## Tezos pending\_operations ```bash Shell curl -X GET "YOUR_CHAINSTACK_NODE/chains/main/mempool/pending_operations" ``` ## Bitcoin getrawmempool ```bash Shell curl -X POST \ -H "Content-Type: application/json" \ --data '{"jsonrpc": "2.0", "id":"curltest", "method": "getrawmempool", "params": [] }' \ YOUR_CHAINSTACK_NODE ``` Director of Developer Experience @ Chainstack Talk to me all things Web3 20 years in technology | 8+ years in Web3 full time Trusted advisor helping developers navigate the complexities of blockchain infrastructure [](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake) # Integrating Multi-Endpoint Shared Configuration (MESC) with Chainstack: Comprehensive guide Source: https://docs.chainstack.com/docs/mesc-and-chainstack **TLDR** * MESC is a standard for configuring RPC endpoints that makes configurations easy to share across tools, languages, and environments. * A single `mesc.json` file stores endpoints, network defaults, and profiles, and is referenced through the `MESC_PATH` environment variable. * The MESC CLI, installed with `cargo install mesc_cli`, walks you through creating and managing the configuration interactively. * Chainstack endpoints for any EVM-compatible network can be added by URL; the CLI queries the chain ID automatically. ## Main article In Web3 and blockchain, we get a new chain or a new RPC provider almost daily, and the ability to interact with multiple blockchain networks through RPC endpoints is crucial for developers, applications, and services. The Multi-Endpoint Shared Configuration (MESC) emerges as a pivotal standard in this ecosystem, addressing the pain points of configuring and managing RPC endpoints across diverse tools and platforms.
MESC introduces a standard for configuring RPC endpoints in DApps, emphasizing the ease of sharing configuration data and managing many endpoints. It focuses on making RPC configuration sharing across tools, languages, and environments straightforward, alongside simplifying the management of numerous RPC endpoints. In this guide, you'll learn how to use the MESC CLI with Chainstack endpoints. ## Key components of MESC MESC represents a significant leap toward simplifying the complexities developers face daily. Let's get into the core aspects that make MESC an indispensable tool for developers navigating the Web3 space. ### Typical usage scenario Consider a decentralized voting application, `VoteChain`, designed to operate across multiple blockchain networks. This application needs to interact with different RPC endpoints to submit votes or query vote counts. The traditional method requires hardcoding or manually specifying the RPC URL for each network, which is error-prone and cumbersome, especially when aiming to support numerous networks. ### Common interface One of MESC's standout features is its common interface, implemented across different programming languages and platforms. This uniformity allows developers to interact with the MESC configuration seamlessly, regardless of their development environment. Whether it's Python, Rust, or any other supported language, the interface remains consistent, ensuring developers can easily switch between tools and languages without relearning how to access and manage their RPC configurations. ### MESC specifications MESC employs a detailed schema to encapsulate all necessary information for RPC configurations, comprising three key-value data schemas: `RpcConfig`, `Endpoint`, and `Profile`. Each schema is meticulously designed to cover all aspects of RPC configuration, from global settings to specific endpoint details and user profiles.
* **RpcConfig schema**: The backbone of MESC, detailing the version, default endpoints, and default networks, mapping chain IDs to endpoint names, network names to chain IDs, and specific endpoint configurations. It also includes profiles for user-specific settings and global metadata for additional information. * **Endpoint schema**: Specifies individual RPC endpoints, including their names, URLs, associated chain IDs, and any metadata related to the endpoint. This schema ensures each endpoint is uniquely identifiable and adequately linked to its network. * **Profile schema**: Users can define custom configurations, including a default endpoint, network defaults, and profile-specific metadata. Enabling or disabling MESC for specific profiles allows users to manage their RPC connections. The specification enforces strict requirements to maintain integrity and consistency within the configuration. All keys within the `RpcConfig` and `Endpoint` schemas are mandatory, with no allowance for unspecified keys except within metadata sections. It mandates that endpoint names referenced in the configuration must correspond to actual endpoints defined within the schema, ensuring referential integrity. Find the full [specifications](https://github.com/paradigmxyz/mesc/blob/main/SPECIFICATION.md) in the MESC repository. #### MESC's role in simplifying configurations By leveraging MESC, `VoteChain` can dynamically select the appropriate RPC endpoint based on user input or application context, such as the specific blockchain network for casting votes. This is made possible through MESC's ability to interpret various forms of identifiers: 1. **Direct URL**: For direct access to a specific RPC endpoint. 2. **Network name**: Specify the blockchain network using familiar names. 3. **Chain ID**: Enabling selection based on the unique identifier of the blockchain. 4. **Custom profile**: Supporting configurations for various deployment scenarios or user preferences.
This functionality allows `VoteChain` to be highly adaptable and user-friendly, significantly reducing the barrier to entry for users unfamiliar with the specific details of blockchain networks. #### Hypothetical implementation To bring this scenario to life, imagine `VoteChain` incorporates a feature allowing users to select the blockchain network for their voting campaign directly from the application interface. Behind the scenes, `VoteChain` uses MESC to manage and resolve the appropriate RPC endpoints: **For user-friendly network selection**: Users can choose a network by name through the `VoteChain` interface. The application then queries the MESC configuration to retrieve the corresponding RPC URL, ensuring seamless interaction with the chosen network. #### Pseudo code example ```pseudo pseudo # Assume user_input is the network name or a custom endpoint identifier user_input = get_user_input() # MESC resolves the appropriate RPC endpoint if is_custom_endpoint(user_input): endpoint = mesc.get_endpoint_by_name(user_input) else: endpoint = mesc.get_endpoint_by_network_name(user_input) # Use the resolved endpoint URL in VoteChain rpc_url = endpoint.url use_rpc_endpoint(rpc_url) ``` ## Get started with the MESC CLI Now that we understand MESC well and what it's trying to solve, let's learn how to use the MESC CLI. ### Prerequisites Before we install the MESC CLI, ensuring your development environment is properly set up is essential. The MESC CLI is developed in Rust; having Rust and Cargo (the Rust package manager) installed on your system is a prerequisite. To install Rust and Cargo, execute the following command in your terminal: ```shell Shell curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh ``` This command downloads and runs the Rust installation script, setting up Rust and Cargo on your system. #### Get Chainstack endpoints To configure the MESC CLI you'll also need some endpoints. 
Follow these steps to get them from Chainstack; remember, you can use any EVM-compatible endpoint.

### Install the MESC CLI

With Rust and Cargo ready, installing the MESC CLI is straightforward. Use Cargo to install the MESC CLI with the following command:

```shell Shell
cargo install mesc_cli
```

This command fetches the latest version of the MESC CLI from [crates.io](https://crates.io/crates/mesc_cli) (the Rust package registry) and installs it on your system. For additional installation methods and further information, consult the [MESC User Guide](https://paradigmxyz.github.io/mesc/cli.html#installation).

### Understand `mesc.json` configuration

After installing the MESC CLI, the next step is configuring your MESC environment. This is where `mesc.json` comes into play. `mesc.json` is a JSON-formatted configuration file MESC utilizes to manage your RPC endpoints. It is the central repository for all your endpoint configurations, including the default endpoint, network-specific defaults, and any custom profiles you may need for different projects or environments.

The `mesc.json` file is structured to include vital details such as the version of MESC being used (`mesc_version`), a list of RPC endpoints (`endpoints`), and mappings for network defaults (`network_defaults`) and network names (`network_names`). This structured approach allows for a high degree of flexibility and control over how your applications interact with blockchain networks.

### Create `mesc.json`

There are a few ways to manage the `mesc.json` file. The first method we'll explore is creating one directly from the MESC CLI. Open a terminal on your system and run:

```shell Shell
mesc setup
```

This will start the MESC CLI and prompt you to set up your environment. Select `1) File (recommended)` and follow the instructions to add your endpoints and create `mesc.json`.
Here is how the process will look in the console:

```shell Shell
MESC is disabled because no MESC env vars are set
To enable MESC, set one of the MESC env vars
Do you want to store your MESC config in a file or in an env var? File (recommended)
Where to save MESC config file? ~/mesc.json
Need to set MESC_PATH in order to use this file
MESC_PATH can be automatically updated for 2 shell config files:
1. /Users/name/.bashrc
2. /Users/name/.profile
What do you want to do? Edit these files automatically (recommended)
Edited 2 config files
Config file does not exist, do you want to create one? Yes
Created blank config at /Users/name/mesc.json
Current config has 0 endpoints and 0 profiles
What do you want to do? Add new endpoint
New endpoint URL? https://ethereum-mainnet.core.chainstack.com/AUTH
Querying chain id... Using chain_id 1
New endpoint name? eth_chainstack
New endpoint added
What do you want to do? Add new endpoint
New endpoint URL? https://avalanche-mainnet.core.chainstack.com/ext/bc/C/rpc/AUTH
Querying chain id... Using chain_id 43114
New endpoint name? avax_chainstack
New endpoint added
What do you want to do? Exit and save changes
config written to /Users/name/mesc.json
Shell config files were modified. Restart shell to load these files.
```

The CLI leads you through the creation of the `mesc.json` file, which is saved by default in the user's home directory. Additionally, it automatically sets the `MESC_PATH` environment variable in both `.bashrc` and `.profile`. `.bashrc` and `.profile` are shell scripts for configuring user environment variables and settings.
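Once `MESC_PATH` is set, any MESC-aware tool can find your configuration without hard-coded paths. As a rough illustration (stdlib only, not the actual mesc library), the discovery step boils down to:

```python
import json
import os

def load_mesc_config():
    """Read the MESC config from the file pointed to by the MESC_PATH env var."""
    path = os.environ.get("MESC_PATH")
    if not path:
        raise RuntimeError("MESC is disabled: set MESC_PATH to your mesc.json")
    with open(os.path.expanduser(path)) as f:
        return json.load(f)

# After `mesc setup` and a shell restart:
# config = load_mesc_config()
# print(config["mesc_version"], "with", len(config["endpoints"]), "endpoints")
```

This is why restarting your shell matters: until `MESC_PATH` is exported in your session, tools have no way to locate the file.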
The `mesc.json` file created will look like this:

```json JSON
{
  "mesc_version": "0.2.0",
  "default_endpoint": null,
  "endpoints": {
    "avax_chainstack": {
      "name": "avax_chainstack",
      "url": "https://avalanche-mainnet.core.chainstack.com/ext/bc/C/rpc/AUTH",
      "chain_id": "43114",
      "endpoint_metadata": {}
    },
    "eth_chainstack": {
      "name": "eth_chainstack",
      "url": "https://ethereum-mainnet.core.chainstack.com/AUTH",
      "chain_id": "1",
      "endpoint_metadata": {}
    }
  },
  "network_defaults": {},
  "network_names": {},
  "profiles": {},
  "global_metadata": {}
}
```

### Edit `mesc.json`

We have a basic MESC configuration file; let's customize it a bit, adding default networks and default endpoints. In this example, I created the file with two Chainstack endpoints, Ethereum mainnet and Avalanche mainnet. Let's assume I want to default to Ethereum; we can configure the file as follows:

```json JSON
{
  "mesc_version": "0.2.0",
  "default_endpoint": "eth_chainstack",
  "endpoints": {
    "avax_chainstack": {
      "name": "avax_chainstack",
      "url": "https://avalanche-mainnet.core.chainstack.com/ext/bc/C/rpc/AUTH",
      "chain_id": "43114",
      "endpoint_metadata": {
        "host": "chainstack",
        "node_client": "avalanchego/v0.12.7",
        "namespaces": ["eth", "net", "web3"],
        "location": "Global"
      }
    },
    "eth_chainstack": {
      "name": "eth_chainstack",
      "url": "https://ethereum-mainnet.core.chainstack.com/AUTH",
      "chain_id": "1",
      "endpoint_metadata": {
        "host": "chainstack",
        "node_client": "Geth/v1.13.2-stable/linux-amd64/go1.21.1",
        "namespaces": ["eth", "net", "web3", "debug", "trace"],
        "location": "Global"
      }
    }
  },
  "network_defaults": {
    "1": "eth_chainstack",
    "43114": "avax_chainstack"
  },
  "network_names": {
    "eth_mainnet": "1",
    "avax_mainnet": "43114"
  },
  "profiles": {
    "my_special_tool": {
      "name": "my_special_tool",
      "default_endpoint": "eth_chainstack",
      "network_defaults": {
        "1": "eth_chainstack"
      },
      "profile_metadata": {},
      "use_mesc": true
    }
  },
  "global_metadata": {
    "conceal": true
  }
}
```

We added metadata to the endpoints, network names, and a profile
here. With extra metadata and configurations, we are ready to test it and learn how to use the MESC CLI to send requests without ever touching the endpoints again. Note that you can also add metadata directly from the setup interface.

Learn more about the MESC configuration and metadata options in the [MESC User Guide](https://paradigmxyz.github.io/mesc/using_mesc.html).

### MESC CLI: Overview of basic commands

Let's get into the essentials with the MESC CLI. Starting with `mesc --help`, you'll uncover a comprehensive list of commands and their structure alongside valuable help topics for an in-depth understanding of specific commands or steps.

If the `MESC_PATH` environment variable isn't correctly configured, you might encounter issues. Use `mesc help setup` for guidance on proper setup. If you've stored the `mesc.json` file in its default location, setting the environment variable is straightforward:

```shell Shell
export MESC_PATH=~/mesc.json
```

#### Explore available endpoints

Use the `mesc ls` command to display a detailed list of available endpoints. This command also serves as a quick check to ensure MESC is correctly set up, indicating any setup errors directly:

```shell Shell
mesc ls
```

Expect an output similar to the following, listing configured endpoints:

```shell Shell
endpoint        │ network │ url
────────────────┼─────────┼──────────
eth_chainstack  │ 1       │ ********
avax_chainstack │ 43114   │ ********
```

You can also use the `mesc endpoint` command to print the full info about endpoints.
```shell Shell
mesc endpoint
```

This command will print the data about the default endpoint:

```shell Shell
Endpoint: eth_chainstack
- url: https://ethereum-mainnet.core.chainstack.com/AUTH
- chain_id: 1
- metadata: {"location": String("Global"), "node_client": String("Geth/v1.13.2-stable/linux-amd64/go1.21.1"), "namespaces": Array [String("eth"), String("net"), String("web3")], "host": String("chainstack")}
```

Add the name of the endpoint to print data about a specific endpoint:

```shell Shell
mesc endpoint avax_chainstack
```

#### Verify endpoint connectivity

To test the connectivity and performance of the endpoints, use the `mesc ping` command. This command not only verifies the operational status of each endpoint but also provides metrics such as latency and the latest block number:

```shell Shell
mesc ping
```

Sample output showcasing the performance metrics:

```shell Shell
endpoint        │ network │ latency │ block
────────────────┼─────────┼─────────┼──────────
avax_chainstack │ 43114   │ 564.4   │ 41465592
eth_chainstack  │ 1       │ 574.9   │ 19192953

2 endpoints responded without error
Additional fields available: ip, client, location
```

To retrieve more specific data, such as the client information, append the desired field to the ping command:

```shell Shell
mesc ping client
```

In scenarios where endpoints fail to respond, consider specifying a manual timeout to differentiate between genuinely offline endpoints and those that merely exceed the default timeout period:

```shell Shell
mesc ping --timeout 5
```

## Send requests using MESC

At this point, MESC is fully set up, and we can start using it to send RPC requests. The most straightforward way to test and explore its features is by sending cURL requests. The power of MESC lies in its flexibility: as we discussed earlier, it lets us refer to endpoints intuitively by **endpoint name** or **network name**.
This is the structure of a basic cURL request:

```shell Shell
curl $(mesc url) --data RPC_REQUEST
```

`$(mesc url)` will use the default endpoint, in our case `eth_chainstack`. Let's fetch the chain ID, for instance:

```shell Shell
curl $(mesc url) --data '{"jsonrpc": "2.0", "method": "eth_chainId", "id": "mesc_test"}'
```

Which will return:

```shell Shell
{"jsonrpc":"2.0","id":"mesc_test","result":"0x1"}
```

We can confirm that it used the Ethereum endpoint as the chain ID returned is `1`. To use the Avalanche endpoint, simply add its name to the command:

```shell Shell
curl $(mesc url avax_chainstack) --data '{"jsonrpc": "2.0", "method": "eth_chainId", "id": "mesc_test"}'
```

And in this case, we'll get the Avalanche chain ID:

```shell Shell
{"jsonrpc":"2.0","id":"mesc_test","result":"0xa86a"}
```

You can use the [Chainstack EVM-Knife](https://web3tools.chainstacklabs.com/hexadecimal-decimal) to convert hexadecimal values to decimal. `0xa86a` is `43114` in decimal, which is the Avalanche chain ID.

Using the endpoint's name is only one way to use MESC; you can also pick the endpoint by a network default, giving the chain ID:

```shell Shell
curl $(mesc url 43114) --data '{"jsonrpc": "2.0", "method": "eth_chainId", "id": "mesc_test"}'
```

Or by naming a network directly:

```shell Shell
curl $(mesc url avax_mainnet) --data '{"jsonrpc": "2.0", "method": "eth_chainId", "id": "mesc_test"}'
```

You can also invoke the default endpoint associated with a profile:

```shell Shell
curl $(mesc url --profile my_special_tool) --data '{"jsonrpc": "2.0", "method": "eth_chainId", "id": "mesc_test"}'
```

Check out the [MESC User Guide](https://paradigmxyz.github.io/mesc/cli.html) for more information.

## Conclusion

In the rapidly evolving landscape of blockchain development, the Multi-Endpoint Shared Configuration (MESC) stands out as a critical innovation for simplifying RPC endpoint management.
By centralizing and standardizing RPC configurations, MESC significantly reduces the complexity and enhances the security of interacting with blockchain networks. Developers gain the ability to manage endpoints efficiently without directly handling sensitive endpoint details, thereby reducing exposure to security risks. The streamlined process fosters better organization, collaboration, and productivity by allowing quick switches between networks and configurations with minimal overhead.

# MEV protection

Source: https://docs.chainstack.com/docs/mev-protection

With **MEV protection** on, your mainnet transaction is redirected to a partner network that works directly with block builders. This approach bypasses the public mempool, helping protect against front-running attacks.

This currently applies to:

* Blink Labs as a partner
* Merkle as a partner

### How it works

1. You construct and sign your transaction as usual and send it through our endpoint. This is sent as `eth_sendRawTransaction`.
2. Instead of broadcasting to the global mempool, our proxy forwards it to the partner network for direct inclusion consideration by block builders.
3. A builder includes your transaction in a block. Once confirmed, it appears on-chain as normal.

If you have the transaction hash, you can retrieve it from the private pool with the regular [eth\_getTransactionByHash](/reference/ethereum_node_api/transaction_info/eth_getTransactionByHash), but only through the same Chainstack node endpoint you used to submit the transaction. Once the transaction is in a block, you can retrieve it with all the regular methods, of course.

### Benefits

* Reduced front-running risk — your transaction isn’t visible in the public mempool, minimizing the chance of adversary trades.
* Reliable delivery — direct builder routes ensure predictable inclusion.

### Opting out

If you prefer the traditional public mempool route, you can disable private routing. Just switch **MEV protection** to off on the node details page.
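From the client's perspective, nothing changes: you submit a standard `eth_sendRawTransaction` and later look the transaction up by hash through the same endpoint. A minimal sketch of the two JSON-RPC payloads (the endpoint URL, signed blob, and hash below are placeholders, not real values):

```python
import json

CHAINSTACK_ENDPOINT = "https://<your-node-endpoint>"  # placeholder
signed_tx = "0x02f86b..."  # placeholder: a signed raw transaction blob

# 1. Submit through your Chainstack endpoint; with MEV protection on,
#    the node forwards this to the partner network instead of the public mempool.
send_payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "eth_sendRawTransaction",
    "params": [signed_tx],
}

# 2. While the transaction sits in the private pool, query it by hash
#    through the SAME endpoint you used to submit it.
lookup_payload = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "eth_getTransactionByHash",
    "params": ["0x<tx-hash-returned-by-step-1>"],
}

# e.g. with requests: requests.post(CHAINSTACK_ENDPOINT, json=send_payload, timeout=30)
print(json.dumps(send_payload, indent=2))
```

No special flags or custom methods are involved; the routing decision happens entirely on the node side once the toggle is on.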
# Migrating from Helius getTokenAccounts to standard Solana RPC methods

Source: https://docs.chainstack.com/docs/migrating-from-helius-gettokenaccounts-to-standard-solana-rpc-methods

**TLDR**

* Helius’ `getTokenAccounts` can be replaced with standard Solana JSON-RPC calls like `getProgramAccounts` and `getTokenAccountBalance`.
* `getProgramAccounts` with filters and pagination helps gather all token accounts for a mint, though be mindful of rate limits and data size.
* For larger tokens (like USDC), consider chunking requests and storing results in a database for performance.
* Overall, the standard RPC approach is fully portable and avoids provider lock-in.

## Main article

For starters, this is not a poke at Helius. It's something that builders ask for, and we provide a simple tutorial.

## Overview

This tutorial shows how to migrate from Helius' custom `getTokenAccounts` method to standard Solana JSON-RPC methods when switching node providers. We'll demonstrate how to achieve the same functionality using `getProgramAccounts` with proper pagination and data filtering.

## Implementation

For illustration purposes, we'll provide you with a Helius script, a standard methods script, and a sample script to print token holders using standard methods.

We'll be using the [ATLASXmbPQxBUYbxPsV97usA3fPQYEqzQBUHgiFCUsXx](https://solscan.io/token/ATLASXmbPQxBUYbxPsV97usA3fPQYEqzQBUHgiFCUsXx) token in all examples, so feel free to replace it with any other token. Do note, though, that tokens with a higher holder count, like USDC, will most likely need a more optimized approach. Also be sure to check [Limits](/docs/limits), as `getProgramAccounts` is a heavy call, and in general this sort of operation on Solana infrastructurally is definitely not even in the cruiserweight class.

### Get your own node endpoint today

[Start for free](https://console.chainstack.com/) and get your app to production levels immediately. No credit card required.
You can sign up with your GitHub, X, Google, or Microsoft account.

### Helius

Here's a simple Helius example. Remember to replace `ATLASXmbPQxBUYbxPsV97usA3fPQYEqzQBUHgiFCUsXx` with your token if you want.

```python Python
import requests
import logging

logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(levelname)s - %(message)s',
    handlers=[logging.StreamHandler()]
)

url = "HELIUS_RPC"

def get_token_accounts(contract_address: str, batch_size: int = 1000):
    cursor = None
    while True:
        try:
            params = {
                "limit": batch_size,
                "mint": contract_address
            }
            if cursor:
                params["cursor"] = cursor

            response = requests.post(
                url,
                headers={"Content-Type": "application/json"},
                json={
                    "jsonrpc": "2.0",
                    "id": "helius-test",
                    "method": "getTokenAccounts",
                    "params": params
                },
                timeout=30
            )
            response.raise_for_status()
            data = response.json()

            if not data.get("result"):
                logging.error("Invalid response format - missing 'result' field")
                break

            token_accounts = data["result"].get("token_accounts", [])
            if not token_accounts:
                logging.info("No more token accounts to process")
                break

            for account in token_accounts:
                logging.info(f"Account owner: {account['owner']}, Amount: {account['amount']}")

            cursor = data["result"].get("cursor")
            if not cursor:
                logging.info("Reached end of pagination - all accounts processed")
                break

        except requests.exceptions.RequestException as e:
            logging.error(f"API request failed: {str(e)}")
            break

if __name__ == "__main__":
    atlas_contract_address = "ATLASXmbPQxBUYbxPsV97usA3fPQYEqzQBUHgiFCUsXx"
    get_token_accounts(atlas_contract_address)
```

### Standard JSON-RPC example

Here's a standard one. Remember to replace `ATLASXmbPQxBUYbxPsV97usA3fPQYEqzQBUHgiFCUsXx` with your token if you want.
```python Python import requests import logging import time from base64 import b64decode logging.basicConfig( level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s', handlers=[logging.StreamHandler()] ) url = "CHAINSTACK_RPC" TOKEN_PROGRAM_ID = "TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA" def get_token_accounts(contract_address: str, batch_size: int = 25): total_processed = 0 try: response = requests.post( url, headers={"Content-Type": "application/json"}, json={ "jsonrpc": "2.0", "id": 1, "method": "getProgramAccounts", "params": [ TOKEN_PROGRAM_ID, { "dataSlice": { "offset": 0, "length": 0 }, "filters": [ { "dataSize": 165 }, { "memcmp": { "offset": 0, "bytes": contract_address } } ] } ] }, timeout=60 ) response.raise_for_status() data = response.json() total_accounts = len(data.get("result", [])) logging.info(f"Total accounts to process: {total_accounts}") # Now fetch accounts in smaller batches with minimal data for offset in range(0, total_accounts, batch_size): logging.info(f"Fetching batch starting at offset {offset}") response = requests.post( url, headers={"Content-Type": "application/json"}, json={ "jsonrpc": "2.0", "id": 1, "method": "getProgramAccounts", "params": [ TOKEN_PROGRAM_ID, { "encoding": "base64", "filters": [ { "dataSize": 165 }, { "memcmp": { "offset": 0, "bytes": contract_address } } ], "dataSlice": { "offset": 32, "length": 32 }, "limit": batch_size } ] }, timeout=60 ) response.raise_for_status() data = response.json() if "result" not in data: logging.error("Invalid response format - missing 'result' field") break accounts = data["result"] if not accounts: break for account in accounts: try: raw_data = account["account"]["data"][0] owner_bytes = b64decode(raw_data) owner = ''.join([f'{b:02x}' for b in owner_bytes]) logging.info(f"Account owner: {owner}") total_processed += 1 except (KeyError, IndexError, TypeError) as e: logging.error(f"Error processing account data: {e}") continue logging.info(f"Processed 
{total_processed}/{total_accounts} accounts") if offset + batch_size < total_accounts: time.sleep(0.5) except requests.exceptions.RequestException as e: logging.error(f"API request failed: {str(e)}") if __name__ == "__main__": atlas_token = "ATLASXmbPQxBUYbxPsV97usA3fPQYEqzQBUHgiFCUsXx" get_token_accounts(atlas_token, batch_size=25) ``` ### Standard JSON-RPC example with DB connection Here's a *sample* of how you'd be using the standard [getProgramAccounts](/reference/solana_node_api/getProgramAccounts) method to monitor the holders of a token and then dumping the data to a PostgreSQL DB and keeping key-value store of accounts holding the token and the number of tokens up to date. ```python Python import requests from decimal import getcontext, Decimal, InvalidOperation import time from datetime import datetime from typing import Dict, List, Tuple, Any, Optional from base64 import b64decode import psycopg2 # PostgreSQL connection configuration db_params = { 'host': '', 'database': '', 'user': '', 'password': '', 'port': '5432' } url = "CHAINSTACK_RPC" getcontext().prec = 18 TOKEN_PROGRAM_ID = "TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA" def get_token_accounts(contract_address: str, batch_size: int = 25) -> Dict[str, int]: """ Gets token holders using standard Solana RPC methods and returns holder data """ holder_data = {} total_processed = 0 try: # Initial request to get total count response = requests.post( url, headers={"Content-Type": "application/json"}, json={ "jsonrpc": "2.0", "id": 1, "method": "getProgramAccounts", "params": [ TOKEN_PROGRAM_ID, { "dataSlice": { "offset": 0, "length": 0 }, "filters": [ { "dataSize": 165 }, { "memcmp": { "offset": 0, "bytes": contract_address } } ] } ] }, timeout=60 ) response.raise_for_status() data = response.json() total_accounts = len(data.get("result", [])) # Process accounts in batches for offset in range(0, total_accounts, batch_size): response = requests.post( url, headers={"Content-Type": "application/json"}, json={ 
"jsonrpc": "2.0", "id": 1, "method": "getProgramAccounts", "params": [ TOKEN_PROGRAM_ID, { "encoding": "base64", "filters": [ { "dataSize": 165 }, { "memcmp": { "offset": 0, "bytes": contract_address } } ], "dataSlice": { "offset": 32, "length": 32 }, "limit": batch_size } ] }, timeout=60 ) response.raise_for_status() data = response.json() if "result" not in data: continue accounts = data["result"] if not accounts: break for account in accounts: try: raw_data = account["account"]["data"][0] owner_bytes = b64decode(raw_data) owner = ''.join([f'{b:02x}' for b in owner_bytes]) # Get token balance balance_response = requests.post( url, headers={"Content-Type": "application/json"}, json={ "jsonrpc": "2.0", "id": 1, "method": "getTokenAccountBalance", "params": [owner] }, timeout=30 ) balance_data = balance_response.json() if "result" in balance_data and "value" in balance_data["result"]: amount = int(balance_data["result"]["value"]["amount"]) if amount > 0: # Only count non-zero balances holder_data[owner] = amount total_processed += 1 except (KeyError, IndexError, TypeError) as e: continue if offset + batch_size < total_accounts: time.sleep(0.5) except requests.exceptions.RequestException: pass return holder_data def update_holder_data(contract_address: str, holder_data: Dict[str, int]): """ Updates the database with new holder data, tracking changes """ conn = None try: conn = psycopg2.connect(**db_params) cursor = conn.cursor() # Get existing holder data cursor.execute(""" SELECT holder_address, amount FROM token_holders WHERE contract_address = %s """, (contract_address,)) existing_holders = {row[0]: row[1] for row in cursor.fetchall()} # Calculate changes current_time = int(time.time()) date_time = datetime.fromtimestamp(current_time) # Insert new/updated holders for holder, amount in holder_data.items(): if holder not in existing_holders or existing_holders[holder] != amount: cursor.execute(""" INSERT INTO token_holders (contract_address, holder_address, amount, 
timestamp, date_time) VALUES (%s, %s, %s, %s, %s)
                    ON CONFLICT (contract_address, holder_address)
                    DO UPDATE SET amount = EXCLUDED.amount,
                                  timestamp = EXCLUDED.timestamp,
                                  date_time = EXCLUDED.date_time
                """, (contract_address, holder, amount, current_time, date_time))

        # Record total holder count
        total_holders = len([amount for amount in holder_data.values() if amount > 0])
        cursor.execute("""
            INSERT INTO holder_counts (contract_address, timestamp, date_time, count)
            VALUES (%s, %s, %s, %s)
        """, (contract_address, current_time, date_time, total_holders))

        conn.commit()

    except (psycopg2.Error, Exception) as e:
        if conn:
            conn.rollback()
        raise
    finally:
        if conn:
            conn.close()

def main():
    try:
        conn = psycopg2.connect(**db_params)
        cursor = conn.cursor()

        # Get contracts to process
        cursor.execute("""
            SELECT contract_address FROM contracts WHERE active = true
        """)
        contracts = [row[0] for row in cursor.fetchall()]

        for contract in contracts:
            holder_data = get_token_accounts(contract)
            update_holder_data(contract, holder_data)
            time.sleep(1)  # Rate limiting between contracts

    except Exception as e:
        print(f"Error in main process: {str(e)}")
    finally:
        if 'conn' in locals() and conn:
            conn.close()

if __name__ == "__main__":
    # Example token addresses
    tokens = [
        "ATLASXmbPQxBUYbxPsV97usA3fPQYEqzQBUHgiFCUsXx",  # ATLAS token
        # Add more token addresses here
    ]

    for token in tokens:
        try:
            holder_data = get_token_accounts(token)
            update_holder_data(token, holder_data)
            time.sleep(1)  # Rate limiting between tokens
        except Exception as e:
            print(f"Error processing token {token}: {str(e)}")
```

Director of Developer Experience @ Chainstack. Talk to me all things Web3. 20 years in technology | 8+ years in Web3 full time. Trusted advisor helping developers navigate the complexities of blockchain infrastructure.

[](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake)

# Monitoring transaction propagation from node to mempool in EVM Networks with Python

Source: https://docs.chainstack.com/docs/monitoring-transaction-propagation-from-node-to-mempool-in-evm-networks-with-python

* Demonstrates how to monitor new Ethereum transactions in the mempool using a Python script (web3.py + concurrency).
* Explains why transaction propagation matters for network performance and how to approximate how quickly transactions spread across different nodes.
* Shows how to tweak your endpoint usage (same vs. different) to measure local or network-wide propagation times more accurately.
* Concludes that regular mempool propagation testing helps reveal bottlenecks and fortifies blockchain reliability.

## Main article

With the adoption of Ethereum and other blockchains that use the same kind of technology (we call these EVM blockchains), it's important to make sure that the network is stable, safe, and efficient. One big thing developers and the people who manage the network need to keep an eye on is how transactions move around in the mempool.

In this article, we're going to show you how to test how transactions propagate through the mempool for EVM nodes using Chainstack. We'll talk about why these tests are so important, the tools and tricks you'll need, and some common problems and slowdowns that can happen while transactions are moving around.

## Why test transaction propagation?

The mempool is a temporary storage space where unconfirmed transactions wait to be included in a block. In a decentralized system, such as Ethereum, multiple nodes maintain separate mempools, but they constantly communicate with each other to sync up on the latest state of pending transactions.

Propagation in the mempool predominantly affects two major factors: latency and network efficiency. Poor propagation can lead to increased transaction confirmation times and the risk of chain reorganizations or forks. Thus, testing propagation ensures the network performs optimally and maintains consensus.
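As a back-of-the-envelope illustration of why propagation matters, here is a toy gossip model (not a real network client; the topology and names are made up) where each node shares its mempool with its peers once per round. The number of rounds a transaction needs to reach every mempool stands in for propagation latency:

```python
# Toy model: each node holds a mempool (a set of tx hashes) and gossips
# its contents to its direct peers once per round. Rounds-to-full-coverage
# is a crude stand-in for propagation latency on a real network.
def gossip_rounds(peers, start_node, tx):
    mempools = {node: set() for node in peers}
    mempools[start_node].add(tx)
    rounds = 0
    while not all(tx in pool for pool in mempools.values()):
        rounds += 1
        # Every node shares a snapshot of its current mempool with its peers
        snapshot = {node: set(pool) for node, pool in mempools.items()}
        for node, neighbours in peers.items():
            for n in neighbours:
                mempools[n] |= snapshot[node]
    return rounds

# A line topology A - B - C: a tx injected at A needs 2 rounds to reach C
line_topology = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
print(gossip_rounds(line_topology, "A", "0xdeadbeef"))  # 2
```

The fewer hops between well-connected nodes, the faster the mempools converge, which is exactly what the measurements below try to approximate against real endpoints.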
Learn more about the mempool by reading [A developer’s guide to transactions in Ethereum mempool](https://chainstack.com/a-developers-guide-to-the-transactions-in-mempool-metamask-edition/).

## Roll your own propagation test

If you want to see how transactions and blocks are moving in a blockchain network, tools like [web3.js](/docs/web3-development-frameworks-and-libraries-glossary#web3js) or [web3.py](/docs/web3-development-frameworks-and-libraries-glossary#web3py) are your friends. They let you write your own scripts to understand better how things like the network setup, network conditions, and the number of transactions can make a difference. You can run these tests on your private networks or public ones like Goerli and Sepolia to mimic what happens in the real world.

Pair this with network monitoring tools, and you'll get a clear picture of your blockchain's performance and where things might be slowing down. This can help you adjust your setup and keep things moving smoothly. We'll be using the [web3.py](https://github.com/ethereum/web3.py) library to interact with the Ethereum network.

### Prerequisites

* Python 3.6 or higher
* A [Sepolia node](/reference/ethereum-getting-started#how-to-start-using-the-ethereum-api) with Chainstack
* [web3.py](https://github.com/ethereum/web3.py) library. Install it with:

```shell Shell
pip install web3
```

### Getting started

Let's start by importing the libraries we need and connecting to an Ethereum node using `Web3 HTTPProvider`.

```python Python
from web3 import Web3
import time
from web3.exceptions import TransactionNotFound
from concurrent.futures import ThreadPoolExecutor, as_completed
import threading

# Connect to the Ethereum node
w3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT'))

# Target Ethereum address
address = 'YOUR_ETHEREUM_ADDRESS'
```

Add your Sepolia node URL instead of `YOUR_CHAINSTACK_ENDPOINT`.
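Before wiring up the full monitor, the core timing idea is worth isolating: poll a snapshot of the pending pool and record how long a given hash takes to appear. The sketch below uses a fake feed of mempool snapshots so it runs anywhere; against a real node, you would swap the lambda for something like `lambda: set(w3.eth.get_block('pending')['transactions'])`:

```python
import time

def wait_until_seen(fetch_pending, tx_hash, poll_interval=0.0, timeout=10.0):
    """Poll a pending-pool snapshot function until tx_hash shows up.

    Returns the elapsed seconds, or None if the hash never appeared in time.
    """
    start = time.monotonic()
    while time.monotonic() - start < timeout:
        if tx_hash in fetch_pending():
            return time.monotonic() - start
        time.sleep(poll_interval)
    return None

# Fake feed: the hash appears only in the third snapshot
snapshots = iter([set(), set(), {"0xabc"}, {"0xabc"}])
elapsed = wait_until_seen(lambda: next(snapshots), "0xabc")
print(elapsed is not None)  # True; against a real node this is the propagation delay
```

Submitting a transaction through one endpoint and timing it with `wait_until_seen` against a different endpoint approximates network-wide propagation; using the same endpoint for both measures only the local node.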
### Making responses look nice

Next, we'll create a function to make the transaction details we get from the Ethereum network look pretty. This function will also wait for the transaction receipt to update the block hash and block number when the transaction gets validated.

```python Python
def pretty_print_transaction(tx):

    def wait_for_confirmation(tx_hash, block_info):
        while True:
            try:
                receipt = w3.eth.get_transaction_receipt(tx_hash)
                if receipt is not None and receipt['blockHash']:
                    block_info['blockHash'] = receipt['blockHash']
                    block_info['blockNumber'] = receipt['blockNumber']
                    break
            except TransactionNotFound:
                pass
            time.sleep(1)

    block_info = {'blockHash': None, 'blockNumber': None}

    # Start a separate thread to wait for the block
    confirmation_thread = threading.Thread(target=wait_for_confirmation,
                                           args=(tx['hash'], block_info))
    confirmation_thread.start()

    print("Transaction details:")
    print(f"  Block hash: {block_info['blockHash']}")
    print(f"  Block number: {block_info['blockNumber']}")
    print(f"  From: {tx.get('from')}")
    print(f"  Gas: {tx.get('gas')}")
    print(f"  Gas price: {tx.get('gasPrice')}")
    if 'maxFeePerGas' in tx:
        print(f"  Max fee per gas: {tx['maxFeePerGas']}")
    if 'maxPriorityFeePerGas' in tx:
        print(f"  Max priority fee per gas: {tx['maxPriorityFeePerGas']}")
    print(f"  Transaction hash: {tx.get('hash').hex()}")
    print(f"  Input: {tx.get('input')}")
    print(f"  Nonce: {tx.get('nonce')}")
    print(f"  To: {tx.get('to')}")
    print(f"  Transaction index: {tx.get('transactionIndex')}")
    print(f"  Value: {tx.get('value')}")
    print(f"  Type: {tx.get('type')}")
    print(f"  Access list: {tx.get('accessList')}")
    print(f"  Chain ID: {tx.get('chainId')}")
    print(f"  v: {tx.get('v')}")
    print(f"  r: {tx.get('r').hex() if tx.get('r') else None}")
    print(f"  s: {tx.get('s').hex() if tx.get('s') else None}")

    # Wait for the confirmation thread to complete
    confirmation_thread.join()
    print(f"Updated block hash: {block_info['blockHash'].hex()}")
    print(f"Updated block number: {block_info['blockNumber']}")
```

### Monitoring the mempool

Now, let's whip up two functions to check if there are any transactions from or to the address you're watching in the mempool:

```python Python
def check_pending_transaction(tx_hash, target_address_lower, w3):
    try:
        tx = w3.eth.get_transaction(tx_hash)
        if tx['from'].lower() == target_address_lower or (
                tx['to'] and tx['to'].lower() == target_address_lower):
            return tx
    except TransactionNotFound:
        pass
    return None

# Function to check if any transactions from/to the target address are in the mempool
def find_mempool_transactions(target_address):
    local_w3 = w3
    transaction_list = []
    target_address_lower = target_address.lower()
    current_block = local_w3.eth.block_number
    pending_transactions = local_w3.eth.get_block('pending')['transactions']
    with ThreadPoolExecutor() as executor:
        futures = [
            executor.submit(check_pending_transaction, tx_hash,
                            target_address_lower, local_w3)
            for tx_hash in pending_transactions
        ]
        for future in as_completed(futures):
            result = future.result()
            if result is not None:
                transaction_list.append(result)
    return transaction_list
```

You'll notice we're using `ThreadPoolExecutor` to check pending transactions simultaneously, making everything quicker.

### Keeping an eye on transactions

With those helper functions ready, we can put together the main function to keep an eye on the mempool for new transactions involving the address you're interested in:

```python Python
# Main function to monitor the mempool
def monitor_mempool(address):
    seen_transactions = set()
    print("Mempool monitoring starting...")
    while True:
        current_block = w3.eth.block_number
        pending_block = w3.eth.get_block('pending')
        pending_transactions = pending_block['transactions']
        print(
            f"Current block: {current_block}. 
Pending transactions: {len(pending_transactions)}" ) # Record the start time start_time = time.time() transactions = find_mempool_transactions(address) if transactions: new_transactions = [ tx for tx in transactions if tx.get('hash') not in seen_transactions ] if new_transactions: # Calculate the time taken time_taken = time.time() - start_time print(f"\nTime taken since last check: {time_taken:.2f} seconds\n") print( f"Found {len(new_transactions)} new transaction in the mempool involving {address}:" ) for tx in new_transactions: pretty_print_transaction(tx) seen_transactions.add(tx.get('hash')) break monitor_mempool(address) ``` When a new transaction involving the address you're watching pops up, we'll make it look pretty with `pretty_print_transaction()`. After that, the main function will stop, but you can tweak the loop if you want it to keep watching for new transactions. ### The full code Here you can find the entire code for the monitoring tool: ```python Python from web3 import Web3 import time from web3.exceptions import TransactionNotFound from concurrent.futures import ThreadPoolExecutor, as_completed import threading # Connect to the Ethereum node w3 = Web3( Web3.HTTPProvider( 'YOUR_CHAINSTACK_ENDPOINT')) # Ethereum address to monitor address = 'YOUR_ETHEREUM_ADDRESS' def pretty_print_transaction(tx): def wait_for_confirmation(tx_hash, block_info): while True: try: receipt = w3.eth.get_transaction_receipt(tx_hash) if receipt is not None and receipt['blockHash']: block_info['blockHash'] = receipt['blockHash'] block_info['blockNumber'] = receipt['blockNumber'] break except TransactionNotFound: pass time.sleep(1) block_info = {'blockHash': None, 'blockNumber': None} # Start a separate thread to wait for the block confirmation_thread = threading.Thread(target=wait_for_confirmation, args=(tx['hash'], block_info)) confirmation_thread.start() print("Transaction details:") print(f" Block hash: {block_info['blockHash']}") print(f" Block number: 
{block_info['blockNumber']}") print(f" From: {tx.get('from')}") print(f" Gas: {tx.get('gas')}") print(f" Gas price: {tx.get('gasPrice')}") if 'maxFeePerGas' in tx: print(f" Max fee per gas: {tx['maxFeePerGas']}") if 'maxPriorityFeePerGas' in tx: print(f" Max priority fee per gas: {tx['maxPriorityFeePerGas']}") print(f" Transaction hash: {tx.get('hash').hex()}") print(f" Input: {tx.get('input')}") print(f" Nonce: {tx.get('nonce')}") print(f" To: {tx.get('to')}") print(f" Transaction index: {tx.get('transactionIndex')}") print(f" Value: {tx.get('value')}") print(f" Type: {tx.get('type')}") print(f" Access list: {tx.get('accessList')}") print(f" Chain ID: {tx.get('chainId')}") print(f" v: {tx.get('v')}") print(f" r: {tx.get('r').hex() if tx.get('r') else None}") print(f" s: {tx.get('s').hex() if tx.get('s') else None}") # Wait for the confirmation thread to complete confirmation_thread.join() print(f"Updated block hash: {block_info['blockHash'].hex()}") print(f"Updated block number: {block_info['blockNumber']}") def check_pending_transaction(tx_hash, target_address_lower, w3): try: tx = w3.eth.get_transaction(tx_hash) if tx['from'].lower() == target_address_lower or ( tx['to'] and tx['to'].lower() == target_address_lower): return tx except TransactionNotFound: pass return None # Function to check if any transactions from/to the target address are in the mempool def find_mempool_transactions(target_address): local_w3 = w3 transaction_list = [] target_address_lower = target_address.lower() current_block = local_w3.eth.block_number pending_transactions = local_w3.eth.get_block('pending')['transactions'] with ThreadPoolExecutor() as executor: futures = [ executor.submit(check_pending_transaction, tx_hash, target_address_lower, local_w3) for tx_hash in pending_transactions ] for future in as_completed(futures): result = future.result() if result is not None: transaction_list.append(result) return transaction_list # Main function to monitor the mempool def 
monitor_mempool(address): seen_transactions = set() # Add the print statement here print("Mempool monitoring starting...") while True: current_block = w3.eth.block_number pending_block = w3.eth.get_block('pending') pending_transactions = pending_block['transactions'] print( f"Current block: {current_block}. Pending transactions: {len(pending_transactions)}" ) # Record the start time start_time = time.time() transactions = find_mempool_transactions(address) if transactions: new_transactions = [ tx for tx in transactions if tx.get('hash') not in seen_transactions ] if new_transactions: # Calculate the time taken time_taken = time.time() - start_time print(f"\nTime taken since last check: {time_taken:.2f} seconds\n") print( f"Found {len(new_transactions)} new transaction in the mempool involving {address}:" ) for tx in new_transactions: pretty_print_transaction(tx) seen_transactions.add(tx.get('hash')) break monitor_mempool(address) ``` ## Running the test To conduct this test, you'll first need to execute the Python script. Then, initiate a transaction. For simplicity, we'll use MetaMask in this scenario. Your objective determines whether you use the same or a different endpoint in comparison to your script. If you wish to measure the speed at which a transaction reaches the mempool of your own node, stick with the same endpoint as used in the script. However, if your goal is to determine the time taken for the transaction to propagate across other nodes in the Ethereum network, opt for a different endpoint. This will provide a more accurate depiction of transaction propagation times across the network. Learn how to add your Chainstack endpoint to MetaMask by reading [Fault-tolerant transactions with MetaMask and Chainstack nodes](https://chainstack.com/reliable-transactions-with-chainstack/). In the Python script, input your endpoint and the Ethereum address that you wish to monitor. 
This could be either the sending or receiving address for the transaction, as the script is designed to track the transaction in both cases.

Once the script detects a new transaction in the mempool involving the target address, it will display the transaction's details in the console. Additionally, it will provide an estimate of how long the script took to locate the transaction. While this figure might not be entirely accurate, it serves as a useful approximation of the transaction's propagation speed across the Ethereum network.

1. Start the script.
2. Send a transaction using MetaMask.

You will receive a similar log in the console:

```shell Shell
Mempool monitoring starting...
Current block: 3772987. Pending transactions: 84
Current block: 3772987. Pending transactions: 102

Time taken since last check: 2.22 seconds

Found 1 new transaction in the mempool involving 0x8f8e7012F8F974707A8F11C7cfFC5d45EfF5c2Ae:
Transaction details:
  Block hash: None
  Block number: None
  From: 0x8f8e7012F8F974707A8F11C7cfFC5d45EfF5c2Ae
  Gas: 21000
  Gas price: 2181505086
  Max fee per gas: 2181505086
  Max priority fee per gas: 1500000000
  Transaction hash: 0xebbeaa0ee6e787fa3486db9e1b8ad9ccb1e3ab982462c51fca8fa41143be053d
  Input: 0x
  Nonce: 59
  To: 0x7ea178aE883bC78Fa540b15F36b1e2a8Ea90F7F7
  Transaction index: None
  Value: 1000000000000000000
  Type: 2
  Access list: []
  Chain ID: 11155111
  v: 0
  r: 0x1483859043ee02820eead543ce58bf9f5a6ec3cd3b339dc709e1860781aa1e57
  s: 0x045fb5f1bb7caf42cbeb2d480fbb1a3ed1a85408154bcb052fbb17417eab5e84
Updated block hash: 0x2b120a75e3a97ba9b77d3764945c4c3c2a328699c13327538fb6dacc4642ff57
Updated block number: 3772988
```

## Conclusion

Testing transaction propagation in the mempool for EVM nodes is crucial to maintaining a reliable and efficient blockchain network. By simulating realistic scenarios and using a combination of network monitoring and custom tests, developers and infrastructure operators can ensure that nodes process transactions in a timely and secure manner.
Regularly conducting propagation tests will help identify potential issues, optimize resources, and contribute to the overall health of your blockchain ecosystem.

# Moonbeam methods

Source: https://docs.chainstack.com/docs/moonbeam-methods

| Method | Availability | Comment |
| ---------------------------------------- | ------------ | ------- |
| eth\_accounts | | |
| eth\_blockNumber | | |
| eth\_call | | |
| eth\_chainId | | |
| eth\_estimateGas | | |
| eth\_feeHistory | | |
| eth\_gasPrice | | |
| eth\_getAccount | | |
| eth\_getBalance | | |
| eth\_getBlockByHash | | |
| eth\_getBlockByNumber | | |
| eth\_getBlockReceipts | | |
| eth\_getBlockTransactionCountByHash | | |
| eth\_getBlockTransactionCountByNumber | | |
| eth\_getCode | | |
| eth\_getFilterChanges | | |
| eth\_getFilterLogs | | |
| eth\_getLogs | | |
| eth\_getProof | | |
| eth\_getStorageAt | | |
| eth\_getTransactionByBlockHashAndIndex | | |
| eth\_getTransactionByBlockNumberAndIndex | | |
| eth\_getTransactionByHash | | |
| eth\_getTransactionCount | | |
| eth\_getTransactionReceipt | | |
| eth\_getUncleCountByBlockHash | | |
| eth\_getUncleCountByBlockNumber | | |
| eth\_maxPriorityFeePerGas | | |
| eth\_newBlockFilter | | |
| eth\_newFilter | | |
| eth\_newPendingTransactionFilter | | |
| eth\_signTransaction | | |
| eth\_subscribe | | |
| eth\_syncing | | |
| eth\_uninstallFilter | | |
| eth\_unsubscribe | | |
| eth\_sendRawTransaction | | |
| net\_listening | | |
| net\_peerCount | | |
| net\_version | | |
| web3\_clientVersion | | |
| web3\_sha3 | | |
| debug\_getBadBlocks | | |
| debug\_storageRangeAt | | |
| debug\_traceBlock | | |
| debug\_traceBlockByHash | | |
| debug\_traceBlockByNumber | | |
| debug\_traceCall | | |
| debug\_traceTransaction | | |
| admin\_addPeer | | |
| admin\_addTrustedPeer | | |
| admin\_datadir | | |
| admin\_exportChain | | |
| admin\_importChain | | |
| admin\_nodeInfo | | |
| admin\_peerEvents | | |
| admin\_peers | | |
| admin\_removePeer | | |
| admin\_removeTrustedPeer | | |
| admin\_startHTTP | | |
| admin\_startWS | | |
| admin\_stopHTTP | | |
| admin\_stopWS | | |

# Moonbeam: Monitoring the Conviction Voting contract

Source: https://docs.chainstack.com/docs/moonbeam-monitoring-the-conviction-voting-contract

* Moonbeam is an EVM-compatible parachain on Polkadot, built with Substrate and featuring unique Solidity precompiles.
* We focus on monitoring its conviction voting precompile, which implements a quadratic-like voting mechanism with a time-based conviction factor.
* The tutorial demonstrates how to detect, decode, and extract data from conviction voting transactions in Python.
* By invoking the contract’s functions (e.g., `votingFor`, `classLocksFor`), developers can retrieve voting states and locked GLMR details in real time.

## Main article

Moonbeam is an EVM-compatible network; however, it's not a run-of-the-mill one. Instead, it's pretty interesting.

Moonbeam is one of the first parachains launched on Polkadot. See the Polkadot blog post from 2021: [Parachains are Live!](https://polkadot.network/blog/parachains-are-live-polkadot-launch-is-now-complete)

Moonbeam (the nodes specifically) is built using the Substrate framework and the Polkadot SDK. Being EVM-compatible, it's a path for Ethereum developers to get familiar with the Polkadot ecosystem.

Moonbeam has a few Solidity precompiles to ease the developer life and to cross-operate with Polkadot. Check them out in the Moonbeam docs: [Overview of the Precompiled Contracts on Moonbeam](https://docs.moonbeam.network/builders/pallets-precompiles/precompiles/overview/).

The precompiled contract that we'll have a look at as part of this tutorial is the conviction voting one. See: [Conviction Voting Precompile Contract](https://docs.moonbeam.network/builders/pallets-precompiles/precompiles/conviction-voting/). The contract is pretty active in receiving transactions, so it's perfect for our demo.
What we are going to do is monitor it for incoming transactions, extract the transaction data, and call the contract to retrieve the voting state based on the transaction data. We are going to do this in Python and with the Chainstack Moonbeam nodes.

First, a few words on conviction voting and the Moonbeam contract.

Conviction voting is a relatively novel iteration of a voting algorithm that's been out for a while called [quadratic voting](https://en.wikipedia.org/wiki/Quadratic_voting). The short version of quadratic voting & conviction voting:

* Quadratic voting — not everyone is equal in voting; those with the most real currency or artificial currency (token) have the most power in voting.
* Conviction voting — expands on quadratic voting and adds a decay of the support of a decision to be made through voting. Conviction voting calculates the conviction of each token holder for a proposal based on the tokens they stake, how long they support it, and how soon the holder switches to voting for another proposal, which starts the conviction decay.

You can check the following resources for more details:

* Moonbeam docs: [Governance](https://docs.moonbeam.network/learn/features/governance/)
* [Moonbeam governance dapp](https://apps.moonbeam.network/moonbeam/open-gov)

Let's implement a simple monitoring of the conviction voting contract in Python.

## Prerequisites

* [Chainstack account](https://console.chainstack.com/) to deploy a Moonbeam Mainnet node
* [web3.py](https://web3py.readthedocs.io/)

## Step-by-step

### Get a Moonbeam node

Log in to your [Chainstack account](https://console.chainstack.com/) and deploy a node.

### Get the ABI

To interact with the contract, you need the contract ABI. Since it's a precompile, the ABI is not readily available on the [Moonbeam explorer](https://moonscan.io/address/0x0000000000000000000000000000000000000812). Let's generate one:

Go to [Remix IDE](https://remix.ethereum.org/).
Create the `ConvictionVoting.sol` file and paste the actual Solidity code from the Moonbeam GitHub repository: [ConvictionVoting.sol](https://github.com/moonbeam-foundation/moonbeam/blob/master/precompiles/conviction-voting/ConvictionVoting.sol).

In Remix IDE, click **Solidity Compiler** > **Compile**. When compiled, hit **ABI**. This will copy the ABI of the contract.

Save the ABI as a `ConvictionVoting.abi` file in the same directory where you are going to have the Python script.

### Create the script

Now that you have the ABI, create the script. A few details on the implementation:

* The precompile address of the conviction voting contract is `0x0000000000000000000000000000000000000812`.
* You can provide the block number from which to start iterating through the blocks to watch for the voting transactions.
* Once the script extracts a voting transaction, it uses the transaction data to call the two publicly readable functions in the contract, `votingFor` and `classLocksFor`, to retrieve the voting track ID, the index of the poll, and the locked amount of GLMR tokens. You can check what these are in the [Moonbeam voting dapp](https://apps.moonbeam.network/moonbeam/open-gov).
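To build intuition for the conviction levels you will see in the decoded votes, here's a rough back-of-the-envelope sketch of how a conviction lock scales voting power. The multiplier table below is an assumption taken from the Polkadot OpenGov convention (0.1x with no lock, then 1x–6x for increasing lock periods), not from the Moonbeam docs, so verify it against the governance documentation before relying on it:

```python
# Illustrative only: conviction multipliers assumed from the Polkadot
# OpenGov convention (None = 0.1x, Locked1x..Locked6x = 1x..6x).
CONVICTION_MULTIPLIERS = {0: 0.1, 1: 1, 2: 2, 3: 3, 4: 4, 5: 5, 6: 6}

def effective_votes(balance_glmr, conviction):
    """Approximate vote weight: locked balance times the conviction multiplier."""
    return balance_glmr * CONVICTION_MULTIPLIERS[conviction]

# 100 GLMR locked at conviction level 3 counts as roughly 300 votes
print(effective_votes(100, 3))  # 300
```

In other words, a small balance locked for a long period can outweigh a large balance with no lock, which is the behavior the `Conviction Level` and `Balance` fields decoded by the script let you observe.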
```python Python
import json

from web3 import Web3

# Connect to the Moonbeam node
web3 = Web3(Web3.HTTPProvider('NODE_URL'))

# The address of the conviction voting contract
contract_address = '0x0000000000000000000000000000000000000812'

# Load the ABI from the file for both functions
with open('ConvictionVoting.abi', 'r') as abi_file:
    abi = json.load(abi_file)

# Create the contract instance
contract = web3.eth.contract(address=contract_address, abi=abi)

# Specify the starting block number
start_block = BLOCK_NUMBER

# Get the latest block number
latest_block = web3.eth.block_number

# Filter transactions to the Conviction Voting contract
for block_number in range(start_block, latest_block + 1):
    block = web3.eth.get_block(block_number, full_transactions=True)
    for tx in block.transactions:
        if tx.to and (tx.to.lower() == contract_address.lower()):
            print(f"Transaction from account {tx['from']} in block {block_number}:")
            print(f"  Transaction Hash: {tx.hash.hex()}")

            # Decode the transaction input
            if tx.input != '0x':
                func_obj, func_params = contract.decode_function_input(tx.input)
                print(f"  Function called: {func_obj.fn_name}")
                for name, value in func_params.items():
                    print(f"    {name}: {value}")

            # Call the classLocksFor function
            class_locks = contract.functions.classLocksFor(tx['from']).call()
            for lock in class_locks:
                track_id, amount = lock
                amount_in_glmr = web3.from_wei(amount, 'ether')
                print(f"  Track ID: {track_id}, Locked Amount: {amount_in_glmr} GLMR")

                # Call the votingFor function
                voting_info = contract.functions.votingFor(tx['from'], track_id).call()
                is_casting, is_delegating, casting_details, delegating_details = voting_info
                print("  Voting Information:")
                print(f"    Is Casting: {'Yes' if is_casting else 'No'}")
                print(f"    Is Delegating: {'Yes' if is_delegating else 'No'}")

                # Convert and print voteAmount in GLMR
                if is_casting:
                    for vote in casting_details[0]:
                        poll_index, vote_details = vote
                        is_standard, is_split, is_split_abstain, standard_vote, split_vote, split_abstain_vote = vote_details
                        if is_standard:
                            aye, conviction = standard_vote[0]
                            balance = standard_vote[1]
                            balance_in_glmr = web3.from_wei(balance, 'ether')
                            print(f"      Vote in Favor: {'Yes' if aye else 'No'}, Conviction Level: {conviction}")
                            print(f"      Balance: {balance_in_glmr} GLMR")
```

where

* `NODE_URL` — your Moonbeam node endpoint
* `BLOCK_NUMBER` — the block number from which to start iterating over the blocks for the conviction voting transactions

## Conclusion

This tutorial guided you through setting up the monitoring of the Moonbeam conviction voting system and how to retrieve the data off the contract based on the decoded transactions.

### About the author

Director of Developer Experience @ Chainstack

Talk to me all things Web3

20 years in technology | 8+ years in Web3 full time

Trusted advisor helping developers navigate the complexities of blockchain infrastructure

[](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake)

# Moonbeam tooling

Source: https://docs.chainstack.com/docs/moonbeam-tooling

Find a complete list of tools on the [Moonbeam docs](https://docs.moonbeam.network/builders/build/).

## Substrate tools

Polkadot's Substrate provides a powerful toolkit for building and managing custom blockchains within the Moonbeam ecosystem. Key components include modular pallets, forkless upgrades, and native interoperability.
Find the [Substrate tools](https://docs.moonbeam.network/builders/build/substrate-api/overview/) in the Moonbeam docs.

## MetaMask

On [node access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**.

## web3.js

Build DApps using [web3.js](https://github.com/web3/web3.js) and Moonbeam nodes deployed with Chainstack.

Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP.

### HTTP

Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number:

```javascript Javascript
const {Web3} = require('web3');

const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT'));

web3.eth.getBlockNumber().then(console.log);
```

where `YOUR_CHAINSTACK_ENDPOINT` is your node HTTPS endpoint protected either with the key or password.

## web3.py

Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Moonbeam nodes deployed with Chainstack.

Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket).

### HTTP

Use the `HTTPProvider` to connect to your node endpoint and get the latest block number.

```python Key Protected
from web3 import Web3

web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT'))
print(web3.eth.block_number)
```

```python Password Protected
from web3 import Web3

web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s' % ("USERNAME", "PASSWORD", "HOSTNAME")))
print(web3.eth.block_number)
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password
* HOSTNAME — your node HTTPS endpoint hostname
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).
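Both web3.js and web3.py wrap the same JSON-RPC interface under the hood. As an offline sketch — the response below is a canned example rather than a live reply from a node — this is the request web3.py sends for the latest block number and how the hex-encoded result decodes to an integer:

```python
import json

# The JSON-RPC request body behind the latest-block-number call
payload = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "eth_blockNumber",
    "params": [],
})

# A node replies with the block number as a hex quantity;
# this response is a canned example for illustration.
canned_response = '{"jsonrpc":"2.0","id":1,"result":"0x50e24c"}'
block_number = int(json.loads(canned_response)["result"], 16)
print(block_number)  # 5300812
```

The libraries do exactly this hex-to-integer conversion for you, which is why the snippets above print a plain number.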
## ethers.js

Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Moonbeam nodes deployed with Chainstack.

Install [ethers.js](https://www.npmjs.com/package/ethers). Connect over HTTP. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket).

### HTTP

Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number:

```javascript Key Protected
const { ethers } = require("ethers");

var urlInfo = {
    url: 'YOUR_CHAINSTACK_ENDPOINT'
};
var provider = new ethers.JsonRpcProvider(urlInfo.url, NETWORK_ID);

provider.getBlockNumber().then(console.log);
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)
* NETWORK\_ID — Moonbeam network ID:
  * Moonbeam Mainnet: `1284`

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).

## Hardhat

Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your Moonbeam nodes.

Install [Hardhat](https://hardhat.org/) and create a project. Create a new environment in `hardhat.config.js`:

```javascript Javascript
require("@nomiclabs/hardhat-waffle");
...
module.exports = {
  solidity: "0.7.3",
  networks: {
    chainstack: {
      url: "YOUR_CHAINSTACK_ENDPOINT",
      accounts: ["YOUR_PRIVATE_KEY"]
    },
  }
};
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials).
* YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract

Run `npx hardhat run scripts/deploy.js --network chainstack` and Hardhat will deploy using Chainstack.
See also [Forking EVM-compatible mainnet with Hardhat](https://support.chainstack.com/hc/en-us/articles/900004242406).

## Remix IDE

To make Remix IDE interact with the network through a Chainstack node:

1. Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Interacting through MetaMask](#metamask).
2. In Remix IDE, navigate to the **Deploy** tab. Select **Injected Provider - MetaMask** in **Environment**.

This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node.

## web3.php

Build DApps using [web3.php](https://github.com/web3p/web3.php) and Moonbeam nodes deployed with Chainstack.

Install [web3.php](https://github.com/web3p/web3.php). Connect over HTTP:

```php Php
<?php
require_once "vendor/autoload.php";

use Web3\Web3;

$web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT');
?>
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password.

Use [JSON-RPC methods](https://eth.wiki/json-rpc/API) to interact with the node. Example to get the latest block number:

```php Php
<?php
require_once "vendor/autoload.php";

use Web3\Web3;

$web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT');

$eth = $web3->eth;
$eth->blockNumber(function ($err, $data) {
    print "$data \n";
});
?>
```

## Foundry

Install [Foundry](https://github.com/gakonst/foundry/). Use `--rpc-url` to run the operation through your Chainstack node.

### Forge

Use `forge` to develop, test, and deploy your smart contracts.

To deploy a contract:

```shell Shell
forge create CONTRACT_NAME --contracts CONTRACT_PATH --private-key YOUR_PRIVATE_KEY --rpc-url YOUR_CHAINSTACK_ENDPOINT
```

where

* CONTRACT\_NAME — name of the contract in the Solidity source code
* CONTRACT\_PATH — path to your smart contract
* YOUR\_PRIVATE\_KEY — the private key to your funded account that you will use to deploy the contract
* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password

### Cast

Use `cast` to interact with the network and the deployed contracts.
To get the latest block number:

```shell Shell
cast block-number --rpc-url YOUR_CHAINSTACK_ENDPOINT
```

where `YOUR_CHAINSTACK_ENDPOINT` is your node HTTPS endpoint protected either with the key or password.

# Navigating the Web3 landscape: How to choose the right blockchain network for your dApp

Source: https://docs.chainstack.com/docs/navigating-the-web3-landscape-how-to-choose-the-right-blockchain-network-for-your-dapp

* Blockchain solutions fall into two main categories, layer 1 (L1) and layer 2 (L2), each balancing scalability, decentralization, and security differently.
* Layer 1 networks like Ethereum, BNB Smart Chain, and Bitcoin are foundational and often the go-to for robust smart contract functionality or simple store-of-value use cases.
* Layer 2 solutions (e.g., Polygon, zkSync Era, Arbitrum) offload transactions from mainnets, resulting in faster throughput and lower costs, while still benefiting from the security of L1.
* Chainstack supports a broad range of L1 and L2 networks, so you can pick the one that best matches your DApp's performance, security, and cost requirements.

## Main article

Blockchain technology has transformed various industries, offering transparency, security, and decentralization. Selecting the right blockchain network can significantly impact the performance, security, and user experience of your DApp. With many networks available, this article will guide you in making an informed decision. Chainstack supports many networks, including Ethereum, Bitcoin, BNB Smart Chain, and many more.

## Layer 1 vs layer 2: A brief overview

Blockchain solutions come in various flavors, each with unique features tailored for specific use cases. The two primary categories within the blockchain ecosystem are layer 1 (L1) and layer 2 (L2) blockchains. While L1 networks offer robust security and decentralization, L2 solutions might be more suitable for applications requiring high transaction speeds and lower fees.
### Layer 1 network

Layer 1 networks form the foundation of the blockchain ecosystem, serving as the base layer for DApps and cryptocurrencies. Their key characteristics include:

* Decentralization — highly decentralized with a distributed network of nodes. Examples: Bitcoin and Ethereum.
* Security — robust consensus mechanisms like PoW or PoS ensure security.
* Scalability — some L1 networks face scalability challenges, leading to congestion and high fees.
* Smart contracts — support for decentralized applications with programmable logic.
* Native tokens — used as gas fees or as a store of value.

### Layer 2 network

Layer 2 networks address the scalability and cost issues of layer 1 networks, operating atop L1 networks and leveraging their security. L2 networks' key characteristics include:

* Scalability solutions — designed to offload transactions from L1, increasing throughput and reducing congestion.
* Cost efficiency — offer cost-effective alternatives for microtransactions.
* Interoperability — can connect different L1 networks or bridge to traditional financial systems.
* Security — benefit from the security of L1 but may have unique security considerations.
* Use cases — ideal for high transaction throughput applications like gaming and DeFi.

Read [Solving the blockchain trilemma: A look at some scaling solutions](https://chainstack.com/solving-the-blockchain-trilemma-scaling-solutions-for-ethereum/) to better understand layer 2 solutions.

## Major layer 1 networks supported by Chainstack

* **Bitcoin**. The foundational technology behind the first cryptocurrency. *Best for*: secure, transparent transactions and a store of value.
* **Ethereum**. A decentralized platform for DApps and smart contracts. *Best for*: decentralized applications requiring robust smart contract functionality.
* **BNB Smart Chain**. High-performance network compatible with the Ethereum Virtual Machine. *Best for*: DApps requiring high-speed and low-cost transactions.
* **Avalanche**. Offers scalable, customizable, and interoperable blockchain solutions with a unique consensus mechanism. *Best for*: developers and projects seeking customizable blockchain-based applications with specific requirements.
* **Fantom**. Known for fast transaction processing, scalability, and many use cases. *Best for*: high-performance decentralized applications and diverse use cases.
* **Filecoin**. A marketplace for storage providers and users to exchange digital storage space and services. *Best for*: decentralized data storage and retrieval using a network of storage nodes.

## Layer 2 networks supported by Chainstack

* **Polygon**. A layer 2 scaling solution for Ethereum. *Best for*: addressing Ethereum's high gas fees and slow transaction times.
* **Polygon zkEVM**. The first zero-knowledge scaling solution compatible with the Ethereum Virtual Machine. *Best for*: integrating smart contracts and developer tools with enhanced scalability.
* **zkSync Era**. Scales Ethereum with cutting-edge ZK tech and is fully compatible with the Ethereum Virtual Machine. *Best for*: applications requiring high scalability without compromising Ethereum compatibility. Account abstraction out of the box.
* **Optimism**. A scaling solution for Ethereum using optimistic rollups. *Best for*: achieving high throughput and low fees on the Ethereum network.
* **Base**. An Ethereum layer 2 scaling solution built on top of Optimism. *Best for*: secure, low-cost, and developer-friendly scaling on Ethereum.
* **Arbitrum**. Designed to enhance the performance of Ethereum-based DApps by providing a more efficient platform. *Best for*: addressing scalability and high gas fee issues on Ethereum.
* **Starknet**. Enables Ethereum to scale securely using zk-STARKs technology. *Best for*: enhancing data security, privacy, and scalability on Ethereum.
* **Scroll**. A reliable and scalable layer 2 network that extends Ethereum's capabilities. *Best for*: scaling applications on Ethereum without compromising on performance.

### What other networks does Chainstack support?

Find the full list of [supported networks](https://chainstack.com/protocols/).

## Conclusion

Choosing the right blockchain network is pivotal for the success of your DApp. When deciding, reflect on your application's specific requirements, such as transaction speed, security, and cost. With the diverse options available, you can find a solution tailored to your needs.

# NEAR tooling

Source: https://docs.chainstack.com/docs/near-tooling

### No NEAR support

Chainstack deprecated support for NEAR nodes. This page is kept for legacy purposes in case you may find it useful. Note that Aurora nodes are available for deployment.

## NEAR CLI

Install the [NEAR CLI](https://docs.near.org/tools/near-cli). Use the `--node_url` flag to operate through your NEAR node:

```bash Shell
near COMMAND --node_url YOUR_CHAINSTACK_ENDPOINT
```

where

* COMMAND — a supported [NEAR CLI](https://docs.near.org/tools/near-cli) command
* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint

## JSON-RPC API

Interact with your NEAR node using the [JSON-RPC API](https://docs.near.org/api/rpc/introduction). Use [curl](https://curl.haxx.se) or [Postman](https://www.getpostman.com).

Example to get the latest block:

```bash cURL
curl -H "Content-Type: application/json" \
  -d '{"jsonrpc": "2.0", "id": "1", "method": "block", "params": {"finality": "final"}}' \
  NEAR_ENDPOINT
```

where NEAR\_ENDPOINT is your node HTTPS or WSS endpoint.

## near-api-js

Install [near-api-js](https://docs.near.org/tools/near-api-js/quick-reference). Use `JsonRpcProvider` to connect to your NEAR node.
```javascript Javascript
const nearAPI = require("near-api-js");

const connectionInfo = { url: "NEAR_ENDPOINT" };
const provider = new nearAPI.providers.JsonRpcProvider(connectionInfo);

async function main() {
  const response = await provider.block({
    finality: "final",
  });
  console.log(response);
}

main();
```

where NEAR\_ENDPOINT is your node HTTPS or WSS endpoint.

# NEAR: Creating & upgrading a simple message contract

Source: https://docs.chainstack.com/docs/near-tutorial-creating-and-upgrading-a-simple-message-contract

### NEAR support is deprecated

Chainstack deprecated support for NEAR nodes. This page is kept for legacy purposes in case you find it useful. Aurora nodes are still supported.

This tutorial guides you through creating a simple message contract, deploying the contract, interacting with it, and upgrading it. In short, you will do the following:

* Create a simple message contract in Rust.
* Deploy the contract using [NEAR CLI](https://docs.near.org/tools/near-cli).
* Change the contract state by interacting with it using NEAR CLI.
* Upgrade the contract with new functionality and redeploy the contract while migrating the contract state.
* Retrieve the old contract state and set a new state.

In the process, you will gain first-hand knowledge of how to create metamorphic smart contracts on NEAR.

## Prerequisites

* A NEAR node endpoint.
* [Rust](https://www.rust-lang.org/tools/install) to create the contract.
* [NEAR CLI](https://docs.near.org/tools/near-cli) to work with accounts, deploy contracts, and interact with the network.

## Overview

To get from zero to an upgraded contract on the NEAR testnet, do the following:

Get a testnet NEAR node endpoint.

Set up the [NEAR CLI](https://docs.near.org/tools/near-cli).

Create an account on the NEAR testnet using the [NEAR wallet](https://wallet.testnet.near.org/).

Set up the project.

Deploy an initial version of the contract.

Change the contract state by interacting with it.
Deploy an upgraded version of the contract and migrate the contract state.

Retrieve the old contract state and set the new state.

### Create an account on the NEAR testnet

Go to the [NEAR testnet wallet](https://wallet.testnet.near.org/).

Click **Create Account**.

Provide any account name that is not taken. Example: `upgradablecontract.testnet`.

Click **Reserve My Account ID** and secure your account as prompted.

This creates your account on the NEAR testnet, funded with testnet NEAR. You can also top up your account with the [NEAR faucet](https://near-faucet.io/).

### Set up your project

Add the WebAssembly target to your environment:

```shell Shell
rustup target add wasm32-unknown-unknown
```

Initialize your project by running in your project directory:

```shell Shell
cargo init --lib
```

Edit the generated `Cargo.toml` file to provide your project details and setup:

```toml TOML
[package]
name = "simple-message"
version = "0.1.0"
authors = ["Your name "]
edition = "2021"

[lib]
crate-type = ["cdylib", "rlib"]

[dependencies]
serde = { version = "*", features = ["derive"] }
serde_json = "*"
near-sdk = "4.0.0"

[profile.release]
codegen-units = 1
# Tell `rustc` to optimize for small code size.
opt-level = "z"
lto = true
debug = false
panic = "abort"
# Opt into extra safety checks on arithmetic operations https://stackoverflow.com/a/64136471/249801
overflow-checks = true
```

### Create the initial contract

This is a simple contract that lets you record a message and then retrieve the recorded message.
In the `src/lib.rs` file, provide your smart contract:

```Rust Rust
use near_sdk::borsh::{self, BorshDeserialize, BorshSerialize};
use near_sdk::collections::LookupMap;
use near_sdk::{env, near_bindgen, AccountId};

#[near_bindgen]
#[derive(BorshDeserialize, BorshSerialize)]
pub struct StatusMessage {
    records: LookupMap<AccountId, String>,
}

impl Default for StatusMessage {
    fn default() -> Self {
        Self {
            records: LookupMap::new(b"r".to_vec()),
        }
    }
}

#[near_bindgen]
impl StatusMessage {
    pub fn set_status(&mut self, message: String) {
        let account_id = env::signer_account_id();
        self.records.insert(&account_id, &message);
    }

    pub fn get_status(&self, account_id: AccountId) -> Option<String> {
        return self.records.get(&account_id);
    }
}
```

### Compile the contract

Run:

```shell Shell
cargo build --target wasm32-unknown-unknown --release
```

### Create a contract account

Log in to the account that you created earlier:

1. Run `near login`.
2. Use the generated link to authorize the account in your browser.

Once logged in to NEAR CLI, run:

```shell NEAR CLI
near create-account simplemessage.upgradablecontract.testnet --masterAccount upgradablecontract.testnet --initialBalance 20 --node_url NEAR_ENDPOINT
```

where:

* `simplemessage` — any name to give to the contract that you will deploy.
* `upgradablecontract` — the name of the account on the NEAR testnet that you created earlier.
* `--initialBalance` — providing this flag will fund the created contract account with 20 testnet NEAR.
* NEAR\_ENDPOINT — your NEAR node HTTPS endpoint.

### Deploy the initial contract

At this point, you have the following:

* NEAR CLI is set up to work through a NEAR node.
* A compiled simple message contract.
* A funded contract account.

You can now deploy the contract on the NEAR testnet.
To deploy, run:

```bash Shell
near deploy --wasmFile target/wasm32-unknown-unknown/release/simple_message.wasm --accountId simplemessage.upgradablecontract.testnet --node_url NEAR_ENDPOINT
```

where:

* `simple_message.wasm` — the name of the contract that you compiled to.
* `simplemessage` — the name of the contract account that you created earlier.
* `upgradablecontract` — the name of the account on the NEAR testnet that you created earlier.
* NEAR\_ENDPOINT — your NEAR node HTTPS endpoint.

Example of the deployed contract: [997VvZ2EAss2NjdoGMJkZwJCMTNLZmKbXP3ZnFGkCMAh](https://explorer.testnet.near.org/transactions/997VvZ2EAss2NjdoGMJkZwJCMTNLZmKbXP3ZnFGkCMAh).

### Interact with the contract

Set the contract status message:

```bash Shell
near call simplemessage.upgradablecontract.testnet set_status '{"message": "Hello"}' --accountId upgradablecontract.testnet --node_url NEAR_ENDPOINT
```

where:

* `simplemessage` — the name of the contract account that you deployed the compiled contract to.
* `upgradablecontract` — the name of the account on the NEAR testnet that you created earlier.
* NEAR\_ENDPOINT — your NEAR node HTTPS endpoint.

Retrieve the contract status message:

```bash Shell
near view simplemessage.upgradablecontract.testnet get_status '{"account_id": "upgradablecontract.testnet"}' --node_url NEAR_ENDPOINT
```

### Change the contract source code

You are now going to upgrade the contract by changing the following:

* Change the key-value pair name `records` to `taglines` and the corresponding state-changing function `set_status` to `set_tagline`.
* Add the new callable key-value pair `bios`.
* Provide a way to retrieve, with the new contract code, the original on-chain state set through the `records` key-value pair. This is done through the `migrate` function.
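The `near view` call above goes through NEAR CLI, but the same contract read can be made as a raw JSON-RPC request against your node, using the `query` method with the `call_function` request type — view-call arguments travel as base64-encoded JSON. Below is a sketch in plain Node.js, reusing the tutorial's placeholder names (NEAR\_ENDPOINT and the account names are placeholders, not live values):

```js JavaScript
// Sketch: the get_status view call as a raw JSON-RPC request.
const args = { account_id: "upgradablecontract.testnet" };

const payload = {
  jsonrpc: "2.0",
  id: "1",
  method: "query",
  params: {
    request_type: "call_function",
    finality: "final",
    account_id: "simplemessage.upgradablecontract.testnet",
    method_name: "get_status",
    // View-call arguments are passed as base64-encoded JSON.
    args_base64: Buffer.from(JSON.stringify(args)).toString("base64"),
  },
};

// POST the payload to NEAR_ENDPOINT with any HTTP client, for example:
// await fetch("NEAR_ENDPOINT", { method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(payload) });

// The node returns the function result as an array of bytes;
// decode it back into a JSON value like this:
function decodeResult(bytes) {
  return JSON.parse(Buffer.from(bytes).toString("utf8"));
}

console.log(payload.params.args_base64);
```

This is handy when you want to read contract state from a backend without installing NEAR CLI or near-api-js.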
Edit the `src/lib.rs` file to provide the upgraded version of the contract:

```Rust Rust
use near_sdk::borsh::{self, BorshDeserialize, BorshSerialize};
use near_sdk::collections::LookupMap;
use near_sdk::{env, near_bindgen, AccountId};

#[derive(BorshDeserialize, BorshSerialize)]
pub struct OldStatusMessage {
    records: LookupMap<AccountId, String>,
}

#[near_bindgen]
#[derive(BorshDeserialize, BorshSerialize)]
pub struct StatusMessage {
    taglines: LookupMap<AccountId, String>,
    bios: LookupMap<AccountId, String>,
}

impl Default for StatusMessage {
    fn default() -> Self {
        Self {
            taglines: LookupMap::new(b"r".to_vec()),
            bios: LookupMap::new(b"b".to_vec()),
        }
    }
}

#[near_bindgen]
impl StatusMessage {
    pub fn set_tagline(&mut self, message: String) {
        let account_id = env::signer_account_id();
        self.taglines.insert(&account_id, &message);
    }

    pub fn get_tagline(&self, account_id: AccountId) -> Option<String> {
        return self.taglines.get(&account_id);
    }

    pub fn set_bio(&mut self, message: String) {
        let account_id = env::signer_account_id();
        self.bios.insert(&account_id, &message);
    }

    pub fn get_bio(&self, account_id: AccountId) -> Option<String> {
        return self.bios.get(&account_id);
    }

    #[private]
    #[init(ignore_state)]
    pub fn migrate() -> Self {
        let old_state: OldStatusMessage = env::state_read().expect("failed");
        Self {
            taglines: old_state.records,
            bios: LookupMap::new(b"b".to_vec()),
        }
    }
}
```

### Compile the upgraded contract

Run:

```bash Shell
cargo build --target wasm32-unknown-unknown --release
```

### Deploy the upgraded contract

You will now deploy the upgraded contract code on the same contract account where the initial contract is running.

Since you interacted with the original contract, you have changed the chain state. Deploying an upgraded version of the contract does not purge the changed state. This means that you must have a way to retrieve the changed state through the upgraded contract interface. In our simple message contract, this is done through the `migrate` function.
You will migrate the original state at the upgraded contract deployment.

Run:

```bash Shell
near deploy --wasmFile target/wasm32-unknown-unknown/release/simple_message.wasm --initFunction "migrate" --initArgs "{}" --accountId simplemessage.upgradablecontract.testnet --node_url NEAR_ENDPOINT
```

where:

* `simple_message.wasm` — the name of the upgraded contract that you compiled to.
* `simplemessage` — the name of the contract account where the original contract is running.
* `upgradablecontract` — the name of the account on the NEAR testnet that you used to deploy the original contract.
* NEAR\_ENDPOINT — your NEAR node HTTPS endpoint.

Example of the upgraded contract deployment: [997VvZ2EAss2NjdoGMJkZwJCMTNLZmKbXP3ZnFGkCMAh](https://explorer.testnet.near.org/transactions/997VvZ2EAss2NjdoGMJkZwJCMTNLZmKbXP3ZnFGkCMAh).

### Interact with the upgraded contract

Retrieve the original state message:

```bash Shell
near view simplemessage.upgradablecontract.testnet get_tagline '{"account_id": "upgradablecontract.testnet"}' --node_url NEAR_ENDPOINT
```

where:

* `simplemessage` — the name of the contract account that you deployed the upgraded contract to.
* `upgradablecontract` — the name of the account on the NEAR testnet that you used to set the original message.
* NEAR\_ENDPOINT — your NEAR node HTTPS endpoint.

Note that since this is an upgraded contract, you are retrieving the original message through the new interface: `get_tagline` instead of `get_status`.

You can also change the state with the new function that you added to the contract: `set_bio`.
Example:

```bash Shell
near call simplemessage.upgradablecontract.testnet set_bio '{"message": "Bye"}' --accountId upgradablecontract.testnet --node_url NEAR_ENDPOINT
```

And retrieve the new state:

```bash Shell
near view simplemessage.upgradablecontract.testnet get_bio '{"account_id": "upgradablecontract.testnet"}' --node_url NEAR_ENDPOINT
```

## Conclusion

This tutorial guided you through the basics of creating a simple NEAR contract, deploying the initial contract on the NEAR network, interacting with the contract, and upgrading the contract. This tutorial uses the testnet; however, the same instructions and sequence will work on the mainnet as well.

### About the author

Developer Advocate @ Chainstack

Talk to me all things Web3

20 years in technology | 8+ years in Web3 full time

Trusted advisor helping developers navigate the complexities of blockchain infrastructure

[](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake)

# Available clouds, regions, & locations

Source: https://docs.chainstack.com/docs/nodes-clouds-regions-and-locations

### Dedicated nodes are also available

Dedicated nodes are also available for most of the regions, even if not listed here. [Talk to us](https://support.chainstack.com/hc/en-us).

### Trader nodes details

See [Trader nodes](/docs/warp-transactions).
## Ethereum | Network | Cloud | Region | Location | Mode | Debug & trace | Trader nodes | | --------------- | ---------------- | ------- | ----------------- | ------- | ------------- | ------------ | | Mainnet | Chainstack Cloud | ash1 | Ashburn, Virginia | Full | NA | Available | | Mainnet | Chainstack Cloud | lon1 | London | Full | NA | Available | | Mainnet | Chainstack Cloud | sgp1 | Singapore | Full | NA | Available | | Mainnet | Chainstack Cloud | global1 | Global | Archive | Available | NA | | Mainnet | Chainstack Cloud | ash1 | Ashburn, Virginia | Archive | Available | NA | | Mainnet | Chainstack Cloud | lon1 | London | Archive | Available | NA | | Sepolia Testnet | Chainstack Cloud | global1 | Global | Archive | Available | NA | | Sepolia Testnet | Virtuozzo | eu3 | Amsterdam | Full | NA | NA | | Holesky Testnet | Chainstack Cloud | global1 | Global | Archive | Available | NA | ## Solana | Network | Cloud | Region | Location | Mode | Trader nodes | | ------- | ---------------- | ------- | ----------------- | ------- | ------------ | | Mainnet | Chainstack Cloud | global1 | Global | Full | NA | | Mainnet | Chainstack Cloud | lon1 | London | Full | Available | | Mainnet | Chainstack Cloud | nyc1 | New York | Full | Available | | Mainnet | Chainstack Cloud | global1 | Global | Archive | NA | | Devnet | Chainstack Cloud | ash1 | Ashburn, Virginia | Full | NA | ## BNB Smart Chain | Network | Cloud | Region | Location | Mode | Debug & trace | Trader nodes | | ------- | ---------------- | ------- | ----------------- | ------- | ------------- | ------------ | | Mainnet | Chainstack Cloud | global1 | Global | Archive | Available | NA | | Mainnet | Chainstack Cloud | ash1 | Ashburn, Virginia | Full | NA | Available | | Mainnet | Chainstack Cloud | lon1 | London | Full | NA | Available | | Mainnet | Chainstack Cloud | ash1 | Ashburn | Archive | Available | NA | | Mainnet | Chainstack Cloud | lon1 | London | Archive | Available | NA | | Testnet | Chainstack Cloud | 
global1 | Global | Archive | Available | NA | | Testnet | Virtuozzo | ams1 | Amsterdam | Full | NA | NA | ## Polygon | Network | Cloud | Region | Location | Mode | Debug & trace | | ------------ | ---------------- | ------- | ----------------- | ------- | ------------- | | Mainnet | Chainstack Cloud | global1 | Global | Archive | Available | | Mainnet | Chainstack Cloud | ash1 | Ashburn, Virginia | Full | NA | | Mainnet | Chainstack Cloud | lon1 | London | Full | NA | | Mainnet | Chainstack Cloud | ash1 | Ashburn, Virginia | Archive | Available | | Mainnet | Chainstack Cloud | lon1 | London | Archive | Available | | Amoy Testnet | Chainstack Cloud | global1 | Global | Archive | Available | ## Arbitrum | Network | Cloud | Region | Location | Mode | Debug & trace | | ------- | ---------------- | ------- | ----------- | ------- | ------------- | | Mainnet | Chainstack Cloud | global1 | Global | Archive | Available | | Mainnet | Chainstack Cloud | lax1 | Los Angeles | Full | NA | | Mainnet | Chainstack Cloud | lax1 | Los Angeles | Archive | Available | | Mainnet | Chainstack Cloud | fra1 | Frankfurt | Full | NA | | Mainnet | Chainstack Cloud | fra1 | Frankfurt | Archive | Available | | Testnet | Chainstack Cloud | global1 | Global | Archive | Available | ## Base | Network | Cloud | Region | Location | Mode | Debug & trace | | ------- | ---------------- | ------- | ----------- | ------- | ------------- | | Mainnet | Chainstack Cloud | global1 | Global | Archive | Available | | Mainnet | Chainstack Cloud | lax1 | Los Angeles | Full | NA | | Mainnet | Chainstack Cloud | lax1 | Los Angeles | Archive | Available | | Testnet | Chainstack Cloud | global1 | Global | Archive | Available | ## Optimism | Network | Cloud | Region | Location | Mode | Debug & trace | | ------- | ---------------- | ------- | ----------- | ------- | ------------- | | Mainnet | Chainstack Cloud | global1 | Global | Archive | Available | | Mainnet | Chainstack Cloud | lax1 | Los Angeles | Full | NA | | 
Mainnet | Chainstack Cloud | lax1 | Los Angeles | Archive | Available | | Testnet | Chainstack Cloud | global1 | Global | Archive | Available | ## Avalanche | Network | Cloud | Region | Location | Mode | Debug & trace | | ------------ | ---------------- | ------- | ----------- | ------- | ------------- | | Mainnet | Chainstack Cloud | global1 | Global | Full | Available | | Mainnet | Chainstack Cloud | lax1 | Los Angeles | Full | NA | | Mainnet | Chainstack Cloud | fra1 | Frankfurt | Full | NA | | Mainnet | Chainstack Cloud | fra1 | Frankfurt | Archive | NA | | Fuji Testnet | Chainstack Cloud | global1 | Global | Archive | Available | | Fuji Testnet | Virtuozzo | ams1 | Amsterdam | Full | NA | ## TON | Network | Cloud | Region | Location | Mode | v2 | v3 | | ------- | ---------------- | ------- | -------- | ------- | --------- | --------- | | Mainnet | Chainstack Cloud | global1 | Global | Archive | Available | Available | | Testnet | Chainstack Cloud | global1 | Global | Archive | Available | Available | ## Ronin | Network | Cloud | Region | Location | Mode | Debug & trace | | ------- | ---------------- | ------- | -------- | ------- | ------------- | | Mainnet | Chainstack Cloud | global1 | Global | Archive | Available | | Testnet | Chainstack Cloud | global1 | Global | Archive | Available | ## Blast | Network | Cloud | Region | Location | Mode | Debug & trace | | ------- | ------------ | ------- | --------- | ------- | ------------- | | Mainnet | POKT Network | global2 | Worldwide | Archive | NA | ## zkSync Era | Network | Cloud | Region | Location | Mode | Debug & trace | | ------- | ---------------- | ------- | ----------- | ------- | ------------- | | Mainnet | Chainstack Cloud | global1 | Global | Archive | Available | | Mainnet | Chainstack Cloud | lon1 | London | Full | NA | | Mainnet | Chainstack Cloud | lax1 | Los Angeles | Archive | Available | | Testnet | Chainstack Cloud | global1 | Global | Archive | Available | ## Starknet | Network | Cloud | Region 
| Location | Mode | Debug & trace | | ------- | ---------------- | ------- | -------- | ------- | ------------- | | Mainnet | Chainstack Cloud | global1 | Global | Archive | Available | | Testnet | Chainstack Cloud | global1 | Global | Archive | Available | ## Scroll | Network | Cloud | Region | Location | Mode | Debug & trace | | ------- | ---------------- | ------- | -------- | ------- | ------------- | | Mainnet | Chainstack Cloud | global1 | Global | Archive | Available | | Testnet | Chainstack Cloud | global1 | Global | Archive | NA | ## opBNB | Network | Cloud | Region | Location | Mode | Debug & trace | | ------- | ------------ | ------- | --------- | ------- | ------------- | | Mainnet | POKT Network | global2 | Worldwide | Archive | NA | ## Aptos | Network | Cloud | Region | Location | Mode | | ------- | ---------------- | ------- | --------- | ------- | | Mainnet | Chainstack Cloud | global1 | Global | Full | | Mainnet | Virtuozzo | ams1 | Amsterdam | Full | | Mainnet | Virtuozzo | ams1 | Amsterdam | Archive | | Testnet | Chainstack Cloud | global1 | Global | Full | | Testnet | Virtuozzo | ams1 | Amsterdam | Full | ## Sonic | Network | Cloud | Region | Location | Mode | Debug & trace | | ------- | ---------------- | ------- | -------- | ------- | ------------- | | Mainnet | Chainstack Cloud | global1 | Global | Archive | Available | | Testnet | Chainstack Cloud | global1 | Global | Archive | Available | ## Fantom | Network | Cloud | Region | Location | Mode | Debug & trace | | ------- | ---------------- | ------- | ----------- | ------- | ------------- | | Mainnet | Chainstack Cloud | global1 | Global | Archive | Available | | Mainnet | Chainstack Cloud | lax1 | Los Angeles | Full | NA | | Mainnet | Chainstack Cloud | lax1 | Los Angeles | Archive | NA | | Mainnet | Chainstack Cloud | fra1 | Frankfurt | Full | NA | | Testnet | Chainstack Cloud | fra1 | Frankfurt | Full | NA | ## TRON | Network | Cloud | Region | Location | Mode | Debug & trace | | 
------------ | ---------------- | ------- | -------- | ------- | ------------- | | Mainnet | Chainstack Cloud | global1 | Global | Archive | NA | | Mainnet | Chainstack Cloud | global1 | Global | Full | NA | | Nile testnet | Chainstack Cloud | global1 | Global | Full | NA | ## Cronos | Network | Cloud | Region | Location | Mode | Debug & trace | | ------- | ---------------- | ------- | --------- | ------- | ------------- | | Mainnet | Chainstack Cloud | global1 | Global | Archive | NA | | Mainnet | Chainstack Cloud | fra1 | Frankfurt | Full | NA | | Mainnet | Chainstack Cloud | fra1 | Frankfurt | Archive | NA | | Testnet | Virtuozzo | eu3 | Amsterdam | Full | NA | ## Gnosis Chain | Network | Cloud | Region | Location | Mode | Debug & trace | | ------- | ---------------- | ------- | --------- | ------- | ------------- | | Mainnet | Chainstack Cloud | global1 | Global | Archive | Available | | Mainnet | Chainstack Cloud | lon1 | London | Archive | Available | | Mainnet | Chainstack Cloud | lon1 | London | Full | NA | | Testnet | Chainstack Cloud | global1 | Global | Archive | Available | | Testnet | Virtuozzo | eu3 | Amsterdam | Full | NA | ## Kaia (ex Klaytn) | Network | Cloud | Region | Location | Mode | Debug & trace | | ------- | ------------ | ------- | --------- | ---- | ------------- | | Mainnet | POKT Network | global2 | Worldwide | Full | NA | ## Celo | Network | Cloud | Region | Location | Mode | Debug & trace | | ------- | ---------------- | ------- | -------- | ---- | ------------- | | Mainnet | Chainstack Cloud | global1 | Global | Full | NA | ## Moonbeam | Network | Cloud | Region | Location | Mode | Debug & trace | | ------- | ------------ | ------- | --------- | ---- | ------------- | | Mainnet | POKT Network | global2 | Worldwide | Full | NA | ## Aurora | Network | Cloud | Region | Location | Mode | Debug & trace | | ------- | ---------------- | ------- | -------- | ---- | ------------- | | Mainnet | Chainstack Cloud | global1 | Global | Full | NA
| ## Oasis Sapphire | Network | Cloud | Region | Location | Mode | Debug & trace | | ------- | ---------------- | ------- | -------- | ---- | ------------- | | Mainnet | Chainstack Cloud | global1 | Global | Full | NA | | Testnet | Chainstack Cloud | global1 | Global | Full | NA | ## Polygon zkEVM | Network | Cloud | Region | Location | Mode | Debug & trace | | ------- | ---------------- | ------ | ----------- | ---- | ------------- | | Mainnet | Chainstack Cloud | lax1 | Los Angeles | Full | NA | ## Bitcoin | Network | Cloud | Region | Location | Mode | | ------- | ---------------- | ------- | ----------- | ---- | | Mainnet | Chainstack Cloud | global1 | Global | Full | | Mainnet | Chainstack Cloud | lax1 | Los Angeles | Full | | Mainnet | Chainstack Cloud | fra1 | Frankfurt | Full | | Testnet | Virtuozzo | ams1 | Amsterdam | Full | ## Harmony | Network | Mode | Cloud | Region | Location | | ------- | ---- | --------- | ------ | --------- | | Mainnet | Full | Virtuozzo | ams1 | Amsterdam | # Oasis Sapphire methods Source: https://docs.chainstack.com/docs/oasis-sapphire-methods | Method | Availability | Comment | | ---------------------------------------- | --------------------------------------------- | ------- | | eth\_accounts | | | | eth\_blockNumber | | | | eth\_call | | | | eth\_chainId | | | | eth\_estimateGas | | | | eth\_feeHistory | | | | eth\_gasPrice | | | | eth\_getAccount | | | | eth\_getBalance | | | | eth\_getBlockByHash | | | | eth\_getBlockByNumber | | | | eth\_getBlockReceipts | | | | eth\_getBlockTransactionCountByHash | | | | eth\_getBlockTransactionCountByNumber | | | | eth\_getCode | | | | eth\_getFilterChanges | | | | eth\_getFilterLogs | | | | eth\_getLogs | | | | eth\_getProof | | | | eth\_getStorageAt | | | | eth\_getTransactionByBlockHashAndIndex | | | | eth\_getTransactionByBlockNumberAndIndex | | | | eth\_getTransactionByHash | | | | eth\_getTransactionCount | | | | eth\_getTransactionReceipt | | | | eth\_getUncleCountByBlockHash | | | 
| eth\_getUncleCountByBlockNumber | | | | eth\_maxPriorityFeePerGas | | | | eth\_newBlockFilter | | | | eth\_newFilter | | | | eth\_newPendingTransactionFilter | | | | eth\_signTransaction | | | | eth\_subscribe | | | | eth\_syncing | | | | eth\_uninstallFilter | | | | eth\_unsubscribe | | | | eth\_sendRawTransaction | | | | net\_listening | | | | net\_peerCount | | | | net\_version | | | | web3\_clientVersion | | | | web3\_sha3 | | |

# Oasis Sapphire tooling

Source: https://docs.chainstack.com/docs/oasis-sapphire-tooling

## Interaction tools

### Geth

Interact with your Oasis Sapphire node using [Geth](https://geth.ethereum.org/docs/getting-started).

Install [Geth](https://github.com/ethereum/go-ethereum). Use the `geth attach` command with the node endpoint. Invoke any methods from the [Web3 JavaScript API](https://web3js.readthedocs.io/).

```bash Shell
geth attach YOUR_CHAINSTACK_ENDPOINT
```

where YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials).

The example below gets the balance of an address in wei and converts it to ether:

```js JavaScript
web3.fromWei(web3.eth.getBalance("0x4EA0911033792C93639bEd297B9289E136d86F89"))
```

### MetaMask

On [node access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**.

## Development tools

### Truffle

Configure [Truffle Suite](https://truffleframework.com/) to deploy contracts to your Oasis Sapphire nodes.

Install [Truffle Suite](https://truffleframework.com/), the [HD Wallet-enabled Web3 provider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider), and create a project.

Install the [sapphire-paratime](https://www.npmjs.com/package/@oasisprotocol/sapphire-paratime) package.
Install the [dotenv](https://www.npmjs.com/package/dotenv) package to securely load your sensitive variables from a `.env` file.

Define a new network in `truffle-config.js`:

```js truffle-config.js
require('dotenv').config();
const sapphire = require("@oasisprotocol/sapphire-paratime");
const HDWalletProvider = require("@truffle/hdwallet-provider");

module.exports = {
  networks: {
    sapphire_testnet: {
      provider: () =>
        sapphire.wrap(
          new HDWalletProvider([process.env.YOUR_PRIVATE_KEY], `${process.env.YOUR_CHAINSTACK_ENDPOINT}`)),
      network_id: 0x5aff,
    },
  },
  compilers: {
    solc: {
      version: "0.8.13",
    },
  },
};
```

### Hardhat

Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your Oasis Sapphire nodes.

Install [Hardhat](https://hardhat.org/) and create a project.

Install the [sapphire-hardhat](https://www.npmjs.com/package/@oasisprotocol/sapphire-hardhat) plugin.

Install the [dotenv](https://www.npmjs.com/package/dotenv) package to securely load your sensitive variables from a `.env` file.

Create a new environment in `hardhat.config.js`:

```js hardhat.config.js
require("@nomicfoundation/hardhat-toolbox");
require('@oasisprotocol/sapphire-hardhat');
require('dotenv').config();

module.exports = {
  solidity: "0.8.18",
  defaultNetwork: "sapphire_testnet",
  networks: {
    sapphire_testnet: {
      url: `${process.env.YOUR_CHAINSTACK_ENDPOINT}`,
      accounts: [process.env.YOUR_PRIVATE_KEY],
      chainId: 0x5aff,
    },
  },
};
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials).
* YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract.

Run `npx hardhat run --network sapphire_testnet scripts/deploy.js`, and Hardhat will deploy using Chainstack.
### web3.js

Build DApps using [web3.js](https://github.com/ethereum/web3.js/) and Oasis Sapphire nodes deployed with Chainstack.

Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP or WebSocket.

#### HTTPS

Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number:

```js JavaScript
const Web3 = require('web3');

const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT'));

web3.eth.getBlockNumber().then(console.log);
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password.

#### WSS

Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number:

```js JavaScript
const Web3 = require('web3');

const web3 = new Web3(new Web3.providers.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT'));

web3.eth.getBlockNumber().then(console.log);
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node WSS endpoint protected either with the key or password.

### node.js

You can build a web app to query data using node.js and [axios](https://www.npmjs.com/package/axios):

```js JavaScript
const axios = require("axios");

const payload = {
  jsonrpc: "2.0",
  id: 0,
  method: "eth_blockNumber",
  params: []
};

(async () => {
  const response = await axios.post(`YOUR_CHAINSTACK_ENDPOINT`, payload)
  console.log(response.data)
})();
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your Chainstack node endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials).
* `eth_blockNumber` — the JSON-RPC method; in this case, it gets the latest block number.

### ethers.js

Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Oasis Sapphire nodes deployed with Chainstack.

Install [ethers.js](https://www.npmjs.com/package/ethers). Connect over HTTPS or WebSocket.
See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket).

#### HTTPS

Use the `JsonRpcProvider` object to connect to your node endpoint and get the balance of any address:

```js JavaScript
const ethers = require('ethers');

const NODE_URL = "YOUR_CHAINSTACK_ENDPOINT";
const provider = new ethers.JsonRpcProvider(NODE_URL, NETWORK_ID);

const eth_getBalance = async () => {
  const balance = await provider.getBalance("ACCOUNT_ADDRESS");
  console.log(balance);
};

eth_getBalance()
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password
* NETWORK\_ID — Oasis Sapphire network ID:
  * Mainnet: `23294`
  * Testnet: `23295`
* ACCOUNT\_ADDRESS — the Oasis Sapphire account address

#### WebSocket

Use the `WebSocketProvider` object to connect to your node WSS endpoint and get the balance of any address:

```js JavaScript
const ethers = require('ethers');

const NODE_URL = "YOUR_CHAINSTACK_ENDPOINT";
const provider = new ethers.WebSocketProvider(NODE_URL, NETWORK_ID);

const eth_getBalance = async () => {
  const balance = await provider.getBalance("ACCOUNT_ADDRESS");
  console.log(balance);
};

eth_getBalance()
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password
* NETWORK\_ID — Oasis Sapphire network ID:
  * Mainnet: `23294`
  * Testnet: `23295`
* ACCOUNT\_ADDRESS — the Oasis Sapphire account address

### Remix IDE

To make Remix IDE interact with the network through an Oasis Sapphire node deployed with Chainstack:

Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Oasis Sapphire tooling: MetaMask](#metamask).

In Remix IDE, navigate to the **Deploy** tab. Select **Injected Provider - MetaMask** in **Environment**.

This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node.
# Oasis Sapphire: Understanding confidential smart contracts

Source: https://docs.chainstack.com/docs/oasis-sapphire-tutorial-understanding-confidential-smart-contracts-with-oasis-sapphire

* The Oasis Network is a proof-of-stake chain focused on privacy and scalability, where “ParaTimes” separate consensus and execution.
* You can deploy the same Solidity contract on both Ethereum Sepolia (fully public) and Oasis Sapphire (confidential by default).
* On Ethereum, all transaction data and storage are publicly visible, which can compromise “secret” games or auctions.
* On Oasis Sapphire, the data is end-to-end encrypted for privacy, so you can only access the parts you explicitly expose in your contract.

## Oasis Network

The Oasis Network is a proof-of-stake (PoS) based blockchain network that provides a robust, secure, and sustainable infrastructure for decentralized applications. By introducing a modular network architecture, Oasis is able to offer a high degree of scalability, interoperability, and privacy.

Oasis provides several features to protect user data. These include confidential smart contracts, which allow users to execute code without revealing sensitive information, and secure enclaves, which provide hardware-level protection for private data.

### Architecture

The modularity in the Oasis Network architecture comes from the separation of the network into two distinct layers: one handles the consensus (consensus layer) and the other takes care of the smart contract execution (ParaTime layer).

The **consensus layer** utilizes a proof-of-stake consensus mechanism that is run by a decentralized set of validator nodes. This mechanism is scalable and ensures security.

The **ParaTime layer** hosts multiple parallel runtimes (ParaTimes), each representing a separate computational environment with a shared state. By using a parallel runtime model, the platform can process a large number of transactions in parallel, increasing its throughput and reducing latency.
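The public-by-default nature of Ethereum noted above is easy to verify yourself: any storage slot of any contract can be read with the standard `eth_getStorageAt` JSON-RPC call, even for variables marked `private` in Solidity. Here is a sketch of the raw request in plain Node.js — the contract address below is a placeholder, not a real deployment:

```js JavaScript
// Sketch: on a public EVM chain like Sepolia, contract storage is not
// secret — anyone can read any slot with eth_getStorageAt.
const payload = {
  jsonrpc: "2.0",
  id: 1,
  method: "eth_getStorageAt",
  params: [
    "0x0000000000000000000000000000000000000000", // contract address (placeholder)
    "0x0",    // storage slot 0 — e.g. a "private" Solidity variable
    "latest", // block tag
  ],
};

// POST to your Sepolia endpoint with any HTTP client, for example:
// await fetch("YOUR_CHAINSTACK_ENDPOINT", { method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(payload) });

console.log(JSON.stringify(payload));
```

On Sapphire, contract state is encrypted, so this kind of direct slot inspection does not reveal plaintext values — only what the contract explicitly returns from its methods is visible.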
### ParaTimes

The **ParaTime layer** is the layer of the Oasis Network that handles smart contract execution. A ParaTime is a separate computational environment with a shared state that can execute smart contracts. The Oasis Network comes with multiple in-house ParaTimes, each representing a different type of computational environment. These include:

* **Emerald ParaTime**, which is designed to provide a scalable and efficient environment for executing EVM-based smart contracts. It is fully compatible with the Ethereum Virtual Machine (EVM), making it easy to migrate existing Ethereum smart contracts to the Oasis Network.
* **Sapphire ParaTime**, which is designed to provide an environment for executing EVM-compatible smart contracts with on-chain data confidentiality. It allows developers to execute code without revealing sensitive information. This makes it an excellent choice for applications that require high levels of privacy and security.
* **Cipher ParaTime**, which is designed to provide a confidential ParaTime for executing WASM smart contracts. You can use either Rust or Go to develop smart contracts for Cipher ParaTime.

This tutorial demonstrates the difference between confidential and non-confidential computation by analyzing the working of a Solidity smart contract on the Ethereum Sepolia Testnet and the Oasis Sapphire Testnet.

## Prerequisites

* [Chainstack account](https://console.chainstack.com/) to deploy an Oasis Sapphire node
* [Node.js](https://nodejs.org/en/download/) ^16.17.0 as the JavaScript framework
* MetaMask wallet

## Dependencies

* Hardhat: ^2.12.7
* dotenv: ^16.0.3

## Overview

Log in to your Chainstack account and create a public chain project.

With Chainstack, join the Ethereum Sepolia Testnet and the Oasis Sapphire Testnet.

With Chainstack, access your nodes' credentials.

Add the details of the Oasis Sapphire node to your MetaMask wallet.

Fund your accounts with test tokens.

Set up a Hardhat project.
Install the required dependencies. Create a `.env` file to store the private key of your MetaMask account and node endpoints. Create a Solidity contract in your Hardhat project. Write the Hardhat scripts for deploying and interacting with the contract. Deploy the contract onto the Ethereum Sepolia Testnet. Deploy the contract onto the Oasis Sapphire Testnet. Analyze the difference in the functioning of the contract. ## Step-by-Step ### Create a public chain project See [Create a project](/docs/manage-your-project#create-a-project). ### Join the Ethereum Sepolia Testnet and the Oasis Sapphire Testnet See [Join a public network](/docs/manage-your-networks#join-a-public-network). ### Get endpoints for your Ethereum Sepolia Testnet and Oasis Sapphire Testnet See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials). ### Add the details of the Oasis Sapphire network to your MetaMask wallet See [Oasis Sapphire Tools: MetaMask](/docs/oasis-sapphire-tooling#metamask). ### Fund your accounts with test tokens To get test tokens for your Oasis Sapphire Testnet account, do the following: * Go to the official [Oasis Network Testnet Faucet](https://faucet.testnet.oasis.dev/). * In the drop-down list, click **Sapphire**. * Go to your MetaMask wallet and change the network to **Sapphire testnet**. * On the faucet page, copy your MetaMask account address and paste it into the field below the drop-down list. * Click **Request Test Token**. * 10 test tokens will be sent to your account. To get test tokens for your Ethereum Sepolia account, use the [Ethereum Sepolia Faucet](https://faucet.chainstack.com). ### Set up a Hardhat project Create a new directory for your project. In your project directory, initialize the npm project: ```bash Shell npm init -y ``` This command creates a `package.json` file with default settings. 
Install Hardhat locally:

```bash Shell
npm install --save-dev hardhat
```

Initialize the Hardhat project:

```bash Shell
npx hardhat init
```

This command will prompt you to choose a template for your project. Select `Create an empty hardhat.config.js` if you want to start from scratch.

In your project root directory, create two directories: `/contracts` and `/scripts`. This is where you will store the contracts and deployment scripts respectively.

By now, your Hardhat project directory will have the following structure:

```bash Shell
├── contracts
├── node_modules
├── hardhat.config.js
├── package.json
└── scripts
```

### Install the dependencies

In the root directory of your project, run the following command:

```bash Shell
npm install --save-dev @nomicfoundation/hardhat-toolbox @oasisprotocol/sapphire-hardhat
```

This command installs the `hardhat-toolbox` plugin, which, along with other useful functionalities, lets you use libraries and frameworks like ethers.js, Mocha, and Chai for developing and testing your smart contracts. The `sapphire-hardhat` plugin helps port your application onto Oasis Sapphire by *wrapping* the provider used in the project. This helps your application interact with the Oasis Sapphire network by enabling functionalities like transaction encryption and signing.

In this project, you also require the `dotenv` package for handling the environment variables. To install the package, run the following command:

```bash Shell
npm install dotenv
```

### Create a `.env` file

In order to deploy and interact with the smart contract on various networks, you need to provide access to the network accounts via the account private key. Since Ethereum accounts are compatible with Oasis Sapphire, you can use a single MetaMask account to interact with both Ethereum and Oasis testnets. Do the following:

Get the private key of your MetaMask account.

In the root directory of your project, create a new `.env` file.
Save the private key by pasting the following into the `.env` file: ```sh .env PRIVATE_KEY="YOUR_PRIVATE_KEY" ``` Once the account private key is added, you may also add the endpoints of your Ethereum and Oasis testnets nodes deployed with Chainstack to the `.env` file: ```sh .env PRIVATE_KEY="YOUR_PRIVATE_KEY" SEPOLIA_ENDPOINT="YOUR_ETHEREUM_SEPOLIA_NODE_ENDPOINT" SAPPHIRE_TESTNET_ENDPOINT="YOUR_OASIS_SAPPHIRE_ENDPOINT" ``` ### Create and compile a Solidity contract in your Hardhat project In the `/contracts` directory of your Hardhat project, create a new file named `SecretNumberGame.sol` and add the following code to it: ```js SecretNumberGame.sol // SPDX-License-Identifier: MIT pragma solidity ^0.8.0; /** * @title SecretNumberGame * @dev A simple game where participants submit secret numbers. * The game ends when the maximum number of entries is reached. * The participant with the highest secret number wins. */ contract SecretNumberGame { uint256 highestNumber; uint256 public constant maxEntries = 5; uint256 public entryCount; address public winner; // Entry structure to store participant's address and submitted number struct Entry { address sender; uint256 number; } Entry[] public entries; // Event to declare the winner when the game ends event WinnerDeclared(address winner, uint256 highestNumber); // Modifier to check if the game is still accepting entries modifier gameInProgress() { require(entryCount < maxEntries, "Game is over"); _; } /** * @notice Submit a secret number to the game. * @dev Participants can submit secret numbers while the game is in progress. * The game ends when maxEntries is reached. * @param number The secret number to submit. 
*/ function submitNumber(uint256 number) external gameInProgress { Entry memory newEntry = Entry(msg.sender, number); entries.push(newEntry); entryCount++; // Update the winner and the highest number if the submitted number is greater than the current highest number if (number > highestNumber) { highestNumber = number; winner = msg.sender; } // If maxEntries is reached, the game is over, and the winner is determined if (entryCount == maxEntries) { emit WinnerDeclared(winner, highestNumber); } } /** * @notice Get the entry at a specific index. * @dev This function returns the participant's address and submitted number at the given index. * @param index The index of the entry to fetch. * @return The entry's sender address and number. */ function getEntry(uint256 index) external view returns (address, uint256) { require(index < entryCount, "Index out of bounds"); Entry memory entry = entries[index]; return (entry.sender, entry.number); } } ``` The `SecretNumberGame` contract is a simple game where participants submit secret numbers. The game ends when the maximum number of entries is reached, and the participant with the highest secret number wins. The smart contract includes a modifier to check if the game is still accepting entries and an event to declare the winner when the game ends. The contract also includes functions to submit a secret number, get an entry at a specific index, and retrieve the submitted entries. Once you add the contract, open a terminal and use the following command to compile the smart contract: ```bash Shell npx hardhat compile ``` If the contract is successfully compiled, you will see a corresponding message, and the output will be stored in the newly generated `/artifacts` directory at the root of your project. 
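Before moving on to the scripts, it helps to recall how Solidity lays out state variables in storage, since the deploy script later reads the contract's storage directly. State variables occupy 32-byte slots in declaration order, so `highestNumber`, declared first, sits at slot `0x0` (the `constant maxEntries` is inlined and takes no slot), and a slot read comes back as a raw 32-byte hex word. A minimal sketch in plain Node.js, using a hypothetical raw word for illustration:

```javascript
// Solidity storage layout: state variables fill 32-byte slots in
// declaration order; constants are inlined and take no slot.
// In SecretNumberGame: highestNumber -> slot 0, entryCount -> slot 1,
// winner -> slot 2, entries (array length) -> slot 3.
const slotOfHighestNumber = "0x0";

// eth_getStorageAt returns the slot as a raw 32-byte big-endian word.
// Hypothetical word for a stored uint256 value of 1000:
const rawWord =
  "0x00000000000000000000000000000000000000000000000000000000000003e8";

// Decoding the word back to a decimal number:
console.log(`slot ${slotOfHighestNumber} holds: ${BigInt(rawWord)}`); // 1000
```

This is the same slot-to-value translation the deploy script's `getStorageAt` helper performs for slot `0x0`.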
### Write the Hardhat scripts for deploying and interacting with the contract Once the contract is set, you can create a new file, `deploy.js`, in the `/scripts` directory of the project and add the following code to it: ```js deploy.js const hre = require("hardhat"); /** * @notice Fetch the storage data at a given slot number for a specified contract address. * @param _address The contract address. * @param _slotNumber The slot number of the storage. * @return The decimal state data at the specified slot. */ async function getStorageAt(_address, _slotNumber) { const provider = ethers.provider; const result = await provider.send("eth_getStorageAt", [ _address, _slotNumber, "latest", ]); const decimalStateData = result === "0x0" ? "0" : ethers.BigNumber.from(result).toString(); return decimalStateData; } /** * @notice Decode the transaction input data using the ABI. * @param _abi The ABI of the contract. * @param _inputData The input data of the transaction. * @return The decoded transaction data, or an empty object if decoding fails. 
*/ function decodeTransactionInput(_abi, _inputData) { try { const iface = new ethers.utils.Interface(_abi); const decodedData = iface.parseTransaction({ data: _inputData }); return decodedData; } catch (error) { console.error("Error decoding transaction input:", error.message); return {args : []}; } } async function main() { // Compile and deploy the SecretNumberGame contract await hre.run("compile"); const SecretNumberGame = await hre.ethers.getContractFactory( "SecretNumberGame" ); const secretNumberGame = await SecretNumberGame.deploy(); await secretNumberGame.deployed(); console.log("SecretNumberGame deployed to:", secretNumberGame.address); // Submit a secret number (1000) from the single account const secretNumber = 1000; const tx = await secretNumberGame.submitNumber(secretNumber); await tx.wait(); console.log(`Account submitted secret number ${secretNumber}`); // Decode the transaction input data const decodedInput = decodeTransactionInput( SecretNumberGame.interface.format(), tx.data ); console.log(`Decoded Transaction Input: ${decodedInput.args}`); // Fetch the state data of the highestNumber variable using eth_getStorageAt method const stateData = await getStorageAt(secretNumberGame.address, "0x0"); console.log("State Data of highestNumber variable:", stateData); } // We recommend this pattern to be able to use async/await everywhere // and properly handle errors. main() .then(() => process.exit(0)) .catch((error) => { console.error(error); process.exit(1); }); ``` This Hardhat script performs the following actions: 1. Compiles and deploys the `SecretNumberGame` contract to an Ethereum network. 2. Submits a secret number (1,000) from a single account. 3. Decodes the transaction input data using the contract's ABI. 4. Fetches the state data of the `highestNumber` variable using the `eth_getStorageAt` method. The script includes two utility functions: * `getStorageAt` — it fetches the storage data at a given slot number for a specified contract address. 
* `decodeTransactionInput` — it decodes the transaction input data using the contract's ABI. The `main` function contains the core logic of the script. It first compiles and deploys the contract, then submits a secret number, decodes the transaction input data, and finally fetches the state data of the `highestNumber` variable. ### Deploy the contract onto the Ethereum Sepolia Testnet To deploy the contract onto the Ethereum Sepolia Testnet, add the following details to your `hardhat.config.js` file in the root directory of the project: ```js hardhat.config.js // Import necessary modules require("@nomicfoundation/hardhat-toolbox"); // Load environment variables from .env file require('dotenv').config({path:__dirname+'/.env'}) /** @type import('hardhat/config').HardhatUserConfig */ module.exports = { // Specify Solidity version for compiling the contracts solidity: "0.8.18", networks: { // Specify the network used for deployment sepolia_testnet: { // HTTPS endpoint of the Chainstack Sepolia testnet node url: process.env.SEPOLIA_ENDPOINT? process.env.SEPOLIA_ENDPOINT: "", // Accounts used for deployment accounts: // Use the private key defined in the .env file for deployment process.env.PRIVATE_KEY? [process.env.PRIVATE_KEY]: [], }, } }; ``` The `hardhat.config.js` file is a key configuration file used in this project. It is used to configure the project settings, such as the Solidity version, networks, and other customizations. In this specific file, the Solidity version is set to `0.8.18`, indicating that the project uses the specified version of Solidity for compiling the contracts. The `networks` object in the file specifies the test network used for deployment, which in this case is the Sepolia Testnet. The `url` key is used to specify the URL of the Sepolia Testnet node used for deployment. Additionally, the `accounts` key specifies the accounts used for deployment. 
This key uses an environment variable `PRIVATE_KEY` and checks if it is defined in the `.env` file. If it is defined, the private key is used for deployment. Once the `hardhat.config.js` file is modified, open a terminal in the root directory of your project and use the following command for deploying and interacting with your contract: ```bash Shell npx hardhat run scripts/deploy.js --network sepolia_testnet ``` This will automatically execute the `deploy.js` file that we created. The script will produce the following output: ```bash Shell SecretNumberGame deployed to: 0xf4bFc2e6f51F64D3328Cdd3570F48e9962009c82 Account submitted secret number 1000 Decoded Transaction Input: 1000 State Data of highestNumber variable: 1000 ``` As you can see in this output, every aspect of the transaction including the input and state data is easily accessible, and given the nature of the game, running the contract on the Ethereum Sepolia Testnet presents a few challenges: 1. Data transparency. In Ethereum, all contract data and transactions are public, which means that anyone can inspect the submitted secret numbers. This can compromise the secrecy of the game, as participants could inspect the submitted numbers and choose their bids accordingly. 2. The prospect of cheating. Due to the data transparency, participants could monitor the submitted numbers and wait until the last moment to submit their bids. This would allow them to submit a number slightly higher than the current highest number, which could be considered an unfair advantage. Now, let us look at how a confidential execution environment like Oasis Sapphire fares against these challenges. 
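The data-transparency point above is easy to verify by hand: the `data` field of the `submitNumber` transaction is just a 4-byte function selector followed by the ABI-encoded `uint256` argument, readable by anyone inspecting the chain. A minimal sketch in plain Node.js — note that `0xaabbccdd` is a placeholder selector for illustration, not the real `submitNumber` selector:

```javascript
// Public calldata is just bytes: 4-byte selector + ABI-encoded arguments.
// Anyone inspecting the chain can decode a "secret" submitNumber(1000) call.
// NOTE: "0xaabbccdd" is a hypothetical selector used for illustration only.
const calldata =
  "0xaabbccdd" +
  "00000000000000000000000000000000000000000000000000000000000003e8";

const selector = calldata.slice(0, 10);      // "0x" + 4 selector bytes
const argWord = calldata.slice(10);          // 32-byte big-endian uint256
const secretNumber = BigInt("0x" + argWord); // decode the argument

console.log(`selector: ${selector}, decoded "secret" number: ${secretNumber}`);
```

No special tooling is required — which is exactly why a "secret" number on a public chain is not secret.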
### Deploy the contract onto the Oasis Sapphire Testnet

To deploy the contract onto the Oasis Sapphire Testnet, add the following details to the `hardhat.config.js` file:

```js hardhat.config.js
// Import necessary modules
require("@nomicfoundation/hardhat-toolbox");
// import the sapphire-hardhat module
require('@oasisprotocol/sapphire-hardhat');
// Load environment variables from .env file
require('dotenv').config({path:__dirname+'/.env'})

/** @type import('hardhat/config').HardhatUserConfig */
module.exports = {
  // Specify Solidity version for compiling the contracts
  solidity: "0.8.18",
  networks: {
    // Specify the network used for deployment
    sepolia_testnet: {
      // HTTPS endpoint of the Chainstack Sepolia testnet node
      url: process.env.SEPOLIA_ENDPOINT ? process.env.SEPOLIA_ENDPOINT : "",
      // Accounts used for deployment
      accounts:
        // Use the private key defined in the .env file for deployment
        process.env.PRIVATE_KEY ? [process.env.PRIVATE_KEY] : [],
    },
    // Specify the Oasis Sapphire testnet network for deployment
    sapphire_testnet: {
      // URL of the Chainstack Oasis Sapphire node used for deployment
      url: process.env.SAPPHIRE_TESTNET_ENDPOINT ? process.env.SAPPHIRE_TESTNET_ENDPOINT : "",
      // Accounts used for deployment
      accounts: process.env.PRIVATE_KEY ? [process.env.PRIVATE_KEY] : [],
      // Chain ID of the Sapphire Testnet
      chainId: 0x5aff,
    },
  }
};
```

Here, we have added the details of the Oasis Sapphire Testnet. Apart from the network details, we have also imported the `@oasisprotocol/sapphire-hardhat` package into the config file. This package handles the encryption of transactions and calls to the contract, thus ensuring the confidentiality and safety of the data involved.
Once the `hardhat.config.js` file is modified, open a terminal in the root directory of your project and use the following command for deploying and interacting with your contract:

```bash Shell
npx hardhat run scripts/deploy.js --network sapphire_testnet
```

This will generate the following output:

```bash Shell
SecretNumberGame deployed to: 0xB7bcE348F31972B7d062B7bb234786F150A3FCE7
Account submitted secret number 1000
Error decoding transaction input: no matching function (argument="sighash", value="0xa264626f", code=INVALID_ARGUMENT, version=abi/5.7.0)
Decoded Transaction Input:
State Data of highestNumber variable: 0
```

Here, as you can see, we are facing errors while trying to decode the transaction inputs, and the state data is given as zero. This is due to the fact that in Oasis Sapphire, transactions and calls to the contract are end-to-end encrypted. Only the caller and the contract can see the data sent to or received from Oasis. Also, the contract state is only visible to the contract that initially defined it. This means that no external entities can directly access the state value, and thus the function returns the default value instead (0, in our case).

The only way to access the value of the `highestNumber` variable is to write explicit *getter* functions in the contract. To demonstrate this, we have included the `getEntry` function within the contract. The function takes in an index value and returns the details of the `Entry` struct instance stored against that index in the `entries` list.
To call the function, add the following code to your `deploy.js` script and execute it:

```js deploy.js
// Fetch the entry at a specific index
const index = 0;
const entry = await secretNumberGame.getEntry(index);
console.log(`Entry at index ${index} : ${entry}`);
```

This will return the following output:

```bash Shell
SecretNumberGame deployed to: 0x41024F891ae49c0CA459bA41e34e15024209FB82
Account submitted secret number 1000
Error decoding transaction input: no matching function (argument="sighash", value="0xa264626f", code=INVALID_ARGUMENT, version=abi/5.7.0)
Decoded Transaction Input:
State Data of highestNumber variable: 0
Entry at index 0 : 0x459AC2794A3386cD5a8ca7aeAed07D9525e600Ed,1000
```

Apart from explicit *getter* functions, in Oasis Sapphire, we can also expose the state data using contract logs/events. Thus, one should be cautious with the design of the contract so as to avoid any unnecessary data exposure.

## Conclusion

The mechanics of the `SecretNumberGame` resemble a typical auction/bidding contract, and as we saw previously, the contract also faces similar challenges. Now, with the `SecretNumberGame`, the user stakes are pretty much nonexistent, but if we re-imagine the scenario and add in an NFT bid or a token exchange, the challenges that we saw earlier gain a greater impact.

Running this contract on the Ethereum Sepolia Testnet and the Oasis Sapphire Testnet reveals differences in data privacy, security, and efficiency. Ethereum Sepolia is a public blockchain, meaning all contract data and transactions are public. This transparency could compromise the game's secrecy. In contrast, Oasis Sapphire emphasizes confidentiality and data privacy through advanced encryption techniques and confidential data storage.
These features, when applied in a wider scope, provide unparalleled data protection and can even prevent common challenges like front/back running, which are especially relevant in decentralized finance (DeFi) and non-fungible token (NFT) markets.

### About the author

Developer Advocate @ Chainstack

BUIDLs on Ethereum, NEAR, Graph Protocol, and Oasis. Majored in computer science and technology.

[](https://github.com/SethuRamanOmanakuttan) [](https://twitter.com/Sethu_Raman_O) [](https://www.linkedin.com/in/sethuraman-omanakuttan)

# opBNB: How to listen to deposits on the opBNB bridge

Source: https://docs.chainstack.com/docs/opbnb-how-to-listen-deposits-bridge

**TLDR**

* opBNB is a Layer 2 scalability solution for the BNB Smart Chain built on the Optimism OP Stack.
* The opBNB Bridge contract emits `DepositFinalized` events whenever deposits from BNB Chain occur.
* This tutorial shows how to listen for these deposit events using ethers.js, enabling real-time tracking of cross-chain activity.
* By monitoring transaction hashes and deposit details, developers can interact directly with the opBNB ecosystem.

## Main article

opBNB is an EVM-compatible Layer 2 network built for scalability. Developed as an extension of the Binance Smart Chain (BNB Chain) ecosystem, opBNB aims to provide high-performance blockchain solutions. It leverages the bedrock version of the Optimism OP Stack to offer a Layer 2 scaling solution for the BNB Smart Chain.

The opBNB network enhances scalability by offloading transaction processing and resource usage from the BNB Smart Chain while posting data to the underlying mainnet. This approach enables high throughput and low fees, making opBNB an attractive choice for developers and users.

One of the key components of opBNB is the opBNB Bridge, which facilitates the secure transfer of assets between BNB Chain and opBNB. This bridge is essential for those who need to move assets while maintaining the benefits of both networks.
Users interact with the opBNB network by depositing funds from BNB Chain and using applications and contracts on opBNB.

This tutorial will guide you through setting up a listener for deposit events on the opBNB Bridge. You can then capture deposit events and extract relevant transaction data. Understanding how to monitor and interact with these on-chain events is crucial for developers leveraging opBNB's capabilities.

## Prerequisites

* [Chainstack account](https://console.chainstack.com/) to deploy an opBNB Mainnet node
* [Node.js](https://nodejs.org/en)
* ethers.js

## Step-by-step

### Get an opBNB node

Log in to your [Chainstack account](https://console.chainstack.com/) and deploy a node.

### Create a new ethers project

Once you have Node.js installed, create a new project in a new directory.

```
npm init --yes
```

This will create a new project and a `package.json` file. You can then install the ethers library.

```
npm i ethers
```

Learn more about Node projects for Web3 by reading [Web3 node.js: From zero to a full-fledged project](/docs/web3-nodejs-from-zero-to-a-full-fledged-project).

### Create the script

Now that the project is set up, create a new file named `index.js`. With this DApp, we listen to the `DepositFinalized` event on the opBNB Bridge smart contract, `0x4200000000000000000000000000000000000010` in this case. This event is emitted when a deposit is made; you can use this data to monitor activity.

Note that this address is a proxy contract. To find the contract event to use in the code, check the actual bridge contract: [0xc0d3c0d3c0d3c0d3c0d3c0d3c0d3c0d3c0d30010](https://opbnbscan.com/address/0xc0d3c0d3c0d3c0d3c0d3c0d3c0d3c0d3c0d30010?view=contract_code\&p=1\&tab=Contract#F7#L58).

The event is defined in the contract as the following:

```sol L2StandardBridge.sol
event DepositFinalized(
    address indexed l1Token,
    address indexed l2Token,
    address indexed from,
    address to,
    uint256 amount,
    bytes extraData
);
```

Paste the following code.
```javascript Javascript const { ethers } = require("ethers"); // Connect to your opBNB node const provider = new ethers.JsonRpcProvider( "YOUR_CHAINSTACK_NODE" ); // Define the contract ABI const abi = [ "event DepositFinalized(address indexed l1Token, address indexed l2Token, address indexed from, address to, uint256 amount, bytes extraData)", ]; // Define the contract address const contractAddress = "0x4200000000000000000000000000000000000010"; // Create a contract instance const contract = new ethers.Contract(contractAddress, abi, provider); // Define the event filter (optional: add any specific filters if required) const filter = contract.filters.DepositFinalized(); console.log("Listening for DepositFinalized events on the opBNB Bridge..."); // Listen for the event contract.on(filter, (event) => { console.log("Deposit Finalized Event:"); console.log(`Transaction Hash: ${event.log.transactionHash}`); console.log( `Explorer link: https://opbnbscan.com/tx/${event.log.transactionHash}` ); console.log(`From: ${event.args[2]}`); // from console.log(`Amount: ${ethers.formatEther(event.args[4].toString())} BNB`); // amount console.log( "------------------------------------------------------------------------------------------" ); }); ``` where * `YOUR_CHAINSTACK_NODE` — your opBNB node endpoint ## Code breakdown Here's a brief breakdown explaining the code step-by-step: The code starts by importing the Ethers.js library, which is essential for interacting with the Ethereum network and its derivatives like opBNB. Using a JSON-RPC provider, it connects to an opBNB node. Replace `"YOUR_CHAINSTACK_NODE"` with your actual Chainstack node URL. ```javascript Javascript const { ethers } = require("ethers"); // Connect to your opBNB node const provider = new ethers.JsonRpcProvider( "YOUR_CHAINSTACK_NODE" ); ``` The ABI (Application Binary Interface) is defined for the `DepositFinalized` event. This ABI tells Ethers.js how to interpret the event data. 
```javascript Javascript // Define the contract ABI const abi = [ "event DepositFinalized(address indexed l1Token, address indexed l2Token, address indexed from, address to, uint256 amount, bytes extraData)", ]; ``` The address of the opBNB Bridge contract is specified. This address points to the smart contract on the opBNB network that emits the `DepositFinalized` events. ```javascript Javascript // Define the contract address const contractAddress = "0x4200000000000000000000000000000000000010"; ``` The contract address, ABI, and provider are used to create an instance of the contract. This instance allows interaction with the contract and listening for events. ```javascript Javascript // Create a contract instance const contract = new ethers.Contract(contractAddress, abi, provider); ``` A filter for the `DepositFinalized` event is defined. If needed, this filter can be customized to listen for specific events based on additional parameters. ```javascript Javascript // Define the event filter (optional: add any specific filters if required) const filter = contract.filters.DepositFinalized(); ``` The script sets up an event listener that triggers when a `DepositFinalized` event is detected. * **Logging Event Details**: It logs the event details, such as the transaction hash, which can be used to view the transaction on the opBNB block explorer. * **Extracting Event Data**: The `from` address and the `amount` of BNB deposited are extracted from the event arguments and logged. The amount is formatted to be readable in BNB units. 
```javascript Javascript
// Listen for the event
contract.on(filter, (event) => {
  console.log("Deposit Finalized Event:");
  console.log(`Transaction Hash: ${event.log.transactionHash}`);
  console.log(
    `Explorer link: https://opbnbscan.com/tx/${event.log.transactionHash}`
  );
  console.log(`From: ${event.args[2]}`); // from
  console.log(`Amount: ${ethers.formatEther(event.args[4].toString())} BNB`); // amount
  console.log(
    "------------------------------------------------------------------------------------------"
  );
});
```

The script listens for `DepositFinalized` events on the opBNB Bridge and prints relevant details to the console whenever such an event is emitted by the smart contract.

## Conclusion

By following this tutorial, you have successfully set up a listener for `DepositFinalized` events on the opBNB Bridge. This allows you to capture and analyze deposit events in real time, providing valuable insights into the activity on the opBNB network.

### About the author

Developer Advocate @ Chainstack

BUIDLs on EVM, The Graph protocol, and Starknet

Helping people understand Web3 and blockchain development

[](https://github.com/soos3d) [](https://twitter.com/web3Dav3) [](https://www.linkedin.com/in/davide-zambiasi/)

# opBNB tooling

Source: https://docs.chainstack.com/docs/opbnb-tooling

Find a complete list of available tools below.

## MetaMask

On [node access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**.

## web3.js

Build DApps using [web3.js](https://github.com/web3/web3.js) and opBNB nodes deployed with Chainstack.

Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP.
### HTTP Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number: ```javascript Javascript const {Web3} = require('web3'); const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where `YOUR_CHAINSTACK_ENDPOINT` is your node HTTPS endpoint protected either with the key or password. ## web3.py Build DApps using [web3.py](https://github.com/ethereum/web3.py) and opBNB nodes deployed with Chainstack. Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `HTTPProvider` to connect to your node endpoint and get the latest block number. ```python Key Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.blockNumber) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.blockNumber) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * HOSTNAME — your node HTTPS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). ## ethers.js Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and opBNB nodes deployed with Chainstack. Install [ethers.js](https://www.npmjs.com/package/ethers). Connect over HTTP. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). 
### HTTP

Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number:

```javascript Key Protected
const { ethers } = require("ethers");
var urlInfo = {
    url: 'YOUR_CHAINSTACK_ENDPOINT'
};
var provider = new ethers.JsonRpcProvider(urlInfo.url, NETWORK_ID);
provider.getBlockNumber().then(console.log);
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password
* NETWORK\_ID — opBNB network ID:
  * opBNB Mainnet: `204`

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).

## Hardhat

Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your opBNB nodes.

Install [Hardhat](https://hardhat.org/) and create a project. Create a new environment in `hardhat.config.js`:

```javascript Javascript
require("@nomiclabs/hardhat-waffle");
...
module.exports = {
  solidity: "0.7.3",
  networks: {
    chainstack: {
      url: "YOUR_CHAINSTACK_ENDPOINT",
      accounts: ["YOUR_PRIVATE_KEY"]
    },
  }
};
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials).
* YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract

Run `npx hardhat run scripts/deploy.js --network chainstack` and Hardhat will deploy using Chainstack.

See also [Forking EVM-compatible mainnet with Hardhat](https://support.chainstack.com/hc/en-us/articles/900004242406).

## Remix IDE

To make Remix IDE interact with the network through a Chainstack node:

Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Interacting through MetaMask](#metamask).

In Remix IDE, navigate to the **Deploy** tab.
Select **Injected Provider - MetaMask** in **Environment**. This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node. ## web3.php Build DApps using [web3.php](https://github.com/web3p/web3.php) and opBNB nodes deployed with Chainstack. Install [web3.php](https://github.com/web3p/web3.php). Connect over HTTP: ```php Php
<?php
require_once "vendor/autoload.php";

use Web3\Web3;

$web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT');
?>
``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. Use [JSON-RPC methods](https://eth.wiki/json-rpc/API) to interact with the node. Example to get the latest block number: ```php Php
<?php
require_once "vendor/autoload.php";

use Web3\Web3;

$web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT');
$eth = $web3->eth;

$eth->blockNumber(function ($err, $data) {
    print "$data \n";
});
?>
``` ## Foundry Install [Foundry](https://github.com/gakonst/foundry/). Use `--rpc-url` to run the operation through your Chainstack node. ### Forge Use `forge` to develop, test, and deploy your smart contracts. To deploy a contract: ```shell Shell forge create CONTRACT_NAME --contracts CONTRACT_PATH --private-key YOUR_PRIVATE_KEY --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where * CONTRACT\_NAME — name of the contract in the Solidity source code * CONTRACT\_PATH — path to your smart contract * YOUR\_PRIVATE\_KEY — the private key to your funded account that you will use to deploy the contract * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password ### Cast Use `cast` to interact with the network and the deployed contracts. To get the latest block number: ```shell Shell cast block-number --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where `YOUR_CHAINSTACK_ENDPOINT` is your node HTTPS endpoint protected either with the key or password. # Optimism methods Source: https://docs.chainstack.com/docs/optimism-methods See also [interactive Optimism API call examples](/reference/optimism-api-reference). 
| Method | Availability | Comment | | ---------------------------------------- | --------------------------------------------- | ------- | | eth\_accounts | | | | eth\_blockNumber | | | | eth\_call | | | | eth\_chainId | | | | eth\_estimateGas | | | | eth\_feeHistory | | | | eth\_gasPrice | | | | eth\_getAccount | | | | eth\_simulateV1 | | | | eth\_getBalance | | | | eth\_getBlockByHash | | | | eth\_getBlockByNumber | | | | eth\_getBlockReceipts | | | | eth\_getBlockTransactionCountByHash | | | | eth\_getBlockTransactionCountByNumber | | | | eth\_getCode | | | | eth\_getFilterChanges | | | | eth\_getFilterLogs | | | | eth\_getLogs | | | | eth\_getProof | | | | eth\_getStorageAt | | | | eth\_getTransactionByBlockHashAndIndex | | | | eth\_getTransactionByBlockNumberAndIndex | | | | eth\_getTransactionByHash | | | | eth\_getTransactionCount | | | | eth\_getTransactionReceipt | | | | eth\_getUncleCountByBlockHash | | | | eth\_getUncleCountByBlockNumber | | | | eth\_maxPriorityFeePerGas | | | | eth\_newBlockFilter | | | | eth\_newFilter | | | | eth\_newPendingTransactionFilter | | | | eth\_signTransaction | | | | eth\_subscribe | | | | eth\_syncing | | | | eth\_uninstallFilter | | | | eth\_unsubscribe | | | | eth\_sendRawTransaction | | | | net\_listening | | | | net\_peerCount | | | | net\_version | | | | web3\_clientVersion | | | | web3\_sha3 | | | | erigon\_blockNumber | | | | erigon\_forks | | | | erigon\_getBlockByTimestamp | | | | erigon\_getBlockReceiptsByBlockHash | | | | erigon\_getHeaderByHash | | | | erigon\_getHeaderByNumber | | | | erigon\_getLatestLogs | | | | erigon\_getLogsByHash | | | | debug\_getBadBlocks | | | | debug\_storageRangeAt | | | | debug\_getTrieFlushInterval | | | | debug\_traceBlock | | | | debug\_traceBlockByHash | | | | debug\_traceBlockByNumber | | | | debug\_traceCall | | | | debug\_traceTransaction | | | | trace\_block | | | | trace\_call | | | | trace\_callMany | | | | trace\_filter | | | | trace\_rawTransaction | | | | 
trace\_replayBlockTransactions | | | | trace\_replayTransaction | | | | trace\_transaction | | | | optimism\_outputAtBlock | | | | optimism\_syncStatus | | | | optimism\_rollupConfig | | | | optimism\_version | | | | admin\_addPeer | | | | admin\_addTrustedPeer | | | | admin\_datadir | | | | admin\_exportChain | | | | admin\_importChain | | | | admin\_nodeInfo | | | | admin\_peerEvents | | | | admin\_peers | | | | admin\_removePeer | | | | admin\_removeTrustedPeer | | | | admin\_startHTTP | | | | admin\_startWS | | | | admin\_stopHTTP | | | | admin\_stopWS | | | # Optimism tooling Source: https://docs.chainstack.com/docs/optimism-tooling ## MetaMask On the [node access page](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**. ## Truffle Configure [Truffle Suite](https://truffleframework.com) to deploy contracts to your Optimism nodes. Install [Truffle Suite](https://truffleframework.com), [HD Wallet-enabled Web3 provider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider), and create a project. Create a new environment in `truffle-config.js`, add your mnemonic phrase generated by [a wallet](https://ethereum.org/en/wallets/find-wallet/) and the Optimism endpoint instead of YOUR\_CHAINSTACK\_ENDPOINT: ```javascript Javascript const HDWalletProvider = require("@truffle/hdwallet-provider"); const mnemonic = 'pattern enroll upgrade ...'; ... module.exports = { networks: { chainstack: { provider: () => new HDWalletProvider(mnemonic, "YOUR_CHAINSTACK_ENDPOINT"), network_id: "*" }, } } }; ``` ## Hardhat Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your Optimism nodes. Install [Hardhat](https://hardhat.org/) and create a project. Create a new environment in `hardhat.config.js`: ```javascript Javascript require("@nomiclabs/hardhat-waffle"); ... 
module.exports = { solidity: "0.7.3", networks: { chainstack: { url: "YOUR_CHAINSTACK_ENDPOINT", accounts: ["YOUR_PRIVATE_KEY"] }, } }; ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). * YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract Run `npx hardhat run scripts/deploy.js --network chainstack` and Hardhat will deploy using Chainstack. See also [Forking EVM-compatible mainnet with Hardhat](https://support.chainstack.com/hc/en-us/articles/900004242406). ## Remix IDE To make Remix IDE interact with the network through a Chainstack node: Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Interacting through MetaMask](#metamask). In Remix IDE, navigate to the **Deploy** tab. Select **Injected Provider - MetaMask** in **Environment**. This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node. ## web3.js Build DApps using [web3.js](https://github.com/ethereum/web3.js/) and Optimism nodes deployed with Chainstack. Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP or WebSocket. ### HTTP Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. 
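Under the hood, `getBlockNumber()` issues a JSON-RPC `eth_blockNumber` request and decodes a hex-encoded result. A minimal sketch of that exchange — the response value below is a made-up example, not real chain data:

```javascript JavaScript
// The JSON-RPC payload web3.js sends for eth_blockNumber, and how the
// hex-encoded result decodes to a block number. The response value is a
// made-up example for illustration only.
const request = { jsonrpc: "2.0", method: "eth_blockNumber", params: [], id: 1 };
const response = { jsonrpc: "2.0", id: 1, result: "0x10d4f" }; // example value

// Results are hex quantity strings; decode with base 16.
const blockNumber = parseInt(response.result, 16);
console.log(blockNumber); // 68943
```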
### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node WSS endpoint protected either with the key or password. ## web3.py Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Optimism nodes deployed with Chainstack. Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `HTTPProvider` to connect to your node endpoint and get the latest block number. ```python Key Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.block_number) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s' % ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.block_number) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * HOSTNAME — your node HTTPS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). ### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number. 
```python Key Protected from web3 import Web3 web3 = Web3(Web3.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.block_number) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.WebsocketProvider('wss://%s:%s@%s' % ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.block_number) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password * HOSTNAME — your node WSS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [WebSocket connection to an EVM node](https://support.chainstack.com/hc/en-us/articles/900001918763-WebSocket-connection-to-an-Ethereum-node). ## web3.php Build DApps using [web3.php](https://github.com/web3p/web3.php) and Optimism nodes deployed with Chainstack. Install [web3.php](https://github.com/web3p/web3.php). Connect over HTTP: ```php Php
<?php
require_once "vendor/autoload.php";

use Web3\Web3;

$web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT');
?>
``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. Use [JSON-RPC methods](https://eth.wiki/json-rpc/API) to interact with the node. Example to get the latest block number: ```php Php
<?php
require_once "vendor/autoload.php";

use Web3\Web3;

$web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT');
$eth = $web3->eth;

$eth->blockNumber(function ($err, $data) {
    print "$data \n";
});
?>
``` ## web3j Build DApps using [web3j](https://github.com/web3j/web3j) and Optimism nodes deployed with Chainstack. Use the `HttpService` object to connect to your node endpoint. 
Example to get the latest block number: ```java Java package getLatestBlock; import java.io.IOException; import java.util.logging.Level; import java.util.logging.Logger; import org.web3j.protocol.Web3j; import org.web3j.protocol.core.DefaultBlockParameterName; import org.web3j.protocol.core.methods.response.EthBlock; import org.web3j.protocol.exceptions.ClientConnectionException; import org.web3j.protocol.http.HttpService; import okhttp3.Authenticator; import okhttp3.Credentials; import okhttp3.OkHttpClient; import okhttp3.Request; import okhttp3.Response; import okhttp3.Route; public final class App { private static final String USERNAME = "USERNAME"; private static final String PASSWORD = "PASSWORD"; private static final String ENDPOINT = "ENDPOINT"; public static void main(String[] args) { try { OkHttpClient.Builder clientBuilder = new OkHttpClient.Builder(); clientBuilder.authenticator(new Authenticator() { @Override public Request authenticate(Route route, Response response) throws IOException { String credential = Credentials.basic(USERNAME, PASSWORD); return response.request().newBuilder().header("Authorization", credential).build(); } }); HttpService service = new HttpService(ENDPOINT, clientBuilder.build(), false); Web3j web3 = Web3j.build(service); EthBlock.Block latestBlock = web3.ethGetBlockByNumber(DefaultBlockParameterName.LATEST, false).send().getBlock(); System.out.println("Latest Block: #" + latestBlock.getNumber()); } catch (IOException | ClientConnectionException ex) { Logger.getLogger(App.class.getName()).log(Level.SEVERE, null, ex); } } } ``` where * ENDPOINT — your node HTTPS endpoint * USERNAME — your node access username * PASSWORD — your node access password See also [the full code on GitHub](https://github.com/chainstack/web3j-getLatestBlock). ## ethers.js Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Optimism nodes deployed with Chainstack. Install [ethers.js](https://www.npmjs.com/package/ethers). 
Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number: ```javascript Key Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` ```javascript Password Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT', user: 'USERNAME', password: 'PASSWORD' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) * NETWORK\_ID — Optimism network ID: * Mainnet: `10` * Testnet: `420` See [node access details](/docs/manage-your-node#view-node-access-and-credentials). ### WebSocket Use the `WebSocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const { ethers } = require("ethers"); const provider = new ethers.providers.WebSocketProvider('YOUR_CHAINSTACK_ENDPOINT', NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password * NETWORK\_ID — Optimism network ID: * Mainnet: `10` * Testnet: `420` See [node access details](/docs/manage-your-node#view-node-access-and-credentials). 
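For password-protected endpoints, the `user` and `password` fields above are sent as an HTTP Basic auth header. A hypothetical sketch of how that header is constructed — the credentials are placeholders, not real access details:

```javascript JavaScript
// Build the HTTP Basic auth header that client libraries derive from the
// user/password connection fields. USERNAME and PASSWORD are placeholders.
function basicAuthHeader(username, password) {
  const encoded = Buffer.from(`${username}:${password}`).toString("base64");
  return `Basic ${encoded}`;
}

const header = basicAuthHeader("USERNAME", "PASSWORD");
console.log(header); // Basic VVNFUk5BTUU6UEFTU1dPUkQ=
```

This is the same value okhttp's `Credentials.basic()` produces in the web3j example above.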
## Brownie Install [Brownie](https://eth-brownie.readthedocs.io/en/stable/install.html). Use the `brownie networks add` command with the node endpoint: ```shell Shell brownie networks add Optimism ID name="NETWORK_NAME" host=YOUR_CHAINSTACK_ENDPOINT chainid=NETWORK_ID ``` where * ID — any name that you will use as the network tag to run a deployment. For example, `optimism-testnet`. * NETWORK\_NAME — any name that you want to identify the network by in the list of networks. For example, **Testnet (Chainstack)**. * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint * NETWORK\_ID — Optimism network ID: * Mainnet: `10` * Testnet: `420` Example to run the deployment script: ```shell Shell brownie run deploy.py --network optimism-testnet ``` ## Foundry Install [Foundry](https://github.com/gakonst/foundry/). Use `--rpc-url` to run the operation through your Chainstack node. ### Forge Use [forge](https://github.com/gakonst/foundry/tree/master/forge) to develop, test, and deploy your smart contracts. To deploy a contract: ```shell Shell forge create CONTRACT_NAME --contracts CONTRACT_PATH --private-key YOUR_PRIVATE_KEY --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where * CONTRACT\_NAME — name of the contract in the Solidity source code * CONTRACT\_PATH — path to your smart contract * YOUR\_PRIVATE\_KEY — the private key to your funded account that you will use to deploy the contract * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password ### Cast Use [cast](https://github.com/gakonst/foundry/tree/master/cast) to interact with the network and the deployed contracts. To get the latest block number: ```bash Shell cast block-number --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. ## GraphQL You can use GraphQL on a dedicated node on the [paid plans](https://chainstack.com/pricing/). 
### UI You can query data using the graphical interface. 1. On Chainstack, navigate to your dedicated Optimism node. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). 2. Hover over **GraphQL IDE URL** and click **Open**. 3. In the graphical interface that opens, run a GraphQL query. Example to get the latest block number: ```graphql GraphQL { block { number } } ``` ### Node.js You can build a web app to query data using Node.js and [axios](https://www.npmjs.com/package/axios): ```javascript Javascript const axios = require('axios'); const main = async () => { try { const result = await axios.post( 'YOUR_CHAINSTACK_ENDPOINT', { query: ` { block { number } } ` } ); console.log(result.data); } catch(error) { console.error(error); } } main(); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node GraphQL endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). * `query` — your GraphQL query. In this case, to get the latest block number. # Optimism: Bridge ether from Ethereum L1 to Optimism L2 using the Optimism JavaScript SDK Source: https://docs.chainstack.com/docs/optimism-tutorial-bridge-ether-from-ethereum-l1-to-optimism-l2-using-the-optimism-javascript-sdk **TLDR** * Demonstrates how to bridge ETH from Ethereum Sepolia (L1) to Optimism Sepolia (L2) using the Optimism JavaScript SDK. * Sets up a crossChainMessenger with the user’s private key to deposit ETH onto L2. * Waits for the L2 relay to finalize and shows updated balances on both layers. * Code sample includes environment variable setup and Ethers integration for seamless bridging. ## Main article This tutorial will show you how to use the [Optimism JavaScript SDK](https://sdk.optimism.io/) to bridge ether from L1 to L2. ## Optimism Optimism is a next-generation solution that enhances the Ethereum blockchain by providing a supplementary layer 2 network. 
Optimism streamlines the transaction process on Ethereum, resulting in significantly lower fees and fast execution. The beauty of Optimism lies in its seamless integration with Ethereum—each transaction takes place on the Optimism network, yet its validity is confirmed via the Ethereum blockchain. ## Optimism's inner workings Optimism leverages the breakthrough technology of optimistic rollups, a sophisticated compression technique developed by the team at the Optimism Foundation. Rollups are a new way to scale the Ethereum blockchain and come in two varieties: optimistic rollups and zero-knowledge rollups (ZK rollups). Optimistic rollups streamline the transaction process by taking the bulk of data off-chain, resulting in faster processing times. Despite this off-chain approach, a small amount of data is still recorded on the Ethereum network for security purposes. What sets optimistic rollups apart from other scaling solutions is that they do not require cryptographic proofs to validate off-chain transactions. Instead, they rely on a system of fraud proofs and utilize the Optimistic Virtual Machine (OVM)—a sandboxed environment—to ensure secure and deterministic smart contract execution. The OVM acts as the interface between Layers 1 and 2, much like the Ethereum Virtual Machine (EVM). However, the OVM only executes computation, while the EVM handles all execution. The OVM and EVM work together through the Execution Manager to execute transactions in a virtualized environment. Learn more about the Optimism network in [Optimism blockchain – Ethereum Layer 2 scaling solution](https://chainstack.com/an-overview-of-optimism-and-communication-between-l2-and-l1/) on the Chainstack blog. ## Prerequisites * [Chainstack account](https://console.chainstack.com/user/login) to deploy nodes on the Ethereum Sepolia testnet and Optimism Sepolia testnet. * Node.js ^16.17.0 — [install node.js](https://nodejs.org/en/download/). 
## Dependencies * @eth-optimism/sdk: ^1.10.1 * dotenv: ^16.0.3 * Ethers: ^5.7.2 ## Overview To get from zero to a functioning bridge between Ethereum L1 and Optimism L2, do the following: With Chainstack, create a public chain project. With Chainstack, join the Ethereum Sepolia testnet and Optimism Sepolia testnet. With Chainstack, access your nodes' credentials. Create an npm project. Install the required dependencies. Create a `.env` file to store the secrets. With the Ethers library, create provider and wallet instances. With Optimism JavaScript SDK, bridge ether between L1 and L2. Run the script. ## Step-by-step ### Create a public chain project See [Create a project](/docs/manage-your-project#create-a-project). ### Join the Ethereum Sepolia testnet and Optimism Sepolia testnet See [Join a public network](/docs/manage-your-networks#join-a-public-network). ### Get endpoints for your Ethereum Sepolia testnet and Optimism Sepolia testnet See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials). ### Create an npm project In your project directory, open the terminal and run `npm init`. Answer the questions in the terminal to create a sample `package.json` file for your project. ### Install the required dependencies To install the libraries and tools required for this bridge project, run: ```bash Shell npm i @eth-optimism/sdk ethers dotenv ``` ### Create a `.env` file to store the secrets Create a `.env` file in the root directory of your project and paste and fill in the following: ```env env PRIVATE_KEY="YOUR_WALLET_PRIVATE_KEY" SEPOLIA_CHAINSTACK="YOUR_CHAINSTACK_SEPOLIA_ENDPOINT" OPTIMISM_SEPOLIA_CHAINSTACK="YOUR_CHAINSTACK_OPTIMISM_SEPOLIA_ENDPOINT" ``` ### Use the Ethers library to create provider and wallet instances Create an `index.js` file in your project's root directory. The first step is to import the required packages and create the global variables. 
To do that, at the top of the file, paste the following: ```js JavaScript const ethers = require("ethers") const optimismSDK = require("@eth-optimism/sdk") require('dotenv').config() // Environment variables const SEPOLIA_CHAINSTACK = process.env.SEPOLIA_CHAINSTACK; const OPTIMISM_SEPOLIA_CHAINSTACK = process.env.OPTIMISM_SEPOLIA_CHAINSTACK; const PRIVATE_KEY = process.env.PRIVATE_KEY; // Provider instances const l1Provider = new ethers.providers.JsonRpcProvider(SEPOLIA_CHAINSTACK); const l2Provider = new ethers.providers.JsonRpcProvider(OPTIMISM_SEPOLIA_CHAINSTACK); // Init Signers async function getSigners() { const privateKey = PRIVATE_KEY; const l1Wallet = new ethers.Wallet(privateKey, l1Provider); const l2Wallet = new ethers.Wallet(privateKey, l2Provider); return [l1Wallet, l2Wallet]; } ``` This part of the code does the following: * Imports libraries and tools. * Declares the required constants using the environment variables. * Creates provider instances for L1 (Ethereum Sepolia testnet) and L2 (Optimism Sepolia testnet). * Creates wallet instances. The `getSigners()` function uses the private key to create wallet instances with the Ethers library. These instances will be used to query balances and sign transactions. 
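A missing `.env` value silently produces `undefined` endpoints and a confusing provider error later on. A hypothetical guard (not part of the tutorial) you might add before creating the provider instances:

```javascript JavaScript
// Hypothetical helper: fail fast when a required environment variable is
// missing, instead of passing undefined to JsonRpcProvider.
function requireEnv(name) {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Demo with a stand-in value; in the tutorial you would call
// requireEnv("SEPOLIA_CHAINSTACK") after require('dotenv').config().
process.env.DEMO_ENDPOINT = "https://example.com";
console.log(requireEnv("DEMO_ENDPOINT")); // https://example.com
```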
### Use the Optimism JavaScript SDK to bridge ether between L1 and L2 Create a function to retrieve the networks' chain IDs and a `CrossChainMessenger` instance using the Optimism JavaScript SDK: ```js JavaScript // Get Chain IDs using ethers async function chainIds() { const l1Network= await l1Provider.getNetwork(); const l2Network = await l2Provider.getNetwork(); const l1ChainId = l1Network.chainId const l2ChainId = l2Network.chainId return [l1ChainId, l2ChainId] } // Init crossChainMessenger using the Chain IDs and wallet instances async function initialize() { const [l1Signer, l2Signer] = await getSigners() const [l1ChainId, l2ChainId] = await chainIds() crossChainMessenger = new optimismSDK.CrossChainMessenger({ l1ChainId: l1ChainId, l2ChainId: l2ChainId, l1SignerOrProvider: l1Signer, l2SignerOrProvider: l2Signer }) } ``` The `chainIds()` function uses the Ethers library to query the Chainstack endpoints and retrieve the chain IDs that will then be used to create the `crossChainMessenger` instance. The `initialize()` function uses the `getSigners()` and `chainIds()` functions to retrieve the parameters required to create the `crossChainMessenger` using the Optimism JavaScript SDK. The [crossChainMessenger](https://sdk.optimism.io/classes/crosschainmessenger) instance allows us to interact with the L1 and L2 networks. 
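Since `chainIds()` already fetches both chain IDs, you could sanity-check them before constructing the messenger, so a misconfigured endpoint fails early rather than mid-bridge. A hypothetical check, assuming chain ID `11155111` for Ethereum Sepolia and `11155420` for Optimism Sepolia:

```javascript JavaScript
// Hypothetical sanity check (not part of the tutorial): confirm the endpoints
// point at the expected networks before bridging funds.
// Chain IDs: Ethereum Sepolia = 11155111, Optimism Sepolia = 11155420.
const EXPECTED_CHAIN_IDS = { l1: 11155111, l2: 11155420 };

function checkChainIds(l1ChainId, l2ChainId) {
  if (l1ChainId !== EXPECTED_CHAIN_IDS.l1 || l2ChainId !== EXPECTED_CHAIN_IDS.l2) {
    throw new Error(`Unexpected chain IDs: L1=${l1ChainId}, L2=${l2ChainId}`);
  }
  return true;
}
```

You could call `checkChainIds(l1ChainId, l2ChainId)` inside `initialize()` right after `chainIds()` resolves.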
### Function to retrieve the wallet balances Add the following function to retrieve the wallet balances and display them in the console: ```js JavaScript // Display balances from L1 and L2 using the Optimism SDK crossChainMessenger provider instance async function showBalances() { const l1Balance = (await crossChainMessenger.l1Signer.getBalance()).toString(); const l2Balance = (await crossChainMessenger.l2Signer.getBalance()).toString(); console.log(`Balance on L1: ${ethers.utils.formatEther(l1Balance).slice(0,-14)} ETH`); console.log(`Balance on L2: ${ethers.utils.formatEther(l2Balance).slice(0,-14)} ETH`); console.log("----------------------------------"); } ``` The `console.log` statements with the `-` are there only to make the response in the console more legible. ### Function to transfer ether between L1 and L2 Now you can create a function to bridge the ether between L1 and L2: ```js JavaScript // Transfer an amount of ether from L1 to L2 async function bridgeEth() { const wei = BigInt(100000000000000000); // 0.1 ETH in Wei console.log("Fetching current balances..."); await showBalances(); console.log("Initiating ETH transfer from L1 to L2..."); const depositResponse = await crossChainMessenger.depositETH(wei); console.log(`Transaction hash for deposit from L1 to L2: ${depositResponse.hash}`); console.log(`See on Sepolia Etherscan: https://sepolia.etherscan.io/tx/${depositResponse.hash}`); await depositResponse.wait(); console.log("Waiting for deposit transaction to be relayed..."); console.log("----------------------------------"); await crossChainMessenger.waitForMessageStatus( depositResponse.hash, optimismSDK.MessageStatus.RELAYED ); console.log("ETH transfer from L1 to L2 is complete."); console.log("Updating current balances..."); console.log("----------------------------------"); await showBalances(); } ``` Note that the `const wei` holds the amount that will be transferred, expressed in the Wei unit. 
The default for this script is set to 0.1 ether, equivalent to 100,000,000,000,000,000 Wei. You can use a [Wei converter](https://eth-converter.com/) to include other amounts, or you can use the following: * 1 ether = 1,000,000,000,000,000,000 Wei * 0.1 ether = 100,000,000,000,000,000 Wei * 0.01 ether = 10,000,000,000,000,000 Wei * 0.001 ether = 1,000,000,000,000,000 Wei The `bridgeEth()` function transfers the amount of ether specified in the `wei` constant from Ethereum (L1) to Optimism (L2). It uses the [depositETH](https://sdk.optimism.io/classes/crosschainmessenger#depositETH-2) method of the `crossChainMessenger` instance. It then waits for the `MessageStatus` to become `RELAYED`. The `RELAYED` status indicates that a message has been successfully transmitted from one network to another and is being processed on the recipient network. ### Message statuses Find the different message statuses in the [Optimism JavaScript SDK docs](https://sdk.optimism.io/enums/messagestatus). After that, the `bridgeEth()` function gives some updates and retrieves the balances again. 
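The conversions above can also be computed in plain JavaScript with `BigInt`, so you do not need a converter site to set the `wei` constant. A hypothetical helper (`toWei` is not part of the tutorial), using the fact that 1 ether = 10^18 Wei:

```javascript JavaScript
// Hypothetical helper: convert a decimal ether string to Wei using only
// BigInt arithmetic (1 ether = 10^18 Wei).
function toWei(ether) {
  const [whole, frac = ""] = ether.split(".");
  // Right-pad the fractional part to exactly 18 digits.
  const fracPadded = (frac + "0".repeat(18)).slice(0, 18);
  return BigInt(whole) * 10n ** 18n + BigInt(fracPadded);
}

console.log(toWei("0.1"));   // 100000000000000000n
console.log(toWei("0.001")); // 1000000000000000n
```

With this, the bridge amount could be written as `const wei = toWei("0.1");` instead of a hardcoded literal.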
At the bottom of the file, add the `main` function and call it: ```js JavaScript // Main function async function main() { await initialize() await bridgeEth() } // Run the main function and catch any error main().then(() => process.exit(0)) .catch((error) => { console.error(error) process.exit(1) }) ``` At this point, the entire code will look like this: ```js JavaScript const ethers = require("ethers") const optimismSDK = require("@eth-optimism/sdk") require('dotenv').config() // Environment variables const SEPOLIA_CHAINSTACK = process.env.SEPOLIA_CHAINSTACK; const OPTIMISM_SEPOLIA_CHAINSTACK = process.env.OPTIMISM_SEPOLIA_CHAINSTACK; const PRIVATE_KEY = process.env.PRIVATE_KEY; // Provider instances const l1Provider = new ethers.providers.JsonRpcProvider(SEPOLIA_CHAINSTACK); const l2Provider = new ethers.providers.JsonRpcProvider(OPTIMISM_SEPOLIA_CHAINSTACK); // Init Signers async function getSigners() { const privateKey = PRIVATE_KEY; const l1Wallet = new ethers.Wallet(privateKey, l1Provider); const l2Wallet = new ethers.Wallet(privateKey, l2Provider); return [l1Wallet, l2Wallet]; } // Get Chain IDs async function chainIds() { const l1Network = await l1Provider.getNetwork(); const l2Network = await l2Provider.getNetwork(); const l1ChainId = l1Network.chainId const l2ChainId = l2Network.chainId return [l1ChainId, l2ChainId] } // Init crossChainMessenger using the Chain IDs and wallet instances async function initialize() { const [l1Signer, l2Signer] = await getSigners() const [l1ChainId, l2ChainId] = await chainIds() crossChainMessenger = new optimismSDK.CrossChainMessenger({ l1ChainId: l1ChainId, l2ChainId: l2ChainId, l1SignerOrProvider: l1Signer, l2SignerOrProvider: l2Signer }) } // Display balances from L1 and L2 async function showBalances() { const l1Balance = (await crossChainMessenger.l1Signer.getBalance()).toString(); const l2Balance = (await crossChainMessenger.l2Signer.getBalance()).toString(); console.log(`Balance on L1: 
${ethers.utils.formatEther(l1Balance).slice(0,-14)} ETH`); console.log(`Balance on L2: ${ethers.utils.formatEther(l2Balance).slice(0,-14)} ETH`); console.log("----------------------------------"); } // Transfer an amount of ether from L1 to L2 async function bridgeEth() { const wei = BigInt(100000000000000000); // 0.1 ETH in Wei console.log("Fetching current balances..."); await showBalances(); console.log("Initiating ETH transfer from L1 to L2..."); const depositResponse = await crossChainMessenger.depositETH(wei); console.log(`Transaction hash for deposit from L1 to L2: ${depositResponse.hash}`); console.log(`See on Sepolia Etherscan: https://sepolia.etherscan.io/tx/${depositResponse.hash}`); await depositResponse.wait(); console.log("Waiting for deposit transaction to be relayed..."); console.log("----------------------------------"); await crossChainMessenger.waitForMessageStatus( depositResponse.hash, optimismSDK.MessageStatus.RELAYED ); console.log("ETH transfer from L1 to L2 is complete."); console.log("Updating current balances..."); console.log("----------------------------------"); await showBalances(); } // Main function async function main() { await initialize() await bridgeEth() } // Run the main function main().then(() => process.exit(0)) .catch((error) => { console.error(error) process.exit(1) }) ``` ### Run the script Now it's time to run the script and bridge some ether from L1 to L2. To do this, you will need some Sepolia ether in your wallet. To get testnet ether, you can use the following faucet: * [Chainstack's Sepolia faucet](https://faucet.chainstack.com) Once you have received some Sepolia ether, pick the amount you want to send and update the `wei` constant. To start the script, run the following command: ```bash Shell node index ``` The console will log all of the steps and it will look similar to the following: ```bash Shell Fetching current balances... 
Balance on L1: 9.7844 ETH Balance on L2: 0.9314 ETH ---------------------------------- Initiating ETH transfer from L1 to L2... Transaction hash for deposit from L1 to L2: 0x97455a64eb1c496f4ecc937ffcf2d9294228d9658504a16ab9dbfa638d32693a See on Sepolia Etherscan: https://sepolia.etherscan.io/tx/0x97455a64eb1c496f4ecc937ffcf2d9294228d9658504a16ab9dbfa638d32693a Waiting for deposit transaction to be relayed... ---------------------------------- ETH transfer from L1 to L2 is complete. Updating current balances... ---------------------------------- Balance on L1: 9.6842 ETH Balance on L2: 1.0314 ETH ---------------------------------- ``` As you can see, it prints the transaction hash and the link to check the transaction details using Sepolia Etherscan. See the details of one of the already [completed transactions on Etherscan](https://goerli.etherscan.io/tx/0x97455a64eb1c496f4ecc937ffcf2d9294228d9658504a16ab9dbfa638d32693a). ## Conclusion This tutorial guided you through creating a basic L1 => L2 bridge using the Optimism JavaScript SDK. ### About the author Developer Advocate @ Chainstack BUIDLs on EVM, The Graph protocol, and Starknet Helping people understand Web3 and blockchain development [](https://github.com/soos3d) [](https://twitter.com/web3Dav3) [](https://www.linkedin.com/in/davide-zambiasi/) # Introduction Source: https://docs.chainstack.com/docs/platform-introduction Chainstack is the leading Web3 infrastructure provider for top chains, with support for Subgraphs and add-ons like the Solana Geyser plugin. Topping up with crypto is also an option at [Billing](https://console.chainstack.com/user/settings/billing). We support topping up with [150+ cryptocurrencies](https://nowpayments.io/supported-coins). To submit a request for a new product capability, protocol support, or other improvements, visit our [ideas portal](https://ideas.chainstack.com/). For assistance, reach out to [Chainstack support](https://support.chainstack.com). 
## Supported protocols

### Public networks

## Chainstack faucets

## Supported cloud providers

See also [Available clouds, regions, and locations](/docs/nodes-clouds-regions-and-locations).

## Supported hosting options

Chainstack also supports private hosting options. They are as follows:

* **Chainstack-managed hosting** — deployment to the Chainstack-managed cloud infrastructure. Requires no additional setup.
* **Hybrid hosting** — deployment to a self-managed infrastructure. Requires infrastructure setup. See [Hybrid hosting](/docs/hybrid-hosting).

# Polygon methods

Source: https://docs.chainstack.com/docs/polygon-methods

See also [interactive Polygon API call examples](/reference/polygon-getting-started).

| Method | Availability | Comment |
| ---------------------------------------- | --------------------------------------------- | ------- |
| eth\_accounts | | |
| eth\_blockNumber | | |
| eth\_call | | |
| eth\_chainId | | |
| eth\_estimateGas | | |
| eth\_feeHistory | | |
| eth\_gasPrice | | |
| eth\_getAccount | | |
| eth\_getBalance | | |
| eth\_getBlockByHash | | |
| eth\_getBlockByNumber | | |
| eth\_getBlockReceipts | | |
| eth\_getBlockTransactionCountByHash | | |
| eth\_getBlockTransactionCountByNumber | | |
| eth\_getCode | | |
| eth\_getFilterChanges | | |
| eth\_getFilterLogs | | |
| eth\_getLogs | | |
| eth\_getProof | | |
| eth\_getStorageAt | | |
| eth\_getTransactionByBlockHashAndIndex | | |
| eth\_getTransactionByBlockNumberAndIndex | | |
| eth\_getTransactionByHash | | |
| eth\_getTransactionCount | | |
| eth\_getTransactionReceipt | | |
| eth\_getUncleCountByBlockHash | | |
| eth\_getUncleCountByBlockNumber | | |
| eth\_maxPriorityFeePerGas | | |
| eth\_newBlockFilter | | |
| eth\_newFilter | | |
| eth\_newPendingTransactionFilter | | |
| eth\_signTransaction | | |
| eth\_subscribe | | |
| eth\_syncing | | |
| eth\_uninstallFilter | | |
| eth\_unsubscribe | | |
| eth\_sendRawTransaction | | |
| net\_listening | | |
| net\_peerCount | | |
| net\_version | | |
| txpool\_content | | |
| txpool\_inspect | | |
| txpool\_contentFrom | | |
| txpool\_status | | |
| web3\_clientVersion | | |
| web3\_sha3 | | |
| erigon\_blockNumber | | |
| erigon\_forks | | |
| erigon\_getBlockByTimestamp | | |
| erigon\_getBlockReceiptsByBlockHash | | |
| erigon\_getHeaderByHash | | |
| erigon\_getHeaderByNumber | | |
| erigon\_getLatestLogs | | |
| erigon\_getLogsByHash | | |
| debug\_getBadBlocks | | |
| debug\_storageRangeAt | | |
| debug\_getTrieFlushInterval | | |
| debug\_traceBlock | | |
| debug\_traceBlockByHash | | |
| debug\_traceBlockByNumber | | |
| debug\_traceCall | | |
| debug\_traceTransaction | | |
| trace\_block | | |
| trace\_call | | |
| trace\_callMany | | |
| trace\_filter | | |
| trace\_rawTransaction | | |
| trace\_replayBlockTransactions | | |
| trace\_replayTransaction | | |
| trace\_transaction | | |
| bor\_getAuthor | | |
| bor\_getCurrentProposer | | |
| bor\_getCurrentValidators | | |
| bor\_getRootHash | | |
| bor\_getSignersAtHash | | |
| eth\_getRootHash | | |
| admin\_addPeer | | |
| admin\_addTrustedPeer | | |
| admin\_datadir | | |
| admin\_exportChain | | |
| admin\_importChain | | |
| admin\_nodeInfo | | |
| admin\_peerEvents | | |
| admin\_peers | | |
| admin\_removePeer | | |
| admin\_removeTrustedPeer | | |
| admin\_startHTTP | | |
| admin\_startWS | | |
| admin\_stopHTTP | | |
| admin\_stopWS | | |

# Polygon tooling

Source: https://docs.chainstack.com/docs/polygon-tooling

## Bor

Interact with your Polygon node using [Bor](https://github.com/maticnetwork/bor).

Install [Bor](https://github.com/maticnetwork/bor).

Use the `bor attach` command with the node endpoint:

```shell Shell
bor attach YOUR_CHAINSTACK_ENDPOINT
```

where YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials).

Invoke any methods from [Web3 JavaScript API](https://web3js.readthedocs.io/).
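As a side note on units: the `web3.fromWei` conversion used in the console example in this section is plain base-10^18 arithmetic. A minimal sketch of that conversion in standalone JavaScript, using `BigInt` to avoid floating-point precision loss on large balances:

```javascript
// 1 ether = 10^18 wei
const WEI_PER_ETHER = 10n ** 18n;

const fromWei = (wei) => {
  const whole = wei / WEI_PER_ETHER;
  // Remainder, left-padded to 18 decimal places
  const frac = (wei % WEI_PER_ETHER).toString().padStart(18, '0');
  // Trim trailing zeros and a dangling decimal point
  return `${whole}.${frac}`.replace(/0+$/, '').replace(/\.$/, '');
};

console.log(fromWei(642538078574759898951277n)); // 642538.078574759898951277
```

This reproduces the value shown in the console example; in practice, let the library do the conversion for you.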
Example below demonstrates how to get the balance of an address in wei value and convert it to ether value: ```js JavaScript > web3.fromWei(web3.eth.getBalance("0xde0b295669a9fd93d5f28d9ec85e40f4cb697bae")) 642538.078574759898951277 ``` You can also use the [Bor client Docker container](https://hub.docker.com/r/maticnetwork/bor). ## GraphQL You can use GraphQL on a dedicated node on the [paid plans](https://chainstack.com/pricing/). ### UI You can query data using the graphical interface. On Chainstack, navigate to your dedicated Polygon node. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). Hover over **GraphQL IDE URL** and click **Open**. In the graphical interface that opens, run a GraphQL query. Example to get the latest block number: ```graphql GraphQL { block { number } } ``` ### Node.js You can build a web app to query data using node.js and [axios](https://www.npmjs.com/package/axios): ```javascript Javascript const axios = require('axios'); const main = async () => { try { const result = await axios.post( 'YOUR_CHAINSTACK_ENDPOINT', { query: ` { block { number } } ` } ); console.log(result.data); } catch(error) { console.error(error); } } main(); ``` * YOUR\_CHAINSTACK\_ENDPOINT — your node GraphQL endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). * `query` — your GraphQL query. In this case, to get the latest block number. See also [Using GraphQL with EVM-compatible nodes](https://support.chainstack.com/hc/en-us/articles/4409604331161-Using-GraphQL-with-EVM-compatible-nodes). ## MetaMask On [node access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**. ## Truffle Configure [Truffle Suite](https://truffleframework.com) to deploy contracts to your Polygon nodes. 
Install [Truffle Suite](https://truffleframework.com), [HD Wallet-enabled Web3 provider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider), and create a project.

Create a new environment in `truffle-config.js`, add your mnemonic phrase generated by [a wallet](https://ethereum.org/en/wallets/find-wallet/), and add the Polygon endpoint instead of YOUR\_CHAINSTACK\_ENDPOINT:

```javascript Javascript
const HDWalletProvider = require("@truffle/hdwallet-provider");
const mnemonic = 'pattern enroll upgrade ...';
...
module.exports = {
  networks: {
    chainstack: {
      provider: () => new HDWalletProvider(mnemonic, "YOUR_CHAINSTACK_ENDPOINT"),
      network_id: "*"
    },
  }
};
```

## Hardhat

Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your Polygon nodes.

Install [Hardhat](https://hardhat.org/) and create a project.

Create a new environment in `hardhat.config.js`:

```javascript Javascript
require("@nomiclabs/hardhat-waffle");
...
module.exports = {
  solidity: "0.7.3",
  networks: {
    chainstack: {
      url: "YOUR_CHAINSTACK_ENDPOINT",
      accounts: ["YOUR_PRIVATE_KEY"]
    },
  }
};
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials).
* YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract.

Run `npx hardhat run scripts/deploy.js --network chainstack` and Hardhat will deploy using Chainstack.

See also [Forking EVM-compatible mainnet with Hardhat](https://support.chainstack.com/hc/en-us/articles/900004242406).

## Remix IDE

To make Remix IDE interact with the network through a Chainstack node:

Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Interacting through MetaMask](#metamask).

In Remix IDE, navigate to the **Deploy** tab.

Select **Injected Provider - MetaMask** in **Environment**.
This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node. ## web3.js Build DApps using [web3.js](https://github.com/ethereum/web3.js/) and Polygon nodes deployed with Chainstack. Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP or WebSocket. ### HTTP Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password. ### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node WSS endpoint protected either with the key or password. ## web3.py Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Polygon nodes deployed with Chainstack. Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `HTTPProvider` to connect to your node endpoint and get the latest block number. 
```python Key Protected
from web3 import Web3

web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT'))
print(web3.eth.blockNumber)
```

```python Password Protected
from web3 import Web3

web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME")))
print(web3.eth.blockNumber)
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password
* HOSTNAME — your node HTTPS endpoint hostname
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).

### WebSocket

Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number.

```python Key Protected
from web3 import Web3

web3 = Web3(Web3.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT'))
print(web3.eth.blockNumber)
```

```python Password Protected
from web3 import Web3

web3 = Web3(Web3.WebsocketProvider('wss://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME")))
print(web3.eth.blockNumber)
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password
* HOSTNAME — your node WSS endpoint hostname
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)

See also [WebSocket connection to an EVM node](https://support.chainstack.com/hc/en-us/articles/900001918763-WebSocket-connection-to-an-Ethereum-node).

## web3.php

Build DApps using [web3.php](https://github.com/web3p/web3.php) and Polygon nodes deployed with Chainstack.

Install [web3.php](https://github.com/web3p/web3.php).

Connect over HTTP:

```php Php
<?php
require_once "vendor/autoload.php";

use Web3\Web3;

$web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT');
?>
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password.

Use [JSON-RPC methods](https://eth.wiki/json-rpc/API) to interact with the node.
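Whichever client library you use, each of these calls puts the same JSON-RPC 2.0 envelope on the wire. A sketch of the payload an `eth_blockNumber` request sends (field values per the JSON-RPC 2.0 specification):

```javascript
// The JSON-RPC 2.0 envelope; every library in this section builds
// a payload of this shape and POSTs it to the node endpoint.
const request = {
  jsonrpc: '2.0',          // protocol version, always "2.0"
  method: 'eth_blockNumber',
  params: [],              // eth_blockNumber takes no parameters
  id: 1,                   // client-chosen ID, echoed back in the response
};

console.log(JSON.stringify(request));
```

The node replies with a matching `{"jsonrpc":"2.0","id":1,"result":"0x..."}` object, which the library unwraps for you.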
Example to get the latest block number:

```php Php
<?php
require_once "vendor/autoload.php";

use Web3\Web3;

$web3 = new Web3('YOUR_CHAINSTACK_ENDPOINT');
$eth = $web3->eth;

$eth->blockNumber(function ($err, $data) {
    print "$data \n";
});
?>
```

## web3j

Build DApps using [web3j](https://github.com/web3j/web3j) and Polygon nodes deployed with Chainstack.

Use the `HttpService` object to connect to your node endpoint.

Example to get the latest block number:

```java Java
package getLatestBlock;

import java.io.IOException;
import java.util.logging.Level;
import java.util.logging.Logger;

import org.web3j.protocol.Web3j;
import org.web3j.protocol.core.DefaultBlockParameterName;
import org.web3j.protocol.core.methods.response.EthBlock;
import org.web3j.protocol.exceptions.ClientConnectionException;
import org.web3j.protocol.http.HttpService;

import okhttp3.Authenticator;
import okhttp3.Credentials;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;
import okhttp3.Route;

public final class App {

  private static final String USERNAME = "USERNAME";
  private static final String PASSWORD = "PASSWORD";
  private static final String ENDPOINT = "ENDPOINT";

  public static void main(String[] args) {
    try {
      OkHttpClient.Builder clientBuilder = new OkHttpClient.Builder();
      clientBuilder.authenticator(new Authenticator() {
        @Override
        public Request authenticate(Route route, Response response) throws IOException {
          String credential = Credentials.basic(USERNAME, PASSWORD);
          return response.request().newBuilder().header("Authorization", credential).build();
        }
      });
      HttpService service = new HttpService(ENDPOINT, clientBuilder.build(), false);
      Web3j web3 = Web3j.build(service);
      EthBlock.Block latestBlock = web3.ethGetBlockByNumber(DefaultBlockParameterName.LATEST, false).send().getBlock();
      System.out.println("Latest Block: #" + latestBlock.getNumber());
    } catch (IOException | ClientConnectionException ex) {
      Logger.getLogger(App.class.getName()).log(Level.SEVERE, null, ex);
    }
  }
}
```

where

* ENDPOINT — your node HTTPS endpoint protected either with the key or password
* USERNAME — your
node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [the full code on GitHub](https://github.com/chainstack/web3j-getLatestBlock). ## ethers.js Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Polygon nodes deployed with Chainstack. Install [ethers.js](https://www.npmjs.com/package/ethers). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number: ```javascript Key Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` ```javascript Password Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT', user: 'USERNAME', password: 'PASSWORD' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) * NETWORK\_ID — Polygon network ID: * Mainnet: `137` * Mumbai testnet: `80001` See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). 
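The NETWORK\_ID values above are decimal chain IDs; over the wire, the node's `eth_chainId` call returns the same value as a 0x-prefixed hex quantity. A quick plain-JavaScript sketch of the correspondence:

```javascript
// Polygon network IDs from the list above, in decimal form
const POLYGON_NETWORKS = { mainnet: 137, mumbai: 80001 };

// eth_chainId returns the same value as a 0x-prefixed hex string
const toHexChainId = (id) => '0x' + id.toString(16);

console.log(toHexChainId(POLYGON_NETWORKS.mainnet)); // 0x89
console.log(toHexChainId(POLYGON_NETWORKS.mumbai));  // 0x13881
```

This is handy when comparing a wallet's reported chain ID (hex) against the decimal IDs used in tooling configs.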
### WebSocket

Use the `WebSocketProvider` object to connect to your node WSS endpoint and get the latest block number:

```javascript Javascript
const { ethers } = require("ethers");

const provider = new ethers.providers.WebSocketProvider('YOUR_CHAINSTACK_ENDPOINT', NETWORK_ID);
provider.getBlockNumber().then(console.log);
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password
* NETWORK\_ID — Polygon network ID:
  * Mainnet: `137`
  * Mumbai testnet: `80001`

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).

## Brownie

Install [Brownie](https://eth-brownie.readthedocs.io/en/stable/install.html).

Use the `brownie networks add` command with the node endpoint:

```shell Shell
brownie networks add Polygon ID name="NETWORK_NAME" host=YOUR_CHAINSTACK_ENDPOINT chainid=NETWORK_ID
```

where

* ID — any name that you will use as the network tag to run a deployment. For example, `chainstack-mainnet`.
* NETWORK\_NAME — any name that you want to identify the network by in the list of networks. For example, **Mainnet (Chainstack)**.
* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password
* NETWORK\_ID — Polygon network ID:
  * Mainnet: `137`
  * Mumbai testnet: `80001`

Example to run the deployment script:

```shell Shell
brownie run deploy.py --network chainstack-mainnet
```

## Foundry

Install [Foundry](https://github.com/gakonst/foundry/).

Use `--rpc-url` to run the operation through your Chainstack node.

### Forge

Use [forge](https://github.com/gakonst/foundry/tree/master/forge) to develop, test, and deploy your smart contracts.
To deploy a contract: ```shell Shell forge create CONTRACT_NAME --contracts CONTRACT_PATH --private-key YOUR_PRIVATE_KEY --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where * CONTRACT\_NAME — name of the contract in the Solidity source code * CONTRACT\_PATH — path to your smart contract * YOUR\_PRIVATE\_KEY — the private key to your funded account that you will use to deploy the contract * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password ### Cast Use [cast](https://github.com/gakonst/foundry/tree/master/cast) to interact with the network and the deployed contracts. To get the latest block number: ```shell Shell cast block-number --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password # Polygon: Bridging ERC-20 from Ethereum to Polygon Source: https://docs.chainstack.com/docs/polygon-tutorial-bridging-erc20-from-ethereum-to-polygon ## Deprecation notice As Goerli & Mumbai testnets have been deprecated, this guide is for historical reference. The Polygon mainnet is an L2 commit chain to the Ethereum mainnet. Bridging your existing Ethereum smart contract to the Polygon commit chain allows network users to move their assets based on your contract between an Ethereum network and a Polygon commit chain. In this tutorial, you will: * Deploy an ERC-20 smart contract on the Ethereum [Goerli testnet](/docs/protocols-networks). * Deploy a compatible smart contract on the Polygon [Mumbai testnet](/docs/protocols-networks). * Map the Ethereum smart contract to the Polygon smart contract. ## Prerequisites * [Chainstack account](https://console.chainstack.com/) to deploy an Ethereum node and a Polygon node. * [Truffle Suite](https://trufflesuite.com/) to create and deploy contracts. 
* [OpenZeppelin Contracts](https://docs.openzeppelin.com/contracts/4.x/) to use the audited [ERC-20 libraries](https://docs.openzeppelin.com/contracts/4.x/erc20) to create your ERC-20 contract. ## Overview To get from zero to a deployed ERC-20 contract on the Ethereum Goerli testnet and bridge it to the Polygon Mumbai testnet, do the following: With Chainstack, create a public chain project. With Chainstack, join the Ethereum Goerli testnet. In the same project, join the Polygon Mumbai testnet. With Chainstack, access your Ethereum node and Polygon node endpoints. With OpenZeppelin, create an ERC-20 contract. With Truffle, compile and deploy the contract through your Ethereum node. With Truffle, compile and deploy a Polygon contract through your Polygon node. Submit a mapping request to bridge the deployed Ethereum contract to the deployed Polygon contract. ## Step-by-step ### Create a public chain project See [Create a project](/docs/manage-your-project#create-a-project). ### Join the Ethereum Goerli testnet and the Polygon Mumbai testnet See [Join a public network](/docs/manage-your-networks#join-a-public-network). ### Get your Ethereum node and Polygon node access and credentials See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials). ### Install OpenZeppelin Contracts See [OpenZeppelin Contracts](https://docs.openzeppelin.com/contracts/4.x/). ### Install Truffle Suite See [Truffle Suite: Installation](https://trufflesuite.com/docs/truffle/how-to/install/). ### Create the root Ethereum ERC-20 contract On your machine, in the contract directory, initialize Truffle: ```bash Shell truffle init ``` This will generate the Truffle boilerplate structure: ```bash Shell . ├── contracts │ └── .gitkeep ├── migrations │ └── .gitkeep ├── test │ └── .gitkeep └── truffle-config.js ``` Go to the `contracts` directory. In the directory, create your ERC-20 contract: `myL2token.sol`. 
```solidity solidity
//SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

import "@openzeppelin/contracts/token/ERC20/ERC20.sol";

contract myL2token is ERC20 {
    constructor(uint256 initialSupply) ERC20("myL2token", "ML2T") {
        _mint(msg.sender, initialSupply);
    }
}
```

This is a standard [OpenZeppelin ERC-20 preset contract](https://docs.openzeppelin.com/contracts/4.x/erc20).

Create `2_deploy_contracts.js` in the `migrations` directory.

```js JavaScript
var myL2token = artifacts.require("./myL2token.sol");

module.exports = function(deployer) {
  deployer.deploy(myL2token, 100);
};
```

This will create the instructions for Truffle to deploy the contract with the supply of `100 ML2T` tokens.

### Compile and deploy the root Ethereum ERC-20 contract

Install `HDWalletProvider`. [HDWalletProvider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider) is Truffle's separate npm package used to sign transactions.

Run:

```bash Shell
npm install @truffle/hdwallet-provider
```

Edit `truffle-config.js` to add:

* `HDWalletProvider`
* Your Ethereum node access and credentials
* Your Ethereum account that you will use to deploy the contract

```js JavaScript
const HDWalletProvider = require("@truffle/hdwallet-provider");
const private_key = 'PRIVATE_KEY';

module.exports = {
  networks: {
    goerli: {
      provider: () => new HDWalletProvider(private_key, "YOUR_CHAINSTACK_ENDPOINT"),
      network_id: 5
    }
  },
  compilers: {
    solc: {
      version: "0.8.1",
    }
  }
};
```

where

* `goerli` — any network name that you will pass to the `truffle migrate --network` command.
* `HDWalletProvider` — Truffle's custom provider to sign transactions.
* PRIVATE\_KEY — the private key of your Ethereum account that will deploy the contract.
* YOUR\_CHAINSTACK\_ENDPOINT — your Chainstack node endpoint. See also [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials) and [Ethereum tooling](/docs/ethereum-tooling).
* `network_id` — the network ID of the Ethereum Goerli testnet: `5`.
* `solc` — the Solidity compiler version that Truffle must use.

Run:

```bash Shell
truffle migrate --network goerli
```

This will engage `2_deploy_contracts.js` and deploy the contract to the Ethereum Goerli testnet as specified in `truffle-config.js`.

### Verify your root Ethereum ERC-20 contract on Etherscan

Once your contract is deployed, you can view it online at [Etherscan](https://goerli.etherscan.io/).

Before you submit a mapping request to bridge your root Ethereum ERC-20 contract to the Polygon commit chain, you must verify the contract on Etherscan.

Flatten your contract code. Since your ERC-20 contract uses imported OpenZeppelin libraries, you must put all the imports into one `.sol` file so that Etherscan can verify it.

Install [Truffle Flattener](https://www.npmjs.com/package/truffle-flattener).

In the `contracts` directory, run:

```bash Shell
npx truffle-flattener myL2token.sol > flatmyL2token.sol
```

Clean up the licensing information. The flattened contract will have the same licensing note imported from each of the files. Multiple licensing notes in one file break the Etherscan verification, so you have to leave one licensing note for the entirety of the flattened contract. The easiest way to clean up is to search for the `SPDX` mentions in the file and remove all of them except for the very first one.

Verify the deployed contract on Etherscan. At this point, you have your flattened and cleaned-up contract ready for the Etherscan verification.

Go to [Etherscan](https://goerli.etherscan.io/). Find your deployed contract. The address of your contract should have been printed by Truffle at the end of the deployment in the contract address field.

On the contract page on Etherscan, click **Contract** > **Verify and Publish**.

In **Compiler Type**, select **Solidity (Single file)**.

In **Compiler Version**, select **v0.8.1**.
This is the version this tutorial used to compile the contract.

In **Open Source License Type**, select **MIT License (MIT)**.

Click **Continue**.

Keep the **Optimization** option set to **No** as Truffle does not use optimization by default.

Paste the entirety of your flattened `.sol` contract in the **Enter the Solidity Contract Code below** field.

Click **Verify and Publish**.

Etherscan will take a few seconds to compile your contract, verify, and publish it.

### Create the child Polygon ERC-20 contract

Go to the `contracts` directory. In the directory, put the default [child ERC-20 contract](https://github.com/maticnetwork/pos-portal/blob/master/flat/ChildERC20.sol) provided by Polygon.

Create `2_deploy_contracts.js` in the `migrations` directory.

```js JavaScript
var ChildERC20 = artifacts.require("./ChildERC20.sol");

module.exports = function(deployer) {
  deployer.deploy(ChildERC20, 'myL2tokenChild', 'ML2T', 18, '0x2e5e27d50EFa501D90Ad3638ff8441a0C0C0d75e');
};
```

where

* `myL2tokenChild` — the name of your ERC-20 token
* `ML2T` — the symbol of your ERC-20 token
* `18` — the default decimals number as used by the [OpenZeppelin ERC-20 preset contract](https://docs.openzeppelin.com/contracts/4.x/erc20)
* `0x2e5e27d50EFa501D90Ad3638ff8441a0C0C0d75e` — the [ChildChainManager](https://wiki.polygon.technology/docs/develop/ethereum-polygon/pos/getting-started/#steps-to-use-the-pos-bridge) address on the Polygon Mumbai testnet.

For the ChildChainManager contract addresses, look online for the addresses provided by Polygon:

* [Mainnet addresses](https://github.com/maticnetwork/static/blob/master/network/mainnet/v1/index.json)

### Compile and deploy the child Polygon ERC-20 contract

Clean up the environment by moving `myL2token.sol` and `flatmyL2token.sol` to a backup directory so that Truffle does not pick them up for deployment.

1.
Edit `truffle-config.js` to change to:

* Your Polygon node access and credentials
* Your Polygon account that you will use to deploy the contract
* The Solidity compiler version used by the default [child ERC-20 contract template](https://github.com/maticnetwork/pos-portal/blob/master/flat/ChildERC20.sol) provided by Polygon

```js JavaScript
const HDWalletProvider = require("@truffle/hdwallet-provider");
const private_key = 'PRIVATE_KEY';

module.exports = {
  networks: {
    mumbai: {
      provider: () => new HDWalletProvider(private_key, "YOUR_CHAINSTACK_ENDPOINT"),
      network_id: 80001
    }
  },
  compilers: {
    solc: {
      version: "0.6.6",
    }
  }
};
```

where

* `mumbai` — any network name that you will pass to the `truffle migrate --network` command.
* `HDWalletProvider` — Truffle's custom provider to sign transactions.
* PRIVATE\_KEY — the private key of your Polygon account that will deploy the contract.
* YOUR\_CHAINSTACK\_ENDPOINT — your Chainstack node endpoint. See also [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials) and [Polygon tooling](/docs/polygon-tooling).
* `network_id` — the network ID of the Polygon network: testnet is `80001`, mainnet is `137`.
* `solc` — the Solidity compiler version that Truffle must use. OpenZeppelin contracts have a higher version Solidity compiler requirement than the default Truffle installation, hence you must provide a specific compiler version.

2. Run:

```bash Shell
truffle migrate --network mumbai
```

### Verify your child Polygon ERC-20 contract on the Polygon explorer

Once your contract is deployed, you can view it online at the [Polygon Mumbai explorer](https://mumbai.polygonscan.com/).

Go to the [Polygon Mumbai explorer](https://mumbai.polygonscan.com/). Find your deployed contract. The address of your contract should have been printed by Truffle at the end of the deployment in the `contract address` field.

On the contract page on the explorer, click **Contract** > **Verify and Publish**.
In **Compiler Type**, select **Solidity (Single file)**.

In **Compiler Version**, select **v0.6.6**. This is the compiler version the default child contract uses as provided by Polygon.

In **Open Source License Type**, select **MIT License (MIT)**.

Click **Continue**.

Keep the **Optimization** option set to **No** as Truffle does not use optimization by default.

Paste the entirety of your `ChildERC20.sol` contract in the **Enter the Solidity Contract Code below** field.

Click **Verify and Publish**.

### ABI data

If on the verification attempt you get a message that the explorer cannot get the ABI data for the contract verification, do the following:

1. Go to the [online ABI encoding service](https://abi.hashex.org/).
2. In the service, provide the [ChildERC20.json](https://github.com/maticnetwork/pos-portal/blob/master/artifacts/ChildERC20.json) ABI data. Remove the first and last curly bracket `{}` and `"abi":` or the code will not parse.
3. Click **Parse**.
4. Put your constructor data by adding arguments with the data type:
   * name\_: `myL2tokenChild`
   * symbol\_: `ML2T`
   * decimals\_: `18`
   * childChainManager: `0x2e5e27d50EFa501D90Ad3638ff8441a0C0C0d75e`
5. Copy the encoded data.
6. Paste the encoded data in the ABI constructor arguments field on the explorer.

The explorer will take a few seconds to compile your contract, verify, and publish it.

### Map your Ethereum ERC-20 contract to the Polygon contract

Go to the [token mapper](https://mapper.polygon.technology/map).

Select **ERC20** and **Goerli Testnet - Mumbai Testnet**.

Provide the address of your contract on the Ethereum Goerli testnet and on the Polygon Mumbai testnet.

Provide an email address to be notified when the mapping is done.

Click **Submit**.
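The constructor-argument encoding the ABI service produces in the step above follows standard contract ABI rules: each static value becomes a 32-byte word, left-padded with zeros. A minimal sketch for the two static arguments (`decimals_` and the `childChainManager` address used in this tutorial); the two dynamic `string` arguments additionally need offset-based encoding, which the service handles for you:

```javascript
// One ABI "word" is 32 bytes = 64 hex characters;
// static values are left-padded with zeros to fill the word.
const toWord = (hex) => hex.replace(/^0x/, '').toLowerCase().padStart(64, '0');

// decimals_ = 18 (uint8) -> 0x12, padded to a full word
console.log(toWord('0x' + (18).toString(16)));
// 0000000000000000000000000000000000000000000000000000000000000012

// childChainManager (address, 20 bytes -> 12 zero bytes of left padding)
console.log(toWord('0x2e5e27d50EFa501D90Ad3638ff8441a0C0C0d75e'));
// 0000000000000000000000002e5e27d50efa501d90ad3638ff8441a0c0c0d75e
```

Comparing these words against the service's output is a quick sanity check that your arguments were parsed in the right order.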
### Bridge the mapped tokens

When your token is mapped, bridge your token from Ethereum to Polygon and back.

## Conclusion

This tutorial guided you through the basics of bridging an ERC-20 contract from the Ethereum Goerli testnet to the Polygon Mumbai testnet. The same instructions will work for the Ethereum mainnet and the Polygon mainnet.

Polygon has public L2 contract templates and a network of deployed contracts monitored by Heimdall nodes, all of which make it easy to bridge assets from the Ethereum mainnet to the Polygon commit chain.

### About the author

Director of Developer Experience @ Chainstack
Talk to me all things Web3
20 years in technology | 8+ years in Web3 full time
Trusted advisor helping developers navigate the complexities of blockchain infrastructure
[](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake)

# Polygon zkEVM methods

Source: https://docs.chainstack.com/docs/polygon-zkevm-methods

See also [interactive Polygon zkEVM API call examples](/reference/zkevm-getting-started).
| Method | Availability | Comment |
| ---------------------------------------- | --------------------------------------------- | ------- |
| eth\_accounts | | |
| eth\_blockNumber | | |
| eth\_call | | |
| eth\_chainId | | |
| eth\_estimateGas | | |
| eth\_feeHistory | | |
| eth\_gasPrice | | |
| eth\_getAccount | | |
| eth\_getBalance | | |
| eth\_getBlockByHash | | |
| eth\_getBlockByNumber | | |
| eth\_getBlockReceipts | | |
| eth\_getBlockTransactionCountByHash | | |
| eth\_getBlockTransactionCountByNumber | | |
| eth\_getCode | | |
| eth\_getFilterChanges | | |
| eth\_getFilterLogs | | |
| eth\_getLogs | | |
| eth\_getProof | | |
| eth\_getStorageAt | | |
| eth\_getTransactionByBlockHashAndIndex | | |
| eth\_getTransactionByBlockNumberAndIndex | | |
| eth\_getTransactionByHash | | |
| eth\_getTransactionCount | | |
| eth\_getTransactionReceipt | | |
| eth\_getUncleCountByBlockHash | | |
| eth\_getUncleCountByBlockNumber | | |
| eth\_maxPriorityFeePerGas | | |
| eth\_newBlockFilter | | |
| eth\_newFilter | | |
| eth\_newPendingTransactionFilter | | |
| eth\_signTransaction | | |
| eth\_subscribe | | |
| eth\_syncing | | |
| eth\_uninstallFilter | | |
| eth\_unsubscribe | | |
| eth\_sendRawTransaction | | |
| net\_listening | | |
| net\_peerCount | | |
| net\_version | | |
| web3\_clientVersion | | |
| web3\_sha3 | | |
| zkevm\_batchNumber | | |
| zkevm\_batchNumberByBlockNumber | | |
| zkevm\_consolidatedBlockNumber | | |
| zkevm\_estimateCounters | | |
| zkevm\_estimateGasPrice | | |
| zkevm\_getBatchByNumber | | |
| zkevm\_getBroadcastURI | | |
| zkevm\_isBlockConsolidated | | |
| zkevm\_isBlockVirtualized | | |
| zkevm\_verifiedBatchNumber | | |
| zkevm\_virtualBatchNumber | | |
| debug\_getBadBlocks | | |
| debug\_storageRangeAt | | |
| debug\_getTrieFlushInterval | | |
| debug\_traceBlock | | |
| debug\_traceBlockByHash | | |
| debug\_traceBlockByNumber | | |
| debug\_traceCall | | |
| debug\_traceTransaction | | |
| trace\_block | | |
| trace\_call | | |
| trace\_callMany | | |
| trace\_filter | | |
| trace\_rawTransaction | | |
| trace\_replayBlockTransactions | | |
| trace\_replayTransaction | | |
| trace\_transaction | | |
| admin\_addPeer | | |
| admin\_addTrustedPeer | | |
| admin\_datadir | | |
| admin\_exportChain | | |
| admin\_importChain | | |
| admin\_nodeInfo | | |
| admin\_peerEvents | | |
| admin\_peers | | |
| admin\_removePeer | | |
| admin\_removeTrustedPeer | | |
| admin\_startHTTP | | |
| admin\_startWS | | |
| admin\_stopHTTP | | |
| admin\_stopWS | | |

# Polygon zkEVM tooling

Source: https://docs.chainstack.com/docs/polygon-zkevm-tooling

### Current API limitations

Polygon zkEVM currently supports most of the widely known JSON-RPC methods.

* See [zkEVM methods](/reference/zkevm-rpc-methods) to try out the exclusive `zkevm_*` namespace methods.
* See [Polygon zkEVM documentation](https://github.com/0xPolygonHermez/zkevm-node/blob/develop/docs/json-rpc-endpoints.md) for the full list of supported methods.

## Interaction tools

### MetaMask

On [node access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**.

## Development tools

### Truffle

Configure [Truffle Suite](https://truffleframework.com/) to deploy contracts to your Polygon zkEVM nodes.

Install [Truffle Suite](https://truffleframework.com/), [HD Wallet-enabled Web3 provider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider), and create a project.
Install the [dotenv](https://www.npmjs.com/package/dotenv) package to securely load your sensitive variables from a `.env` file.

Create a new environment and define a new network in `truffle-config.js`:

```js JavaScript
require('dotenv').config();
const HDWalletProvider = require("@truffle/hdwallet-provider");

module.exports = {
  networks: {
    zkEVMTestnet: {
      provider: () => {
        return new HDWalletProvider([process.env.YOUR_PRIVATE_KEY], `${process.env.YOUR_CHAINSTACK_ENDPOINT}`);
      },
      network_id: 1442,
    },
  },

  compilers: {
    solc: {
      version: "0.8.17",
      settings: {
        optimizer: {
          enabled: true,
          runs: 50,
        },
      },
    },
  },
};
```

### Hardhat

Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your Polygon zkEVM nodes.

Install [Hardhat](https://hardhat.org/) and create a project. Install the [dotenv](https://www.npmjs.com/package/dotenv) package to securely load your sensitive variables from a `.env` file.

Create a new environment in `hardhat.config.js`:

```js JavaScript
require("@nomicfoundation/hardhat-toolbox");
require('dotenv').config();

module.exports = {
  solidity: "0.8.18",
  defaultNetwork: "zkEVM_testnet",
  networks: {
    zkEVM_testnet: {
      url: `${process.env.YOUR_CHAINSTACK_ENDPOINT}`,
      accounts: [process.env.YOUR_PRIVATE_KEY]
    },
  },
};
```

### web3.js

Build DApps using [web3.js](https://github.com/ethereum/web3.js/) and Polygon zkEVM nodes deployed with Chainstack.

Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP or WebSocket.

#### HTTPS

Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number:

```js JavaScript
const Web3 = require('web3');

const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT'));
web3.eth.getBlockNumber().then(console.log);
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password.
### web3.py

Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Polygon zkEVM nodes deployed with Chainstack.

Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket).

#### HTTPS

Use `HTTPProvider` to connect to your node endpoint and get the latest block number.

```py Key Protected
from web3 import Web3

web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT'))
print(web3.eth.block_number)
```

```py Password Protected
from web3 import Web3

web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s' % ("USERNAME", "PASSWORD", "HOSTNAME")))
print(web3.eth.block_number)
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password
* HOSTNAME — your node HTTPS endpoint hostname
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)

### node.js

You can build a web app to query data using node.js and [axios](https://www.npmjs.com/package/axios):

```js JavaScript
const axios = require("axios");

const payload = {
  jsonrpc: "2.0",
  id: 0,
  method: "eth_blockNumber",
  params: []
};

(async () => {
  const response = await axios.post(`YOUR_CHAINSTACK_ENDPOINT`, payload)
  console.log(response.data)
})();
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your Chainstack node endpoint protected either with the key or password. See node access details.
* `payload` — your JSON-RPC request. In this case, to get the latest block number.

### ethers.js

Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Polygon zkEVM nodes deployed with Chainstack.

Install [ethers.js](https://www.npmjs.com/package/ethers). Connect over HTTPS or WebSocket.
See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket).

#### HTTPS

Use the `JsonRpcProvider` object to connect to your node endpoint and get the balance of any address:

```js JavaScript
const ethers = require('ethers');

const NODE_URL = "YOUR_CHAINSTACK_ENDPOINT";
const provider = new ethers.JsonRpcProvider(NODE_URL, NETWORK_ID);

const eth_getBalance = async () => {
  const balance = await provider.getBalance("0x439356Ad40D2f2961c99FFED4453f482AEC453Af");
  console.log(balance);
};

eth_getBalance()
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password
* NETWORK\_ID — Polygon zkEVM network ID:
  * Mainnet: `1101`
  * Testnet: `1442`

### Brownie

Install [Brownie](https://eth-brownie.readthedocs.io/en/stable/install.html).

Use the `brownie networks add` command with the node endpoint:

```bash Shell
brownie networks add zkEVM ID name="NETWORK_NAME" host=YOUR_CHAINSTACK_ENDPOINT chainid=NETWORK_ID
```

where

* ID — any name that you will use as the network tag to run a deployment. For example, `chainstack-testnet`.
* NETWORK\_NAME — any name that you want to identify the network by in the list of networks. For example, `zkEVM`.
* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password
* NETWORK\_ID — Polygon zkEVM network ID:
  * Mainnet: `1101`
  * Testnet: `1442`

Example to run the deployment script:

```bash Shell
brownie run deploy.py --network chainstack-testnet
```

### Remix IDE

To make Remix IDE interact with the network through a zkEVM node deployed with Chainstack:

Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Interacting through MetaMask](#metamask).

In Remix IDE, navigate to the **Deploy** tab. Select **Injected Provider - MetaMask** in **Environment**.
This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node. # Polygon zkEVM: Deploy a smart contract using Hardhat Source: https://docs.chainstack.com/docs/polygon-zkevm-tutorial-deploy-a-smart-contract-using-hardhat **TLDR** * Demonstrates bridging Sepolia ETH to the Polygon zkEVM testnet using a zkEVM bridge UI. * Walks through setting up a Hardhat project to deploy a simple vault contract to zkEVM. * Explains verifying the contract on the zkEVM explorer and sending contract interactions via script and console. * Provides a straightforward developer flow for bridging, deploying, and interacting with zkEVM. ## Main article Polygon zkEVM is the first ever EVM-compatible zero-knowledge rollup to hit the market. This means that developers can leverage existing Web3 tooling and zero-knowledge proofs to deploy smart contracts and execute transactions on the Ethereum network cheaper than ever before. Polygon zkEVM is a layer 2 scaling solution that aims to make Ethereum transactions faster and more efficient using special math called zero-knowledge proofs to ensure transactions are valid and quickly finalized. ZK-rollups execute smart contracts transparently by sharing zero-knowledge proofs of their validity, which allows Polygon zkEVM to work seamlessly with the Ethereum Virtual Machine. If you want to read more about ZK-EVMs and ZK-rollups, you can check out our blog article [zkEVM and zk-rollups explained](https://chainstack.com/zkevm-and-zkrollups-explained/). ## Prerequisites * [Chainstack account](https://console.chainstack.com/) to deploy a Polygon zkEVM node * [Node.js](https://nodejs.org/en) as the JavaScript framework ## Dependencies * Hardhat: ^2.13.0 * @nomicfoundation/hardhat-toolbox: ^2.0.2 * dotenv: ^16.0.3 ## Overview In this tutorial, we will go over how to bridge funds between Sepolia and zkEVM testnets, as well as how to deploy a smart contract to the testnet using Hardhat. 
Here is a brief overview of the tutorial: With Chainstack, create a public chain project. With Chainstack, join the Polygon zkEVM testnet. With Chainstack, access your nodes' credentials. Bridge funds between the Sepolia testnet and the zkEVM testnet. Create a Hardhat project using node.js. Install the required dependencies. Create a `.env` file to store the secrets. Edit the Hardhat config file. Write the smart contract. Write and run the deployment script. Deploy smart contracts to the zkEVM testnet. ## Step-by-step ### Create a public chain project See [Create a project](/docs/manage-your-project#create-a-project). ### Join the Ethereum Sepolia testnet and the Polygon zkEVM testnet See [Join a public network](/docs/manage-your-networks#join-a-public-network). ### Get your Polygon zkEVM node endpoint See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials). ### Fund your wallet Before diving into the project, make sure to top up your wallet with Sepolia ether. You can use our [faucet](https://faucet.chainstack.com) for this. ### Bridging Sepolia ETH to zkEVM testnet We can easily move assets between Ethereum (L1) and Polygon zkEVM (L2) using the zkEVM bridge. The UI interface for the bridge is available at [public.zkevm-test.net](https://public.zkevm-test.net/login). To bridge assets between L1 and L2, the user has to lock up any amount of those assets in the original network using the zkEVM bridge. An equivalent amount of wrapped tokens are then minted in the other chain. Let us go through the process of obtaining zkEVM ETH: 1. Add the Polygon zkEVM to your MetaMask wallet. You can do that by simply creating a Polygon zkEVM node with Chainstack, and adding the network to your MetaMask by clicking on the **Add to MetaMask**. See [Chainstack console](https://console.chainstack.com/). 2. Make sure that your wallet is connected to the Sepolia Testnet and has sufficient Sepolia ETH. 3. 
Navigate to the [zkEVM bridge interface](https://public.zkevm-test.net/login).
4. Follow the instructions on the bridge to mint some ETH to the zkEVM testnet. After submitting the transaction, it is necessary to wait until the **Finalize** button is activated. Without finalizing, the bridge operation won't be complete.

### Create a Hardhat project

Create a new directory for your project, then run the following from a terminal:

```bash Shell
npm init -y && npm install --save-dev hardhat
```

After Hardhat is installed, run:

```bash Shell
npx hardhat
```

This will launch the Hardhat CLI, which will prompt you to configure a starter project. For this tutorial, click **Yes** on all the prompts Hardhat offers you.

### Set up environment variables

This project uses the [dotenv](https://github.com/motdotla/dotenv) package to safely use environment variables. Run the following command in your root directory to install the dotenv package:

```bash Shell
npm install dotenv
```

In your project's root directory, create a new file and name it `.env`. Here is where you will set up the environment variables for your Chainstack Polygon zkEVM endpoint and your wallet's private key.

```sh .env
YOUR_CHAINSTACK_ENDPOINT="YOUR_CHAINSTACK_POLYGON_ZKEVM_ENDPOINT"
YOUR_PRIVATE_KEY="YOUR_WALLET_PRIVATE_KEY"
```

Save the file after you have added your information. Now run the following command to load all the environment variables:

```bash Shell
source .env
```

### Edit the Hardhat configuration file

You will find a file named `hardhat.config.js` in the root directory. This file is used to configure various settings for your Hardhat projects, such as the network you want to deploy your contracts on, the compilers you want to use, and the plugins you want to enable.
Delete the default code in the file and replace it with the following:

```js hardhat.config.js
require("@nomicfoundation/hardhat-toolbox");
require('dotenv').config();

module.exports = {
  solidity: "0.8.18",
  defaultNetwork: "zkEVM_testnet",
  networks: {
    zkEVM_testnet: {
      url: `${process.env.YOUR_CHAINSTACK_ENDPOINT}`,
      accounts: [process.env.YOUR_PRIVATE_KEY]
    },
  },
};
```

Let's break down what each part of the file does:

* `require("@nomicfoundation/hardhat-toolbox");` imports the Hardhat Toolbox plugin, which provides several useful tools and utilities for Hardhat projects.
* `require("dotenv").config();` loads environment variables from a `.env` file using the `dotenv` package.
* `module.exports = { ... }` exports a JavaScript object containing the configuration for the Hardhat project.
* `solidity: "0.8.18",` sets the Solidity compiler version to 0.8.18.
* `defaultNetwork: "zkEVM_testnet",` sets the default network that Hardhat will use.
* `networks: { ... }` defines the network configurations for the Hardhat project.
* `zkEVM_testnet: { ... }` defines the configuration for the `zkEVM` network.
* `url: ${process.env.YOUR_CHAINSTACK_ENDPOINT},` sets the RPC URL for the zkEVM network.
* `accounts: [process.env.YOUR_PRIVATE_KEY],` sets the accounts for the `zkEVM` network using the `YOUR_PRIVATE_KEY` environment variable. This will allow the Hardhat project to deploy contracts and interact with the zkEVM testnet using the specified private key.

### Create the simple vault smart contract

In the root directory, you will find a directory named `contracts`. Create a new file named `SimpleVault.sol`, and paste the following code inside it:

```js SimpleVault.sol
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.17;

/**
 * @title SimpleVault
 * @dev A simple vault contract for the exclusive use of its owner.
 * The owner can deposit and withdraw ether.
 */
contract SimpleVault {
    address payable public owner;

    // Event emitted when ether is deposited
    event Deposit(address indexed depositor, uint256 amount);

    // Event emitted when ether is withdrawn
    event Withdrawal(address indexed owner, uint256 amount);

    /**
     * @dev Sets the contract deployer as the owner.
     */
    constructor() {
        owner = payable(msg.sender);
    }

    /**
     * @dev Modifier to check if the caller is the owner of the contract.
     */
    modifier onlyOwner() {
        require(msg.sender == owner, "Only the owner can perform this action");
        _;
    }

    /**
     * @notice Allows the owner to deposit ether into the contract.
     * @dev The deposit function must be marked as payable to receive ether.
     */
    function deposit() external payable onlyOwner {
        require(msg.value > 0, "You must send some ether to deposit");
        emit Deposit(msg.sender, msg.value);
    }

    /**
     * @notice Allows the owner to withdraw a specified amount of ether.
     * @param amount The amount of ether to be withdrawn in wei.
     */
    function withdraw(uint256 amount) external onlyOwner {
        require(address(this).balance >= amount, "Not enough Ether in the vault");
        owner.transfer(amount);
        emit Withdrawal(owner, amount);
    }

    /**
     * @notice Returns the balance of the contract in ether.
     * @return The balance of the contract in wei.
     */
    function getBalance() external view returns (uint256) {
        return address(this).balance;
    }
}
```

This is a simple smart contract that allows us to deposit and withdraw ETH. All the functions are commented for better understanding.

### Create and run the deployment script

In the `scripts` directory inside the root of your project, you will find a file named `deploy.js`.
Replace its content with the following:

```js deploy.js
const hre = require("hardhat");

async function main() {
  const SimpleVault = await hre.ethers.getContractFactory("SimpleVault");
  console.log("Deploying your contract, please wait...");
  const simplevault = await SimpleVault.deploy();
  await simplevault.deployed();
  console.log("Vault Contract deployed to:", simplevault.address);
}

main()
  .then(() => process.exit(0))
  .catch((error) => {
    console.error(error);
    process.exit(1);
  });
```

This is a simple deploy script that deploys the `SimpleVault` smart contract to the zkEVM testnet, and returns the address of the newly deployed contract in the terminal. You can search for your contract on the [Polygon zkEVM testnet explorer](https://explorer.public.zkevm-test.net/).

To run this script, execute the following command in the terminal:

```bash Shell
npx hardhat run --network zkEVM_testnet scripts/deploy.js
```

### Interact with the smart contract

We can interact with a deployed smart contract through Hardhat in two ways:

* Write a script to programmatically send a series of transactions to the smart contract.
* Interact with the smart contract right from the terminal.

Let us do a bit of both. Create a new file named `interact.js` inside the `scripts` directory.
Paste the following code inside it:

```js interact.js
require('dotenv').config();
const { ethers } = require('hardhat');
const hre = require("hardhat");

async function main() {
  const address = 'CONTRACT_ADDRESS';
  const Vault = await ethers.getContractFactory('SimpleVault');
  const vault = await Vault.attach(address);

  const currentValue = await vault.getBalance();
  console.log('The current value stored in the vault is', currentValue.toString());

  console.log('Sending 1 ETH to the vault...');
  const newValue = await vault.deposit({value: ethers.utils.parseEther('1')});
  console.log(`View transaction at: https://explorer.public.zkevm-test.net/tx/${newValue.hash}`);
  await newValue.wait(1); // Wait 1 block for the transaction to be verified

  const updatedValue = await vault.getBalance();
  console.log('The new value stored in the vault is', updatedValue.toString());

  // const withdrawValue = await vault.withdraw(ethers.utils.parseEther('0.5'));
}

main()
  .then(() => process.exit(0))
  .catch((error) => {
    console.error(error);
    process.exit(1);
  });
```

where CONTRACT\_ADDRESS is the address of your deployed `SimpleVault` contract.

* In this script, we attach the address of the deployed smart contract to a local instance we initialize via Hardhat. We can then use the RPC URL and the private key we configured in the config file to send transactions and call functions to and from the contract.
* You can see that we send 1 ETH to the contract.

To execute the script, run the following command in the terminal:

```bash Shell
npx hardhat run --network zkEVM_testnet scripts/interact.js
```

Open another terminal inside the same directory and run the following command:

```bash Shell
npx hardhat console --network zkEVM_testnet
```

This will open up a Hardhat console that will allow us to interact with our smart contract via the command line.
To connect the console to the deployed smart contract, run:

```js JavaScript
const address = 'CONTRACT_ADDRESS';
const Vault = await ethers.getContractFactory('SimpleVault');
const vault = await Vault.attach(address);
```

You can read the locked vault value from the smart contract by simply running this command in the console:

```js JavaScript
await vault.getBalance();
```

You can withdraw locked ETH from the console by running this command in the terminal:

```js JavaScript
await vault.withdraw(ethers.utils.parseEther('0.5'))
```

And just like that, we used a Hardhat script to deposit ETH into a deployed smart contract, and used the Hardhat console to interact with the smart contract.

## Conclusion

This tutorial guided you through bridging funds between the Sepolia testnet and the Polygon zkEVM testnet. We also deployed a smart contract to the zkEVM testnet using Hardhat.

### About the author

Developer Advocate @ Chainstack
BUIDLs on Ethereum, zkEVMs, The Graph protocol, and IPFS
Part-time Rust aficionado
[](https://github.com/Genesis3800) [](https://twitter.com/PriyankGupta03) [](https://www.linkedin.com/in/priyank-gupta-0308/)

# Introduction

Source: https://docs.chainstack.com/docs/pricing-introduction

## Pricing principles

Each Chainstack subscription plan has its own quota that represents the allocated resources and usage limits that users can spend within their chosen plan. In addition to the allocated quota, Chainstack also offers the convenience of "pay-as-you-go" billing for any extra usage beyond the specified limits. This means that if users exceed their allocated quota, they can continue using Chainstack's services seamlessly, with any additional usage being automatically charged at the applicable rates.

Visit Chainstack's [pricing page](https://chainstack.com/pricing/) to get familiar with the quota for each plan and extra usage costs. Quota spending can vary depending on the combination of services that you use.
Service cost is reflected in [request units](/docs/pricing-introduction#what-are-request-units). In addition to basic services, you can purchase Chainstack add-ons that help you to enrich your experience with the platform and improve your DApp performance. Available add-ons and their cost can be found on the [pricing page](https://chainstack.com/pricing/).

## Coupons & promos

Follow the Chainstack [Twitter account](https://twitter.com/chainstackhq) and sign up for the [newsletter](https://chainstack.com/newsletter/) to never miss a promo or get a coupon code. We run promos very often.

## What you can be billed for

Chainstack has the following chargeable items:

* Request units — for request-based services, such as node endpoints, Subgraphs, etc. The cost of a request can vary depending on the service.
* Warp transactions — due to the partnership with bloXroute, Warp transactions are charged separately.
* Compute hours and hourly rated storage — for dedicated nodes and indexers.
* Management fee — for hybrid hosting. [Contact us](https://chainstack.com/contact/) to learn more.

## What are request units?

A request unit (RU) is a metric that measures the size and complexity of request processing for Chainstack request-based services.

## Why are request units used for pricing?

In the Web3 ecosystem, certain requests can require more extensive data processing or involve complex computations and higher spending of resources. Such requests will cost more RU. This ensures that the pricing is more closely aligned with the actual resource consumption and effort required for each request. The introduction of request units allows us to offer a fairer and more flexible pricing structure, taking into account the diverse nature of blockchain infrastructure operations. See the current cost for Chainstack services on the [pricing page](https://chainstack.com/pricing/).

### TON requests

For TON, there's no difference between a full node and an archive node in data availability or pricing.
All data is always available and all node requests are consumed as 1 request unit. ## Payment mechanics Your internal balance serves as a primary payment source to cover any charges incurred from using the Chainstack services. If your balance has sufficient funds, the charges are automatically deducted from it. If the balance is insufficient, Chainstack then proceeds to charge the attached credit card. Currently, the following payment methods can be used to top up your balance: * Credit card * Crypto payment from Chainstack console (powered by Coinbase Commerce) * Wire transfer * Direct crypto payment If your request volume is extra high or you prefer specific payment methods, [contact us](https://chainstack.com/contact/) for a special offer. ## Rate limits See [Limits](/docs/limits). ### Dedicated nodes You can also order [dedicated nodes](/docs/dedicated-node) and pay for the node resource consumption only instead of per-request billing. ## Range limits See [Limits](/docs/limits). # Clients Source: https://docs.chainstack.com/docs/protocols-clients Blockchains can support multiple software implementations, or clients, that connect users to a blockchain network. Clients are developed to be compatible with specific protocols and serve different purposes, for example, allowing for faster transactions or managing smart contracts in an advanced manner. Some clients can also be optimized for different consensus algorithms. You can learn which clients are available for the most popular protocols on Chainstack below. ## Ethereum clients ### Consensus layer * Nimbus — the [Nim](https://github.com/status-im/nimbus-eth2) implementation of the Ethereum Beacon Chain client. ### Execution layer Ethereum node can have one of the following implementations of the execution layer client: * Geth — the [Go Ethereum](https://github.com/ethereum/go-ethereum) implementation. It's bigger in size and can be interacted with by using [Geth JSON-RPC methods](https://eth.wiki/json-rpc/API). 
To get the Geth client on your node, you must deploy a dedicated node in the full or archive mode.
* Erigon — the [Erigon](https://github.com/ledgerwatch/erigon) implementation. It's smaller in size and can be interacted with by using [Erigon RPC methods](https://github.com/ledgerwatch/erigon/blob/devel/cmd/rpcdaemon/README.md#rpc-implementation-status). To get the Erigon client on your node, you must deploy a trader node in the archive mode or a dedicated node in the archive mode.
* Reth — available as a [dedicated node](/docs/dedicated-node).

## Polygon clients

A Polygon node can have one of the following client implementations:

* Bor — the native Polygon client. It can be interacted with by using [JSON-RPC methods](https://eth.wiki/json-rpc/API). To get the Bor client on your trader or dedicated node, you must deploy it in the full mode.
* Erigon — the [Erigon](https://github.com/ledgerwatch/erigon) implementation. It can be interacted with by using [Erigon RPC methods](https://github.com/ledgerwatch/erigon/blob/devel/cmd/rpcdaemon/README.md#rpc-implementation-status). To get the Erigon client on your trader or dedicated node, you must deploy it in the archive mode.

## BNB Smart Chain clients

A BNB Smart Chain node can have one of the following client implementations:

* Geth — the [Go Ethereum](https://github.com/bnb-chain/bsc) implementation. It's bigger in size and can be interacted with by using [Geth JSON-RPC methods](https://eth.wiki/json-rpc/API). To get the Geth client on your trader or dedicated node, you must deploy it in the full mode.
* Erigon — the [Erigon](https://github.com/ledgerwatch/erigon) implementation. It's smaller in size and can be interacted with by using [Erigon RPC methods](https://github.com/ledgerwatch/erigon/blob/devel/cmd/rpcdaemon/README.md#rpc-implementation-status). To get the Erigon client on your trader or dedicated node, you must deploy it in the archive mode.

## Base client

The client is [op-geth](https://github.com/base-org/node).
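A quick way to check which client implementation a node actually runs — Geth, Erigon, or another — is the standard `web3_clientVersion` call. A sketch assuming Node.js 18+ (built-in `fetch`); the `getClientVersion` helper name and the endpoint placeholder are illustrative:

```js JavaScript
// Identify the client behind a node endpoint via web3_clientVersion.
const payload = {
  jsonrpc: '2.0',
  id: 1,
  method: 'web3_clientVersion',
  params: [],
};

async function getClientVersion(endpoint) {
  const response = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  });
  const { result } = await response.json();
  return result; // a Geth-style or Erigon-style version string
}

// getClientVersion('YOUR_CHAINSTACK_ENDPOINT').then(console.log);
```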
## Avalanche client

[AvalancheGo](https://github.com/ava-labs/avalanchego) is the node implementation for the Avalanche network in Golang. You can find API documentation in the [Chainstack Avalanche API reference](/reference/avalanche-getting-started).

## Arbitrum client

### Stylus is supported

Arbitrum nodes on Chainstack support Stylus.

[Nitro](https://github.com/OffchainLabs/nitro) is the official node for the Arbitrum protocol. It is a fully integrated, complete layer 2 optimistic rollup system that includes fraud proofs, the sequencer, token bridges, advanced calldata compression, and more. For available methods, see the [Chainstack Arbitrum API reference](/reference/arbitrum-getting-started) and the [official Arbitrum documentation](https://developer.arbitrum.io/).

## zkSync Era

The client is [zkSync Era](https://github.com/matter-labs/zksync-era).

## Polygon zkEVM client

[zkEVM Node](https://github.com/0xPolygonHermez/zkevm-node) is a Go implementation of a node that operates the Polygon zkEVM network. For available methods, see the [Chainstack zkEVM API reference](/reference/zkevm-getting-started) and the [official zkEVM documentation](https://github.com/0xPolygonHermez/zkevm-node/blob/develop/docs/json-rpc-endpoints.md).

## Optimism client

[Global Nodes](/docs/global-elastic-node) run [op-erigon](https://github.com/testinprod-io/op-erigon). [Trader Nodes](/docs/trader-node) run [op-geth](https://github.com/ethereum-optimism/op-geth).

## Aurora client

[Aurora](https://github.com/aurora-is-near/aurora-engine) operates on top of NEAR. It is EVM-compatible, and all its available JSON-RPC methods can be found in the official [Aurora documentation](https://doc.aurora.dev/evm/rpc/) and [GitHub repository](https://github.com/aurora-is-near/relayer2-public).

## Solana client

Solana's [official client](https://github.com/solana-labs/solana) is developed in Rust. JSON-RPC methods are available and can be found in the official [Solana documentation](https://docs.solana.com/api/http).
For JavaScript applications, use the [@solana/web3.js](https://github.com/solana-labs/solana-web3.js) library as a convenient interface for interacting with a Solana node using RPC methods.

## Scroll

The client is [Geth-based](https://github.com/scroll-tech/scroll).

## Ronin client

Ronin runs a [Go Ethereum-based client](https://github.com/axieinfinity/ronin) that can be interacted with by using [Geth JSON-RPC methods](https://eth.wiki/json-rpc/API).

## Aptos client

[Aptos-core](https://github.com/aptos-labs/aptos-core) is the official client for the Aptos protocol. Developers can interact with Aptos via its various SDKs, including the [TypeScript SDK](https://aptos.dev/sdks/ts-sdk/index), [Python SDK](https://aptos.dev/sdks/python-sdk), [Rust SDK](https://aptos.dev/sdks/rust-sdk), and [Unity SDK](https://aptos.dev/sdks/unity-sdk), or through its [REST API](https://aptos.dev/nodes/aptos-api-spec/#/).

## Oasis Sapphire client

[Oasis-node](https://github.com/oasisprotocol/oasis-core) is written in Go and serves as both a consensus and execution layer client. As it is EVM-compatible, it can be interacted with using the [standard JSON-RPC API](https://ethereum.org/en/developers/docs/apis/json-rpc/).

## Gnosis clients

### Consensus layer

Lighthouse — the [Rust](https://github.com/sigp/lighthouse) implementation of the Beacon Chain client.

### Execution layer

Nethermind — the [Nethermind](https://github.com/NethermindEth/nethermind) implementation. See the JSON-RPC methods reference in the official [Nethermind documentation](https://docs.nethermind.io/nethermind/ethereum-client/json-rpc).

## Cronos client

[Cronos](https://github.com/crypto-org-chain/cronos) is a fully EVM-compatible chain. Developers can connect to it through the [standard JSON-RPC API](https://ethereum.org/en/developers/docs/apis/json-rpc/) or via libraries like [ethers.js](https://docs.ethers.io/v5/), [web3.js](https://web3js.readthedocs.io/), and [web3.py](https://github.com/ethereum/web3.py).
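Since many of the clients above expose the standard JSON-RPC API, a quick sanity check that an endpoint serves the network you expect is `eth_chainId` — Cronos mainnet, for example, reports chain ID `25` (`0x19`). A sketch assuming Node.js 18+ (built-in `fetch`); the `getChainId` helper name and the endpoint placeholder are illustrative:

```js JavaScript
// Confirm which EVM network an endpoint serves by requesting eth_chainId.
async function getChainId(endpoint) {
  const response = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ jsonrpc: '2.0', id: 1, method: 'eth_chainId', params: [] }),
  });
  const { result } = await response.json();
  return parseInt(result, 16); // hex string -> decimal chain ID
}

// getChainId('YOUR_CHAINSTACK_ENDPOINT').then(console.log);
```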
## Sonic client [Sonic](https://www.soniclabs.com/) is the mainnet for Fantom and an EVM-compatible node client for Fantom's network. It can be accessed via [standard JSON-RPC methods](https://ethereum.org/en/developers/docs/apis/json-rpc/). ## Fantom client [Sonic](https://www.soniclabs.com/) is the mainnet for Fantom and an EVM-compatible node client for Fantom's network. It can be accessed via [standard JSON-RPC methods](https://ethereum.org/en/developers/docs/apis/json-rpc/). ## TRON client [java-tron](https://github.com/tronprotocol/java-tron) is the client. See also the [TRON API reference](https://developers.tron.network/reference/background). ## Starknet client [Pathfinder](https://docs.starknet.io/documentation/starknet_versions/pathfinder_versions/) is the node implementation for the Starknet protocol. It supports version `v0.2.1` of the Starknet JSON-RPC [specification](https://github.com/starkware-libs/starknet-specs/blob/v0.2.1/api/starknet_api_openrpc.json) with some minor [differences](https://github.com/eqlabs/pathfinder#api-v021). ## Harmony The client is [Geth-based](https://github.com/harmony-one/harmony). ## Bitcoin client [Bitcoin Core](https://bitcoin.org/en/download) connects to the Bitcoin peer-to-peer network to download and validate blocks and transactions. See available JSON-RPC API methods in the [Bitcoin Postman collection](/reference/bitcoin-rpc-methods-postman-collection). # Configurations Source: https://docs.chainstack.com/docs/protocols-configurations For the node types, see: ### Advanced options on paid plans Dedicated nodes, archive nodes, debug & trace APIs are available on the [paid plans](https://chainstack.com/pricing/). # Modes and types Source: https://docs.chainstack.com/docs/protocols-modes-and-types ## Types Chainstack supports 2 types of nodes for public chains: elastic node and dedicated node. Elastic nodes are available for all networks by default. To get a dedicated node, you'll have to contact us directly. 
## Modes

For most of the available public chains, Chainstack supports deploying nodes in the following modes:

* Full — a node that stores full blockchain data. However, it has limitations on how many blocks are available for querying.
* Archive — a node that stores full blockchain data and an archive of historical states, which makes it possible to query any block since the chain genesis.
* Trader — a node that propagates transactions to the global mempool at high speed. See [Trader nodes](/docs/warp-transactions).

The following modes are available for on-platform public chains:

| Protocol | Full | Archive | Full mode querying limits |
| ----------------- | ---- | ------- | ------------------------- |
| Ethereum | | | Latest 128 blocks |
| Solana | | | Up to the first block available on the node. \~1.5 days' worth. |
| BNB Smart Chain | | | Latest 128 blocks |
| Polygon | | | Latest 128 blocks |
| Arbitrum | | | Latest 128 blocks |
| Base | | | Latest 128 blocks |
| Optimism | | | Latest 128 blocks |
| Avalanche | | | Latest 32 blocks of C-Chain |
| TON | | | Both v2 and v3 (indexer) are supported |
| Ronin | | | Latest 128 blocks |
| Blast | | | |
| zkSync Era | | | |
| Starknet | | | |
| Scroll | | | |
| Aptos | | | Latest 100,000,000 transactions |
| Sonic | | | |
| Fantom | | | |
| TRON | | | |
| Cronos | | | Latest 100 blocks |
| Gnosis Chain | | | 1,024 MB of historical states. \~10 hours' worth of immediately previous blocks. |
| Kaia (ex. Klaytn) | | | |
| Celo | | | |
| Moonbeam | | | |
| Aurora | | | Latest 5 epochs |
| Oasis Sapphire | | | |
| Polygon zkEVM | | | A full node currently stores the full blockchain data and an archive of historical states starting from the chain genesis. |
| Harmony | | | Latest 128 blocks |

### Advanced options on paid plans

Dedicated nodes, archive nodes, and debug & trace APIs are available on the [paid plans](https://chainstack.com/pricing/).

# Networks

Source: https://docs.chainstack.com/docs/protocols-networks

This page lists all the supported networks that you can use to deploy your Chainstack nodes.

| Protocol | Mainnet | Testnet | Testnet faucet |
| --------------- | -------------------- | ------- | -------------- |
| Ethereum | Mainnet | Holešky Testnet — a proof-of-stake public permissioned Ethereum testnet. Sepolia Testnet — a proof-of-stake public permissioned Ethereum testnet. | [Chainstack Faucet](https://faucet.chainstack.com) |
| Solana | Mainnet | Devnet — public Solana sandbox network | [Solana faucet](https://solfaucet.com/) |
| BNB Smart Chain | Mainnet | Testnet | [Chainstack Faucet](https://faucet.chainstack.com) |
| Polygon | Mainnet | Amoy Testnet | [Testnet Polygon faucet](https://faucet.polygon.technology) |
| Arbitrum | Arbitrum One Mainnet | Arbitrum Sepolia Testnet | [Chainstack Faucet](https://faucet.chainstack.com) |
| Base | Mainnet | Sepolia Testnet | [Chainstack Faucet](https://faucet.chainstack.com) |
| Optimism | Mainnet | Optimism Sepolia Testnet | [Chainstack Faucet](https://faucet.chainstack.com) |
| Avalanche | Mainnet | Fuji Testnet | [Testnet AVAX faucet](https://faucet.avax-test.network/) |
| TON | Mainnet | Testnet | |
| Ronin | Mainnet | Saigon Testnet | [Ronin Faucet](https://faucet.roninchain.com) |
| Blast | Mainnet | | |
| zkSync Era | Mainnet | zkSync Era Sepolia Testnet | [Chainstack Faucet](https://faucet.chainstack.com) |
| Starknet | Mainnet | Sepolia Testnet | [Starknet faucet](https://faucet.goerli.starknet.io) |
| Scroll | Mainnet | Sepolia Testnet | [Chainstack Faucet](https://faucet.chainstack.com) |
| Sonic | Mainnet | Blaze Testnet | [Blaze faucet](https://testnet.soniclabs.com/) |
| Fantom | Mainnet | Testnet | [Fantom faucet](https://faucet.fantom.network/) |
| opBNB | Mainnet | | |
| Aptos | Mainnet | Testnet | [Aptos faucet](https://aptoslabs.com/testnet-faucet) |
| Cronos | Mainnet | Testnet | [Cronos faucet](https://cronos.org/faucet) |
| Gnosis Chain | Mainnet | Chiado Testnet | [Gnosis Chain faucet](https://gnosisfaucet.com) |
| Klaytn | Mainnet | | |
| Celo | Mainnet | | |
| Moonbeam | Mainnet | | |
| Aurora | Mainnet | | [Testnet NEAR faucet](https://near-faucet.io/) |
| Oasis Sapphire | Mainnet | Testnet | |
| Harmony | Mainnet | Testnet | [Harmony faucet](https://faucet.pops.one) |
| Polygon zkEVM | Mainnet | Testnet | |
| Bitcoin | Mainnet | Testnet | [One of the Bitcoin faucets](https://tbtc.bitaps.com) |

# Introduction

Source: https://docs.chainstack.com/docs/protocols-tooling-introduction

Tooling is a collection of guides that provide an extensive overview of the various tools and methods that can be used to interact with different blockchain networks. It encompasses a broad range of tools, from command-line interfaces specific to each protocol to programming libraries such as web3.js, web3.py, and web3.php. Most of the protocols also have instructions for using popular blockchain development frameworks like Truffle, Hardhat, and Remix IDE.

Each protocol's guide is designed to offer step-by-step instructions for each tool, including installation, configuration, and usage examples, such as how to get the balance of an address, query data using GraphQL, or deploy smart contracts using Truffle or Hardhat. It also provides details on how to connect to blockchain nodes using different protocols and endpoints, and how to secure these connections using keys or passwords.
### Run nodes on Chainstack

[Start for free](https://console.chainstack.com/) and get your app to production levels immediately. No credit card required. You can sign up with your GitHub, X, Google, or Microsoft account.

# Protocols

Source: https://docs.chainstack.com/docs/protocols-tutorials

These tutorials provide a basic introduction to the protocols supported by Chainstack, enabling you to instantly dive into the world of Web3 development.

# Quotas

Source: https://docs.chainstack.com/docs/quotas

## Quotas and notifications

Each Chainstack subscription plan has its own quota that represents the allocated resources and usage limits that users can spend within their chosen plan. In addition to the allocated quota, Chainstack also offers the convenience of pay-as-you-go billing for any extra usage beyond the specified limits. This means that if users exceed their allocated quota, they can continue using Chainstack's services seamlessly, with any additional usage automatically charged at the applicable rates.

Visit the Chainstack [pricing page](https://chainstack.com/pricing/) to get familiar with the quota for each plan and the extra usage costs. See [Manage your billing](/docs/manage-your-billing) on how to enable or disable the pay-as-you-go feature.

Quota spending can vary depending on the combination of services that you use. Service cost is reflected in [request units](/docs/pricing-introduction#what-are-request-units).

You will receive at least two email notifications:

* On using up 80% of your plan's quota
* On using up 100%

## Inactivity deactivation and notifications

To optimize platform performance and maintain efficient resource allocation, unused resources are automatically managed according to specific inactivity periods.
### Nodes

* Inactivity period: 30 days
* Notification schedule:
  * First notice: Day 23
  * Second notice: Day 28
  * Final notice: Day 30
* Action: Automatic deletion after 30 days of inactivity

### Organizations

* Inactivity criteria: No active assets within the organization
* Suspension timeline: 60 days
* Notification schedule:
  * First notice: Day 53
  * Second notice: Day 58
  * Final notice: Day 60
* Action: Organization suspension after 60 days

# Ronin: Consensus algorithm

Source: https://docs.chainstack.com/docs/ronin-consensus-algorithm

## TLDR

Ronin runs on a mix of Proof of Authority (PoA) and Delegated Proof of Stake (DPoS) to gradually become more decentralized, with more DPoS and less PoA.

## Consensus mechanism

### Initial setup: Proof of Authority (PoA)

Ronin initially employed PoA, where transactions are verified by pre-selected validators, providing fast but centrally controlled validation.

### Transition to Delegated Proof of Stake (DPoS)

To advance decentralization, Ronin has integrated DPoS, allowing token holders to delegate their stakes to validators, enhancing the democratic and decentralized validation process.

#### DPoS features

* **Validator selection**: Validators are chosen through token holder votes, increasing their stakes and potential to be selected as validators.
* **Validator roles**: Consists of Governing Validators (selected through PoA) and Standard Validators (chosen through DPoS by token stake).
* **Block production**: Validators take turns producing blocks, maintaining elements of PoA for continuity.

### Validator dynamics

* **22 validator slots**: 12 are Governing Validators, and 10 are open to any qualifying Standard Validators.
* **Daily updates**: Validator stakes are updated daily, with the top 10 by stake becoming Standard Validators.
* **Delegation**: Token holders can delegate stakes to influence validator selection.

### Staking and rewards

* **Token**: RON.
* **Minimum stake**: 250,000 RON to become a Validator Candidate.
* **Reward distribution**: Validators earn rewards from transaction fees and a portion of the staking rewards, distributed daily via smart contracts.

### Security measures

* **Clone attack prevention**: Requires a supermajority to confirm transactions, minimizing the risk of block height duplication attacks.
* **Slashing mechanisms**: Penalties for validators engaged in malicious activities, enhancing network integrity.

### Governance and operations

* **Bridge operators**: Manage asset transfers between Ronin and other EVM-compatible chains, with specific rewards and penalties.
* **Maintenance and unavailability**: Systems in place for maintenance modes and slashing for non-performance, ensuring operational continuity.

### Final remarks

Ronin's hybrid consensus approach, combining PoA and DPoS, aims to balance efficiency with increased decentralization, reflecting the evolving landscape of blockchain governance.

# Ronin gaming: Overview of Axie & Pixels

Source: https://docs.chainstack.com/docs/ronin-gaming-overview-of-axie-pixels

Web3 gaming keeps growing and transforming the traditional gaming landscape. The two flagship games currently on Ronin are Pixels and Axie.

### Get your own Ronin node endpoint today

[Start for free](https://console.chainstack.com/) and get your app to production levels immediately. No credit card required. You can sign up with your GitHub, X, Google, or Microsoft account.

## Ronin: A scalable blockchain for Web3 gaming

Ronin is an Ethereum-based sidechain developed specifically to support the needs of blockchain games and decentralized applications. Designed to address the scalability and efficiency challenges often associated with Ethereum, Ronin offers a high-performance infrastructure that is tailor-made for the demands of the gaming industry.

One of the flagship games on the Ronin network is the Pixels game, a unique blockchain-based title that allows players to purchase and own individual pixels on a virtual canvas.
Pixels leverages Ronin's Ethereum Virtual Machine (EVM) compatibility and efficient transaction processing to provide a seamless and cost-effective gameplay experience. ## What the Pixels game brings to the table The Pixels game is a testament to the capabilities of the Ronin blockchain. Employing a play-to-earn model, the game enables players to earn rewards in the form of the native PIXEL token by engaging in various in-game activities, such as exploring the virtual world, building their own environments, and participating in the thriving community. One of the key mechanics of Pixels is its use of non-fungible tokens (NFTs) to represent the individual pixels on the game's canvas. These Pixel NFTs can be bought, sold, and traded by players, creating a vibrant secondary market and empowering users with true digital ownership. The game's decentralized governance model, powered by a Decentralized Autonomous Organization (DAO), further reinforces the principles of Web3 by allowing the community to shape the game's development and future direction. The Pixels game has already made a significant impact on the broader blockchain gaming ecosystem. Its success has demonstrated the potential of Web3 gaming to create engaging and innovative user experiences, while also showcasing the capabilities of the Ronin blockchain. ## What Axie is and what Axie does Axie Infinity has emerged as the leading blockchain-based game, built on the custom-designed Ronin network. Developed by Sky Mavis, Axie Infinity is a play-to-earn game where players breed, battle, and trade digital pets called Axies, earning the game's native token, Smooth Love Potion (SLP), which can be exchanged for other cryptocurrencies or fiat currency. The success of Axie Infinity has been a driving force behind the adoption and growth of the Ronin blockchain. 
As of 2022, Axie Infinity boasts over 2 million active players, making it one of the top blockchain games by user activity and transaction volume on the Ronin network. The growth, performance, and longevity of Axie Infinity on the Ronin blockchain have been impressive. At the start, in Q1 2022, the Ronin network saw over \$2.6 billion in total transaction volume, with more than 560,000 daily active wallets. And Axie has been running ever since.

The success of Axie Infinity on the Ronin blockchain is a testament to the potential of blockchain technology in transforming the gaming industry. By providing scalability, cost-efficiency, and true player ownership of in-game assets, Axie Infinity and the Ronin network have paved the way for a new era of blockchain-powered gaming experiences.

In terms of performance, the Ronin network has consistently delivered on its promise of scalability and efficiency. The blockchain's high transaction throughput and low gas fees have enabled the Pixels and Axie games to handle significant user demand without compromising the user experience.

## Ronin key features

1. EVM compatibility: Ronin is fully EVM-compatible, allowing developers to easily build and deploy Solidity-based decentralized applications on the network.
2. High throughput: Ronin is designed to scale to handle millions of daily active users and transactions, processing over \$4 billion in NFT volumes to date.
3. Near-instant transactions: With fast block times and minimal fees, Ronin enables near-instant transactions, an essential requirement for real-time gaming applications.
4. Optimized for gaming: Ronin's architecture and design are tailored for the unique needs of Web3 games, providing a smooth, frictionless experience for both players and developers.
5. Decentralized governance: The Ronin network is governed by a Decentralized Autonomous Organization (DAO), allowing the community to participate in important decisions regarding the network's development and evolution.

## Conclusion

As evidenced by the user base of Axie and Pixels, the Ronin blockchain ecosystem presents an exciting opportunity for developers to create innovative, blockchain-based games and applications. By understanding the technical features, developer resources, and practical examples, developers can make informed decisions about whether Ronin is the right fit for their project needs. The platform's growing user base, low-cost transactions, and cross-chain compatibility make it a promising option for Web3 developers looking to build the next generation of decentralized applications.

# Ronin methods

Source: https://docs.chainstack.com/docs/ronin-methods

See also [interactive Ronin API call examples](/reference/getting-started-ronin).

| Method | Availability | Comment |
| ---------------------------------------- | ------------ | ------- |
| eth\_accounts | | |
| eth\_blockNumber | | |
| eth\_call | | |
| eth\_chainId | | |
| eth\_estimateGas | | |
| eth\_feeHistory | | |
| eth\_gasPrice | | |
| eth\_getAccount | | |
| eth\_getBalance | | |
| eth\_getBlockByHash | | |
| eth\_getBlockByNumber | | |
| eth\_getBlockReceipts | | |
| eth\_getBlockTransactionCountByHash | | |
| eth\_getBlockTransactionCountByNumber | | |
| eth\_getCode | | |
| eth\_getFilterChanges | | |
| eth\_getFilterLogs | | |
| eth\_getLogs | | |
| eth\_getProof | | |
| eth\_getStorageAt | | |
| eth\_getTransactionByBlockHashAndIndex | | |
| eth\_getTransactionByBlockNumberAndIndex | | |
| eth\_getTransactionByHash | | |
| eth\_getTransactionCount | | |
| eth\_getTransactionReceipt | | |
| eth\_getUncleCountByBlockHash | | |
| eth\_getUncleCountByBlockNumber | | |
| eth\_maxPriorityFeePerGas | | |
| eth\_newBlockFilter | | |
| eth\_newFilter | | |
| eth\_newPendingTransactionFilter | | |
| eth\_signTransaction | | |
| eth\_subscribe | | |
| eth\_syncing | | |
| eth\_uninstallFilter | | |
| eth\_unsubscribe | | |
| eth\_sendRawTransaction | | |
| net\_listening | | |
| net\_peerCount | | |
| net\_version | | |
| txpool\_content | | |
| txpool\_inspect | | |
| txpool\_contentFrom | | |
| txpool\_status | | |
| web3\_clientVersion | | |
| web3\_sha3 | | |
| debug\_getBadBlocks | | |
| debug\_storageRangeAt | | |
| debug\_traceBlock | | |
| debug\_traceBlockByHash | | |
| debug\_traceBlockByNumber | | |
| debug\_traceCall | | |
| debug\_traceTransaction | | |
| admin\_addPeer | | |
| admin\_addTrustedPeer | | |
| admin\_datadir | | |
| admin\_exportChain | | |
| admin\_importChain | | |
| admin\_nodeInfo | | |
| admin\_peerEvents | | |
| admin\_peers | | |
| admin\_removePeer | | |
| admin\_removeTrustedPeer | | |
| admin\_startHTTP | | |
| admin\_startWS | | |
| admin\_stopHTTP | | |
| admin\_stopWS | | |

# Ronin: on-chain meta racing game

Source: https://docs.chainstack.com/docs/ronin-on-chain-meta-racing-game

**TLDR**

* Ronin is an EVM protocol tailored for Web3 gaming, here used to build a meta racing game called *Race the Ronin Chain*.
* Players deposit \$RON, predict parts of block hashes, and win if their predictions match the actual chain blocks over a 50-block race.
* Computation for determining winners is done off-chain for efficiency, with final results submitted on-chain by the contract owner.
* The mechanics showcase a blend of on-chain transparency and off-chain checks, making for a fun yet cost-effective Web3 game experiment.

## Introduction

Ronin is an EVM protocol purpose-built to serve the unique needs of the gaming industry. Developed by Sky Mavis, the creators of Axie Infinity, Ronin stands out for its ability to support online games, specifically in the realm of Web3 gaming. A purpose-built blockchain protocol needs a purpose-built blockchain game — and that's what we created here.
The game is called *Race the Ronin Chain*. Here is the [full game repo on Chainstacklabs](https://github.com/akegaviar/race-ronin-chain).

### NFP

Not for production (NFP), obviously. Feel free to take the source, modify it to your needs, and boost the Web3 gaming ecosystem. We assume no responsibility for the code. Moreover, this is a very rough, unaudited contract.

**TLDR of the game**

It's a *meta* racing game because you run the game on the system you race on. The premise is very simple:

* Anyone can create a race on the contract
* A race is a range of 50 blocks
* A number of players can enter the created race
* When entering a race, the players commit up to 50 three-character hash values, or slots, called *predictions*
* The game mechanic is that if one of the committed three-character values matches within a 64-character block hash in the 50-block race range, the player advances one step forward
* The player that advances the most by the end of the race (i.e. gets the most predictions right) wins
* There's also other stuff: when entering a race, each player pays the entrance fee of 1 \$RON; the winner gets all of the players' \$RON; and the treasury keeps 1% on each win

Interested? Let's run through all the details.

## Prerequisites

* [Chainstack account](https://console.chainstack.com/) to deploy a Ronin node.
* [Foundry](https://getfoundry.sh/) to compile, test, and deploy the contract.
* [web3.py](https://github.com/ethereum/web3.py) for the participation and the winner calculation & submission scripts.

## Quick start

It's all in the repository: [race-ronin-chain](https://github.com/akegaviar/race-ronin-chain).

### Solidity 0.8.19 & PUSH0

Keep the Solidity compiler at 0.8.19 unless the Ronin chain supports PUSH0 by the time you are reading this. Otherwise, you won't be able to deploy the contract.

The steps below assume you have all the prerequisites satisfied.
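The advancement mechanic is easy to express in code. As a minimal sketch (the function name is illustrative, and the exact matching rule is defined by the contract in the repo, not here): given the block hashes of the race range and a player's three-character predictions, count how many predictions appear in at least one hash.

```python
def race_score(block_hashes, predictions):
    """Count a player's correct predictions: a 3-character slot scores
    if it appears as a substring of any block hash in the race range.
    Illustrative only; the authoritative rule lives in the game contract."""
    # Normalize: lowercase and strip the 0x prefix so slots compare cleanly.
    hashes = [
        h.lower()[2:] if h.lower().startswith("0x") else h.lower()
        for h in block_hashes
    ]
    return sum(
        1 for p in predictions
        if any(p.lower() in h for h in hashes)
    )
```

Each scoring slot moves the player one step forward, so the highest `race_score` over the 50-block range wins.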
### Generate the contract ABI

The one in the repository will work, but if you modify the contract, by far the easiest way to generate the ABI is to run:

```shell Shell
forge build --silent && jq '.abi' ./out/RaceRoninChain.sol/RaceRoninChain.json > /root/race-ronin-chain/abi/RaceRoninChain.abi
```

### Deploy the contract

Ronin is not EIP-1559 compatible, so keep the transactions to legacy.

```shell Shell
forge create src/RaceRoninChain.sol:RaceRoninChain --private-key PRIVATE_KEY --constructor-args TREASURY_ADDRESS --legacy
```

where

* PRIVATE\_KEY — your deployer private key. Remember that for the RaceRoninChain contract, the message sender is the owner. This means that you will be submitting the race winner from this deployer account (see later in the article).
* TREASURY\_ADDRESS — the house keeps 1% on each prize distribution, so this is where the \$RON house fee from every race will go.

### Interact with the contract

There are two scripts in the repository in the `python_scripts` directory:

* `enter_race.py` — generates 50 random 3-character predictions for each of the players, starts a race, and enrols the four players in the race
* `compute_stats_and_submit_winner.py` — does the winner calculation and submits the winner

In each of the scripts, check the commented lines and make sure you provide all the necessary variables like keys, addresses, and, of course, Chainstack endpoints for Ronin.

## Game mechanics & other important considerations

There's quite a bit of nuance to this seemingly simple game, so let's do a run-down. Pretty sure there's a lot of uncharted territory too.

The game runs over a course of 50 blocks on the Ronin chain. Players try predicting parts of block hashes that will appear during this span. Each Ronin block comes with a unique hash — a 64-character hexadecimal string. The crux of the game is in forecasting these hash segments correctly.
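Generating the random three-character slots that a script like `enter_race.py` commits can be sketched as follows (the helper name is illustrative, not taken from the repo):

```python
import secrets

HEX_CHARS = "0123456789abcdef"

def random_predictions(count=50, slot_len=3):
    """Generate `count` random hex slots, e.g. ['a3f', '0b1', ...].
    secrets is used so the slots are not trivially reproducible."""
    return [
        "".join(secrets.choice(HEX_CHARS) for _ in range(slot_len))
        for _ in range(count)
    ]
```

Since block hashes are hexadecimal strings, drawing the slots from the hex alphabet keeps every prediction at least theoretically matchable.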
To participate, players commit to the game by executing a transaction to the contract's `enterRace` function at a specified block number. This commitment not only signifies the start of a new race but also allows for a transparent view of the number of participants, the stakes involved in terms of total \$RON, and the kickoff block. When entering, players also submit their wager — the entry fee in \$RON — along with up to 50 three-character predictions of the block hashes that will appear over the race duration.

As the race progresses, players' positions are determined by how accurately their predictions match the actual block hashes within the race's 50-block duration. A correct prediction, matching a segment of a block's hash, propels a player forward in the race. Conversely, incorrect guesses leave a player lagging.

At the race's conclusion, the game tallies each participant's successful predictions against the block hashes that appeared during the race. The player whose predictions align most closely with the actual block hashes wins. In the event of a tie, the game honors the principle of "first come, first served" — the player who first submitted their predictions is declared the winner, rewarding promptness and deterring mimicry (but not front-running — more on that later).

### Off-chain computation & other considerations

#### Off-chain compute

As I'm sure you noticed, the off-chain winner computation & submission is the most glaring thing about this game, so let's explore it. Doing the computation on-chain is prohibitively expensive and simply impossible at scale on any EVM network. This leaves us with the only option: moving the winner compute off-chain.
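The tally and the "first come, first served" tie-break can be sketched like this (a simplified illustration with a hypothetical helper, not the repo's actual implementation):

```python
def pick_winner(entries):
    """Pick the winner from (player, entry_order, score) tuples:
    the highest score wins, and ties go to the earliest entrant,
    i.e. "first come, first served". Illustrative sketch only."""
    # max() with a (score, -entry_order) key prefers higher scores,
    # and among equal scores, the smaller (earlier) entry order.
    return max(entries, key=lambda e: (e[2], -e[1]))[0]
```

With equal scores, the player who entered the race first is declared the winner, which is exactly the mimicry deterrent described above.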
Here's the current implementation:

* the `compute_stats_and_submit_winner.py` script runs and checks whether the latest race is completed
* if completed, the script retrieves the block hashes for all 50 blocks from the start block to the end block of the race
* then the script calculates the winner based on the entered predictions and the actual block hashes
* the script submits the winner from the contract owner (the deployer)
* the contract does an on-chain verification of the submitted results and accepts or reverts the transaction. The verification is a check of whether the submitted winning address & the full string of the submitted predictions are a part of the player set and of this race ID.

In the future, it would be worth moving the off-chain compute to something like [Space and Time's Verifiable Web3 Off-Chain Compute Layer](https://www.spaceandtime.io/blog/verifiable-off-chain-compute-for-smart-contracts) when it's fully available. For now, however, the bet is on keeping the compute component trusted (as opposed to trustless) in a way that if the players and the community detect foul play, they will abandon the project.

#### Front-running & copy-trading

While copy-trading has a bit of protection through the "first come, first served" model (discussed earlier), front-running would still be an issue. On the other hand, these and all the other unexplored mechanics are a part of what makes the game fun.

#### High gas cost

The cost of participation scales with the number of predictions made due to gas fees, which can significantly affect a player's strategy regarding the number of predictions to submit. For example, a 10-slot prediction will cost you ballpark 350k gas, while a full 50-slot prediction will cost you about 1,300k gas. Submitting more slots increases your chances to win the race, though.
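Interpolating linearly between those two ballpark figures (roughly 350k gas at 10 slots and 1,300k gas at 50 slots) gives about 23.75k gas per extra slot over a fixed base. A back-of-the-envelope estimator, purely for strategy planning (the coefficients are derived from the two observations above, not from the contract):

```python
def estimate_entry_gas(slots, per_slot=23_750, base=112_500):
    """Rough linear fit through the two observed ballpark points:
    ~350k gas for 10 slots and ~1,300k gas for 50 slots.
    Real costs depend on the contract, calldata, and network state."""
    return base + per_slot * slots
```

So each extra slot adds roughly the same marginal gas cost, and the trade-off is that marginal cost against the extra chance of a match.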
## Conclusion

Explore both the creator and user gaming ideas in the Ronin ecosystem, and hopefully your project will end up in the [top Ronin games](https://chainplay.gg/chain/ronin/). There might be a lot of opportunity in that segment.

### About the author

Director of Developer Experience @ Chainstack

Talk to me all things Web3

20 years in technology | 8+ years in Web3 full time

Trusted advisor helping developers navigate the complexities of blockchain infrastructure

[](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake)

# Ronin tooling

Source: https://docs.chainstack.com/docs/ronin-tooling

## Truffle

Configure [Truffle Suite](https://truffleframework.com) to deploy contracts to your Ronin nodes.

Install [Truffle Suite](https://truffleframework.com), the [HD Wallet-enabled Web3 provider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider), and create a project.

Create a new environment in `truffle-config.js`, add your mnemonic phrase generated by [a wallet](https://ethereum.org/en/wallets/find-wallet/), and add the Ronin endpoint instead of YOUR\_CHAINSTACK\_ENDPOINT:

```javascript Javascript
const HDWalletProvider = require("@truffle/hdwallet-provider");
const mnemonic = 'pattern enroll upgrade ...';
...
module.exports = {
  networks: {
    chainstack: {
      provider: () => new HDWalletProvider(mnemonic, "YOUR_CHAINSTACK_ENDPOINT"),
      network_id: "*"
    },
  }
};
```

## web3.js

Build DApps using [web3.js](https://github.com/web3/web3.js) and Ronin nodes deployed with Chainstack.

Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP or WebSocket.
### HTTP

Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number:

```javascript Javascript
const Web3 = require('web3');
const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT'));
web3.eth.getBlockNumber().then(console.log);
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password.

### WebSocket

Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number:

```javascript Javascript
const Web3 = require('web3');
const web3 = new Web3(new Web3.providers.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT'));
web3.eth.getBlockNumber().then(console.log);
```

where YOUR\_CHAINSTACK\_ENDPOINT is your node WSS endpoint protected either with the key or password.

## web3.py

Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Ronin nodes deployed with Chainstack.

Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket).

### HTTP

Use the `HTTPProvider` to connect to your node endpoint and get the latest block number.

```python Key Protected
from web3 import Web3

web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT'))
print(web3.eth.block_number)
```

```python Password Protected
from web3 import Web3

web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s' % ("USERNAME", "PASSWORD", "HOSTNAME")))
print(web3.eth.block_number)
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password
* HOSTNAME — your node HTTPS endpoint hostname
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).
### WebSocket

Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number.

```python Key Protected
from web3 import Web3

web3 = Web3(Web3.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT'))
print(web3.eth.block_number)
```

```python Password Protected
from web3 import Web3

web3 = Web3(Web3.WebsocketProvider('wss://%s:%s@%s' % ("USERNAME", "PASSWORD", "HOSTNAME")))
print(web3.eth.block_number)
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password
* HOSTNAME — your node WSS endpoint hostname
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)

See also [WebSocket connection to an EVM node](https://support.chainstack.com/hc/en-us/articles/900001918763-WebSocket-connection-to-an-Ethereum-node).

## ethers.js

Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Ronin nodes deployed with Chainstack.

Install [ethers.js](https://www.npmjs.com/package/ethers). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket).
### HTTP

Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number:

```javascript Key Protected
const { ethers } = require("ethers");

var urlInfo = {
  url: 'YOUR_CHAINSTACK_ENDPOINT'
};
var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID);
provider.getBlockNumber().then(console.log);
```

```javascript Password Protected
const { ethers } = require("ethers");

var urlInfo = {
  url: 'YOUR_CHAINSTACK_ENDPOINT',
  user: 'USERNAME',
  password: 'PASSWORD'
};
var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID);
provider.getBlockNumber().then(console.log);
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password
* USERNAME — your node access username (for password-protected endpoints)
* PASSWORD — your node access password (for password-protected endpoints)
* NETWORK\_ID — Ronin network ID:
  * Ronin Mainnet: `2020`
  * Saigon Testnet: `2021`

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).

### WebSocket

Use the `WebSocketProvider` object to connect to your node WSS endpoint and get the latest block number:

```javascript Javascript
const { ethers } = require("ethers");

const provider = new ethers.providers.WebSocketProvider('YOUR_CHAINSTACK_ENDPOINT', NETWORK_ID);
provider.getBlockNumber().then(console.log);
```

where

* YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password
* NETWORK\_ID — Ronin network ID:
  * Ronin Mainnet: `2020`
  * Saigon Testnet: `2021`

See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).

# Ronin: Make a game's smart contract

Source: https://docs.chainstack.com/docs/ronin-tutorial-making-a-game-contract

**TLDR:**

* Ronin is an EVM-compatible chain optimized for large-scale Web3 gaming, offering streamlined gameplay and robust security.
* This tutorial walks you through deploying a Ronin node on the Saigon Testnet and creating a Hardhat-based smart contract for a simple blockchain game. * The contract accepts deposits, verifies if players can participate, and pays out winners while retaining funds when players lose—illustrating a basic on-chain gaming model. * You’ll learn to configure Hardhat, manage environment variables, and deploy your contract to Ronin for full transparency and fair gameplay. ## Introduction to Ronin Ronin, an Ethereum Virtual Machine (EVM) compatible blockchain, is purpose-built to serve the unique needs of the gaming industry. Developed by Sky Mavis, the creators of Axie Infinity, Ronin stands out for its ability to support large-scale online games, specifically in the realm of Web3 gaming. ### Key technical features of Ronin * **Optimized for gaming**: Ronin is designed to streamline the gaming experience by removing the complexities commonly found in other blockchains. This results in a platform that is efficient, with minimal spam and optimized uptime for games. * **Security enhancements**: in response to security challenges faced in the past, Ronin has undergone extensive security overhauls. These include rigorous internal security protocols, comprehensive code reviews, and architecture audits to ensure robust security measures are in place. ### Consensus mechanisms in Ronin * **Proof-of-authority (PoA)** — Ronin initially utilized the PoA consensus mechanism. In this system, a select group of validators, trusted for their expertise and reputation, were responsible for maintaining the network. This approach facilitated faster transaction speeds and lower fees due to its energy-efficient design. * **Transition to delegated-proof-of-stake (DPoS)** — Ronin integrated the DPoS consensus mechanism to advance decentralization. This allowed broader participation in the network's maintenance, where anyone holding enough RON tokens could become a validator. 
While retaining the benefits of PoA, such as efficiency and low costs, this shift markedly improved the blockchain's decentralization. Check out the [Ronin docs](https://docs.roninchain.com/basics/introduction) to learn more. Since Ronin is designed for game development, today we'll make a smart contract that can handle the game logic. In this tutorial, you will: * Deploy a Ronin node on the Saigon Testnet. * Create the game smart contract with Hardhat. * Deploy the smart contract with Hardhat. ## Prerequisites * [Chainstack account](https://console.chainstack.com/) to deploy a Ronin node. * [Node.js](https://nodejs.org/en/) as the JavaScript runtime. * [Hardhat](https://hardhat.org/hardhat-runner/docs/getting-started#overview) to create, deploy, and interact with contracts. ## Overview This tutorial guides you through the process of developing a smart contract for a blockchain-based game specifically tailored for deployment on the Ronin Saigon Testnet. We aim to develop a versatile and robust smart contract capable of managing the core game logic on the blockchain. The game itself is designed to be played on the client side, typically within a web browser. Players will have the ability to connect their Ronin wallet and deposit a specified fee to begin gameplay. The smart contract plays a pivotal role in the gaming experience: it securely handles the deposit and, depending on the game's outcome, executes the payout. To get from zero to a working game, do the following: With Chainstack, create a public chain project. With Chainstack, join the Ronin Saigon Testnet. With Chainstack, access your Ronin node endpoint. With Hardhat, create and set up the project. ## Step-by-step ### Create a public chain project See [Create a project](/docs/manage-your-project#create-a-project). ### Join the Ronin Saigon testnet See [Join a public network](/docs/manage-your-networks#join-a-public-network).
### Get your Ronin node endpoint See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials). ### Fund your wallet Before diving into the game project, make sure to top up your wallet with testnet RON. * Install the [Ronin wallet](https://wallet.skymavis.com/). * Use the [Ronin faucet](https://faucet.roninchain.com/). ### Install Hardhat See [Installing Hardhat](https://hardhat.org/hardhat-runner/docs/getting-started#installation). ### Install dotenv Install the `dotenv` package to securely manage environment variables. ```shell Shell npm i dotenv ``` ### Create a Hardhat project Create a new directory for your project, then run the following from a terminal: ```shell Shell npx hardhat ``` This will launch the Hardhat CLI, prompting you to choose a starter project. For this project, answer **yes** to the following: * `Create a JavaScript project` * `Do you want to install this sample project's dependencies with npm (hardhat @nomicfoundation/hardhat-toolbox)?` ### Edit the Hardhat configuration file You will find a file named `hardhat.config.js` in the root directory. This file configures various settings for your Hardhat projects, such as the network you want to deploy your contracts on, the compilers you want to use, and the plugins you want to enable. Delete the default code in the file and replace it with the following: ```javascript hardhat.config.js require("@nomicfoundation/hardhat-toolbox"); require("dotenv").config(); const RONIN_SAIGON_CHAINSTACK = process.env.RONIN_SAIGON_CHAINSTACK; const PRIVATE_KEY = process.env.RONIN_PRIVATE_KEY; /** @type import('hardhat/config').HardhatUserConfig */ module.exports = { networks: { saigon: { url: RONIN_SAIGON_CHAINSTACK, accounts: [PRIVATE_KEY], }, }, solidity: "0.8.23", }; ``` Let's break down what each part of the file does: * `require("@nomicfoundation/hardhat-toolbox");` imports the Hardhat Toolbox plugin, which provides several useful tools and utilities for Hardhat projects. 
* `require("dotenv").config();` loads environment variables from a `.env` file using the `dotenv` package. * `module.exports = { ... }` exports a JavaScript object containing the configuration for the Hardhat project. * `solidity: "0.8.23",` sets the Solidity compiler version to 0.8.23. * `networks: { ... }` defines the network configurations for the Hardhat project. In this case, it defines a network called `saigon` that connects to the Ronin Saigon blockchain network. * `saigon: { ... }` defines the configuration for the `saigon` network. * `url: RONIN_SAIGON_CHAINSTACK,` sets the URL for the Saigon network using the `RONIN_SAIGON_CHAINSTACK` environment variable. * `accounts: [PRIVATE_KEY],` sets the accounts for the `saigon` network using the `PRIVATE_KEY` environment variable. This will allow the Hardhat project to deploy contracts and interact with the Saigon network using the specified private key. ### Create the game-handling smart contract In the root directory, you will find a directory named `contracts` with a sample contract in it. Rename this contract to `Game.sol` and replace its code with the following: ```sol Game.sol // SPDX-License-Identifier: MIT pragma solidity ^0.8.23; /// @title Game Contract /// @notice This contract allows players to deposit ETH to play a game and handles game results and payouts. 
contract Game { /// @notice Minimum deposit amount required to play the game uint256 public constant MINIMUM_DEPOSIT = 1 ether; /// @notice Maximum deposit amount allowed for the game uint256 public constant MAXIMUM_DEPOSIT = 2 ether; /// @notice Mapping to track user deposits mapping(address => uint256) public deposits; /// @notice Owner address address private owner; /// @notice Sets the contract deployer as the owner constructor() payable { owner = msg.sender; } /// @notice Ensures that only the owner can call the function modifier onlyOwner() { require(msg.sender == owner, "Only the owner can call this function"); _; } /// @dev Deposits ETH to the contract to play the game /// @notice Deposits must be between 1 and 2 Ethers /// @notice Contract must have enough funds to pay potential winnings function deposit() public payable { require(msg.value >= MINIMUM_DEPOSIT && msg.value <= MAXIMUM_DEPOSIT, "Deposit must be between 1 and 2 Ethers"); require(address(this).balance + msg.value >= MAXIMUM_DEPOSIT * 2, "Contract does not have enough funds to cover potential winnings"); deposits[msg.sender] += msg.value; } /// @notice Checks if the user has enough deposit to play /// @param user The address of the user to check /// @return True if the user has enough deposit, false otherwise function canPlay(address user) public view returns (bool) { return deposits[user] >= MINIMUM_DEPOSIT; } /// @notice Handles the game result and processes payouts or deposit retention /// @dev Only callable by the owner /// @param player The address of the player /// @param userWon Indicates whether the player won or not function gameResult(address player, bool userWon) public onlyOwner { require(canPlay(player), "Player did not deposit enough to play"); uint256 depositAmount = deposits[player]; if (userWon) { require(address(this).balance >= depositAmount * 2, "Contract does not have enough funds"); payable(player).transfer(depositAmount * 2); } else { // Keep the deposit in the contract if AI wins } deposits[player] = 0; } /// @notice Allows the owner to withdraw all funds from the contract /// @dev Only callable by the owner function withdraw() public onlyOwner { uint256 balance = address(this).balance; require(balance > 0, "No funds to withdraw"); payable(owner).transfer(balance); } // Additional functions can be added here } ``` ### Default logic Keep in mind that this smart contract is a proof of concept and should not be used in production as is. A security audit is highly recommended. ## Understanding the smart contract This smart contract, designed for a blockchain-based game, operates on a simple yet effective mechanism. It allows players to deposit RON (or the native currency of the chain you are using) within a specified range—a minimum of 1 ether and a maximum of 2 ethers—to participate in the game. This range ensures fair play and manages the contract's ability to pay winnings. The contract tracks these deposits against each player's address, maintaining a balance reflecting their current game stake. Let's break down each element and function of the smart contract. ### Contract overview * **Contract name** — `Game` * **Purpose** — this contract allows players to deposit ETH to play a game, and it manages the game results and payouts. ### Constants 1. **MINIMUM\_DEPOSIT** * **Type** — `uint256` (unsigned integer) * **Purpose** — specifies the minimum amount of ether a player must deposit to play the game. * **Value** — 1 ether 2. **MAXIMUM\_DEPOSIT** * **Type** — `uint256` * **Purpose** — indicates the maximum amount of ether a player can deposit. * **Value** — 2 ethers ### State variables 1. **deposits** * **Type** — `mapping(address => uint256)` * **Purpose** — keeps track of the amount of ether each player (address) has deposited. 2. **owner** * **Type** — `address` * **Purpose** — stores the address of the contract owner, who has special privileges (like executing the `gameResult` and `withdraw` functions).
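Before walking through the remaining pieces, it can help to see the deposit and payout rules as a whole. The sketch below is a simplified off-chain model in Python — an illustration only, not the Solidity contract; the class name and the seeded balance are our own choices:

```python
# Off-chain sketch of the Game contract's rules (illustration only, not the
# Solidity code). Amounts are in wei; the contract must be seeded with enough
# funds at deployment to cover a double payout.
WEI_PER_ETHER = 10**18
MINIMUM_DEPOSIT = 1 * WEI_PER_ETHER
MAXIMUM_DEPOSIT = 2 * WEI_PER_ETHER

class GameModel:
    def __init__(self, seeded_balance=0):
        self.balance = seeded_balance  # models address(this).balance
        self.deposits = {}             # models the deposits mapping

    def deposit(self, player, amount):
        if not MINIMUM_DEPOSIT <= amount <= MAXIMUM_DEPOSIT:
            raise ValueError("Deposit must be between 1 and 2 ethers")
        # Mirrors the contract's solvency check before accepting a stake
        if self.balance + amount < MAXIMUM_DEPOSIT * 2:
            raise ValueError("Contract cannot cover potential winnings")
        self.balance += amount
        self.deposits[player] = self.deposits.get(player, 0) + amount

    def can_play(self, player):
        return self.deposits.get(player, 0) >= MINIMUM_DEPOSIT

    def game_result(self, player, user_won):
        if not self.can_play(player):
            raise ValueError("Player did not deposit enough to play")
        stake = self.deposits[player]
        if user_won:
            # Winner is paid double their stake out of the contract balance
            self.balance -= stake * 2
        self.deposits[player] = 0  # losing stakes stay in the contract
```

For instance, a model seeded with 4 ether accepts a 1 ether deposit, and a win pays out 2 ether, leaving the player's recorded deposit at zero.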
### Constructor * **Functionality** — sets the deployer of the contract as the `owner`. ### Modifiers 1. **onlyOwner** * **Purpose** — restricts the execution of certain functions to only the contract owner. ### Functions 1. **deposit** * **Access** — public * **Payment Type** — payable (can receive ether) * **Purpose** — allows players to deposit ETH within the allowed range (1 to 2 ethers). It also ensures the contract has enough funds to cover potential winnings. * **Logic** — updates the `deposits` mapping with the player's deposit amount. 2. **canPlay** * **Access** — public * **Purpose** — checks if a user has deposited enough ETH to play the game. * **Parameters** — `user` (address of the player) * **Returns** — `bool` (`True` if the player has enough deposit, `False` otherwise) 3. **gameResult** * **Access** — public, but restricted to `onlyOwner` * **Purpose** — processes the outcome of the game. It either pays out double the deposit to the player if they win or retains the deposit in the contract if they lose. * **Parameters**: * `player` — address of the player * `userWon` — boolean indicating whether the player won or not * **Logic** — if the player wins, it transfers double the deposit amount to them and resets their deposit to zero. If the player loses, just resets their deposit. 4. **withdraw** * **Access** — public, but restricted to `onlyOwner` * **Purpose** — allows the owner to withdraw all ETH stored in the contract. * **Logic** — transfers the entire contract balance to the owner's address. ### TL;DR * Players can participate in the game by depositing a certain amount of ether (between 1 and 2 ethers). * The contract ensures fairness and readiness for payouts before accepting deposits. * After the game ends, the result is communicated to the contract. Winners receive double their stake, while the stakes of those who lose remain with the contract. 
* Only the contract owner can process game results and withdraw funds from the contract, ensuring controlled and secure operations. ### Environment variables In the root directory of the Hardhat project, create a `.env` file for your endpoint and private keys: ``` RONIN_SAIGON_CHAINSTACK="YOUR_CHAINSTACK_RONIN_ENDPOINT" RONIN_PRIVATE_KEY="YOUR_RONIN_WALLET_PRIVATE_KEY" ``` ### Create the deployment script In the `scripts` directory inside the root of your project, you will find a file named `deploy.js`. Replace its content with the following: ```javascript deploy.js const hre = require("hardhat"); async function main() { console.log("Deploying contract..."); const GameContract = await hre.ethers.deployContract("Game"); // Deploy the contract. await GameContract.waitForDeployment(); console.log("Contract deployed to:", GameContract.target); const roninAddress = GameContract.target.substring(2); console.log( `Find the contract at https://saigon-app.roninchain.com/address/ronin:${roninAddress}` ); } // We recommend this pattern to be able to use async/await everywhere // and properly handle errors. main().catch((error) => { console.error(error); process.exitCode = 1; }); ``` This code is a script that deploys a `Game` smart contract. Here's a breakdown of what each part of the script does: 1. **Import Hardhat runtime environment (HRE)**: ```javascript Javascript const hre = require("hardhat"); ``` This line imports the Hardhat runtime environment, which provides various utilities for working with Ethereum, such as deploying contracts. 2. **Main function**: ```javascript Javascript async function main() { console.log("Deploying contract..."); const GameContract = await hre.ethers.deployContract("Game"); // Deploy the contract.
await GameContract.waitForDeployment(); console.log("Contract deployed to:", GameContract.target); const roninAddress = GameContract.target.substring(2); console.log( `Find the contract at https://saigon-app.roninchain.com/address/ronin:${roninAddress}` ); } ``` The `main` function is an asynchronous function where the main logic of the script is executed. 3. **Starting deployment process**: ```javascript Javascript console.log("Deploying contract..."); ``` This line prints a message to the console indicating that the deployment process is starting. 4. **Deploying the contract**: ```javascript Javascript const GameContract = await hre.ethers.deployContract("Game"); ``` This line uses Hardhat's `ethers` plugin to deploy a contract named `Game`. The `await` keyword is used because `deployContract` is an asynchronous operation. 5. **Waiting for deployment completion**: ```javascript Javascript await GameContract.waitForDeployment(); ``` This line waits for the contract deployment to be completed. It's important to wait for the deployment to finish before proceeding. 6. **Logging the deployed contract address**: ```javascript Javascript console.log("Contract deployed to:", GameContract.target); ``` After the contract is successfully deployed, this line logs the address of the deployed contract to the console. 7. **Removing the `0x` prefix from the address**: ```javascript Javascript const roninAddress = GameContract.target.substring(2); ``` This line removes the `0x` prefix from the Ethereum address using the `substring` method, as the Ronin explorer uses this format: `https://saigon-app.roninchain.com/address/ronin:49a1EA88e5F81850DE30Dc038c1d08028ecFc9b5`. 8. **Providing the contract address on Ronin explorer**: ```javascript Javascript console.log( `Find the contract at https://saigon-app.roninchain.com/address/ronin:${roninAddress}` ); ``` This line constructs a URL to view the contract on the Ronin blockchain explorer and logs it to the console.
It appends the modified address to the explorer's URL. 9. **Error handling**: ```javascript Javascript main().catch((error) => { console.error(error); process.exitCode = 1; }); ``` This part of the script ensures that if any errors occur during the execution of the `main` function, they are caught and printed to the console, and the script exits with an error code. ### Deploy the smart contract To deploy the `Game` contract, run the following command in the terminal: ``` npx hardhat run --network saigon scripts/deploy.js ``` This will deploy the contract on Ronin Saigon Testnet displaying something similar to the following: ``` Deploying contract... Contract deployed to: 0x49a1EA88e5F81850DE30Dc038c1d08028ecFc9b5 Find the contract at https://saigon-app.roninchain.com/address/ronin:49a1EA88e5F81850DE30Dc038c1d08028ecFc9b5 ``` You can now find the contract on the [Saigon Explorer](https://saigon-app.roninchain.com/). You can also find the transactions from the Ronin wallet. ## Next steps Now you have a working smart contract deployed, the next step will be to build a front end with your game and wallet interaction. ## Conclusion In this comprehensive tutorial, we journeyed through the exciting world of blockchain-based game development on the Ronin blockchain, an EVM-compatible platform optimized for gaming. From setting up a node on the Ronin Saigon testnet to deploying a game-centric smart contract using Hardhat, we've laid down a robust foundation for blockchain game developers. The key takeaway from this tutorial is the seamless integration of blockchain technology into gaming. By deploying a smart contract on Ronin, we have created a system that enhances the gaming experience and ensures secure and fair gameplay. The ability to handle in-game financial transactions directly on the blockchain, including player deposits and payouts, showcases the power and versatility of smart contracts in gaming environments. 
### About the author Director of Developer Experience @ Chainstack Talk to me all things Web3 20 years in technology | 8+ years in Web3 full time Trusted advisor helping developers navigate the complexities of blockchain infrastructure [](https://github.com/akegaviar/) [](https://twitter.com/akegaviar) [](https://www.linkedin.com/in/ake/) [](https://warpcast.com/ake) # Scroll methods Source: https://docs.chainstack.com/docs/scroll-methods | Method | Availability | Comment | | ---------------------------------------- | --------------------------------------------- | ------- | | eth\_accounts | | | | eth\_blockNumber | | | | eth\_call | | | | eth\_chainId | | | | eth\_estimateGas | | | | eth\_feeHistory | | | | eth\_gasPrice | | | | eth\_getAccount | | | | eth\_getBalance | | | | eth\_getBlockByHash | | | | eth\_getBlockByNumber | | | | eth\_getBlockReceipts | | | | eth\_getBlockTransactionCountByHash | | | | eth\_getBlockTransactionCountByNumber | | | | eth\_getCode | | | | eth\_getFilterChanges | | | | eth\_getFilterLogs | | | | eth\_getLogs | | | | eth\_getProof | | | | eth\_getStorageAt | | | | eth\_getTransactionByBlockHashAndIndex | | | | eth\_getTransactionByBlockNumberAndIndex | | | | eth\_getTransactionByHash | | | | eth\_getTransactionCount | | | | eth\_getTransactionReceipt | | | | eth\_getUncleCountByBlockHash | | | | eth\_getUncleCountByBlockNumber | | | | eth\_maxPriorityFeePerGas | | | | eth\_newBlockFilter | | | | eth\_newFilter | | | | eth\_newPendingTransactionFilter | | | | eth\_signTransaction | | | | eth\_subscribe | | | | eth\_syncing | | | | eth\_uninstallFilter | | | | eth\_unsubscribe | | | | eth\_sendRawTransaction | | | | net\_listening | | | | net\_peerCount | | | | net\_version | | | | web3\_clientVersion | | | | web3\_sha3 | | | | debug\_getBadBlocks | | | | debug\_storageRangeAt | | | | debug\_traceBlock | | | | debug\_traceBlockByHash | | | | debug\_traceBlockByNumber | | | | debug\_traceCall | | | | 
debug\_traceTransaction | | | | scroll\_estimateL1DataFee | | | | admin\_addPeer | | | | admin\_addTrustedPeer | | | | admin\_datadir | | | | admin\_exportChain | | | | admin\_importChain | | | | admin\_nodeInfo | | | | admin\_peerEvents | | | | admin\_peers | | | | admin\_removePeer | | | | admin\_removeTrustedPeer | | | | admin\_startHTTP | | | | admin\_startWS | | | | admin\_stopHTTP | | | | admin\_stopWS | | | # Scroll tooling Source: https://docs.chainstack.com/docs/scroll-tooling ## Geth Interact with your Scroll node using [Geth](https://geth.ethereum.org/docs/getting-started). Install [Geth](https://github.com/ethereum/go-ethereum). Use the `geth attach` command with the node endpoint. ```shell Shell geth attach YOUR_CHAINSTACK_ENDPOINT ``` where YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). Invoke any methods from [Web3 JavaScript API](https://web3js.readthedocs.io/). The example below demonstrates how to get the balance of an address in wei value and convert it to ether value: ```js Javascript > web3.fromWei(web3.eth.getBalance("0xde0b295669a9fd93d5f28d9ec85e40f4cb697bae")) 642538.078574759898951277 ``` ## GraphQL You can use GraphQL on a dedicated node on the [paid plans](https://chainstack.com/pricing/). ### UI You can query data using the graphical interface. On Chainstack, navigate to your dedicated Scroll node. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). Hover over **GraphQL IDE URL** and click **Open**. In the graphical interface that opens, run a GraphQL query.
Example to get the latest block number: ```graphql GraphQL { block { number } } ``` ### Node.js You can build a web app to query data using node.js and [axios](https://www.npmjs.com/package/axios): ```javascript Javascript const axios = require('axios'); const main = async () => { try { const result = await axios.post( 'YOUR_CHAINSTACK_ENDPOINT', { query: ` { block { number } } ` } ); console.log(result.data); } catch(error) { console.error(error); } } main(); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node GraphQL endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). * `query` — your GraphQL query. In this case, to get the latest block number. See also [Using GraphQL with EVM-compatible nodes](https://support.chainstack.com/hc/en-us/articles/4409604331161-Using-GraphQL-with-EVM-compatible-nodes). ## MetaMask On [node access details](/docs/manage-your-node#view-node-access-and-credentials), click **Add to MetaMask**. ## Truffle Configure [Truffle Suite](https://truffleframework.com) to deploy contracts to your Scroll nodes. Install [Truffle Suite](https://truffleframework.com), [HD Wallet-enabled Web3 provider](https://github.com/trufflesuite/truffle/tree/develop/packages/hdwallet-provider), and create a project. Create a new environment in `truffle-config.js`, add your mnemonic phrase generated by [a wallet](https://ethereum.org/en/wallets/find-wallet/) and the Scroll endpoint instead of YOUR\_CHAINSTACK\_ENDPOINT: ```javascript Javascript const HDWalletProvider = require("@truffle/hdwallet-provider"); const mnemonic = 'pattern enroll upgrade ...'; ... module.exports = { networks: { chainstack: { provider: () => new HDWalletProvider(mnemonic, "YOUR_CHAINSTACK_ENDPOINT"), network_id: "*" }, } } }; ``` ## Hardhat Configure [Hardhat](https://hardhat.org/) to deploy contracts and interact through your Scroll nodes. Install [Hardhat](https://hardhat.org/) and create a project. 
Create a new environment in `hardhat.config.js`: ```javascript Javascript require("@nomiclabs/hardhat-waffle"); ... module.exports = { solidity: "0.7.3", networks: { chainstack: { url: "YOUR_CHAINSTACK_ENDPOINT", accounts: ["YOUR_PRIVATE_KEY"] }, } }; ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password. See [node access details](/docs/manage-your-node#view-node-access-and-credentials). * YOUR\_PRIVATE\_KEY — the private key of the account that you use to deploy the contract Run `npx hardhat run scripts/deploy.js --network chainstack` and Hardhat will deploy using Chainstack. See also [Forking EVM-compatible mainnet with Hardhat](https://support.chainstack.com/hc/en-us/articles/900004242406). ## Remix IDE To make Remix IDE interact with the network through a Chainstack node: Get [MetaMask](https://metamask.io/) and set it to interact through a Chainstack node. See [Interacting through MetaMask](#metamask). In Remix IDE, navigate to the **Deploy** tab. Select **Injected Provider - MetaMask** in **Environment**. This will engage MetaMask and make Remix IDE interact with the network through a Chainstack node. For a detailed tutorial with Remix IDE, see [Trust fund account with Remix](/docs/ethereum-tutorial-trust-fund-account-with-remix). ## web3.js Build DApps using [web3.js](https://github.com/ethereum/web3.js/) and Scroll nodes deployed with Chainstack. Install [web3.js](https://web3js.readthedocs.io/). Connect over HTTP or WebSocket. 
### HTTP Use the `HttpProvider` object to connect to your node HTTPS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.HttpProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password ### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const Web3 = require('web3'); const web3 = new Web3(new Web3.providers.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')); web3.eth.getBlockNumber().then(console.log); ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node WSS endpoint protected either with the key or password ## web3.py Build DApps using [web3.py](https://github.com/ethereum/web3.py) and Scroll nodes deployed with Chainstack. Install [web3.py](https://web3py.readthedocs.io/). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `HTTPProvider` to connect to your node endpoint and get the latest block number: ```python Key Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.blockNumber) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.HTTPProvider('https://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.blockNumber) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * HOSTNAME — your node HTTPS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). 
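The `'https://%s:%s@%s'` pattern above inlines credentials into the endpoint URL. If a username or password contains characters such as `@` or `/`, they must be percent-encoded first; here is a small helper sketch (the function name is ours, not part of web3.py):

```python
from urllib.parse import quote

# Hypothetical helper: assemble a password-protected endpoint URL,
# percent-encoding the credentials so special characters don't break the URL.
def basic_auth_endpoint(username, password, hostname, scheme="https"):
    return "%s://%s:%s@%s" % (
        scheme,
        quote(username, safe=""),
        quote(password, safe=""),
        hostname,
    )
```

For example, `basic_auth_endpoint("user", "p@ss", "example.com")` yields `https://user:p%40ss@example.com`; pass `scheme="wss"` for the WebSocket form.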
### WebSocket Use the `WebsocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```python Key Protected from web3 import Web3 web3 = Web3(Web3.WebsocketProvider('YOUR_CHAINSTACK_ENDPOINT')) print(web3.eth.blockNumber) ``` ```python Password Protected from web3 import Web3 web3 = Web3(Web3.WebsocketProvider('wss://%s:%s@%s'% ("USERNAME", "PASSWORD", "HOSTNAME"))) print(web3.eth.blockNumber) ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password * HOSTNAME — your node WSS endpoint hostname * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). See also [WebSocket connection to an EVM node](https://support.chainstack.com/hc/en-us/articles/900001918763-WebSocket-connection-to-an-Ethereum-node). ## web3j Build DApps using [web3j](https://github.com/web3j/web3j) and Scroll nodes deployed with Chainstack. Use the `HttpService` object to connect to your node endpoint. 
Example to get the latest block number: ```java Java package getLatestBlock; import java.io.IOException; import java.util.logging.Level; import java.util.logging.Logger; import org.web3j.protocol.Web3j; import org.web3j.protocol.core.DefaultBlockParameterName; import org.web3j.protocol.core.methods.response.EthBlock; import org.web3j.protocol.exceptions.ClientConnectionException; import org.web3j.protocol.http.HttpService; import okhttp3.Authenticator; import okhttp3.Credentials; import okhttp3.OkHttpClient; import okhttp3.Request; import okhttp3.Response; import okhttp3.Route; public final class App { private static final String USERNAME = "USERNAME"; private static final String PASSWORD = "PASSWORD"; private static final String ENDPOINT = "ENDPOINT"; public static void main(String[] args) { try { OkHttpClient.Builder clientBuilder = new OkHttpClient.Builder(); clientBuilder.authenticator(new Authenticator() { @Override public Request authenticate(Route route, Response response) throws IOException { String credential = Credentials.basic(USERNAME, PASSWORD); return response.request().newBuilder().header("Authorization", credential).build(); } }); HttpService service = new HttpService(ENDPOINT, clientBuilder.build(), false); Web3j web3 = Web3j.build(service); EthBlock.Block latestBlock = web3.ethGetBlockByNumber(DefaultBlockParameterName.LATEST, false).send().getBlock(); System.out.println("Latest Block: #" + latestBlock.getNumber()); } catch (IOException | ClientConnectionException ex) { Logger.getLogger(App.class.getName()).log(Level.SEVERE, null, ex); } } } ``` where * ENDPOINT — your node HTTPS endpoint protected either with the key or password * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) See also [the full code on GitHub](https://github.com/chainstack/web3j-getLatestBlock).
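For reference, the web3j call above boils down to a single JSON-RPC request that any HTTP client can POST to your endpoint. The sketch below shows the payload and how to decode the hex-encoded block number from the response; the network call itself is omitted:

```python
import json

# Sketch: the JSON-RPC payload behind a "get latest block" call.
def latest_block_request(request_id=1):
    return {
        "jsonrpc": "2.0",
        "method": "eth_getBlockByNumber",
        "params": ["latest", False],  # False: return transaction hashes only
        "id": request_id,
    }

# Quantities in node responses are 0x-prefixed hex strings.
def decode_quantity(hex_value):
    return int(hex_value, 16)
```

POST `json.dumps(latest_block_request())` to the endpoint; `decode_quantity(block["number"])` then gives the block height as an integer.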
## ethers.js Build DApps using [ethers.js](https://github.com/ethers-io/ethers.js/) and Scroll nodes deployed with Chainstack. Install [ethers.js](https://www.npmjs.com/package/ethers). Connect over HTTP or WebSocket. See also [EVM node connection: HTTP vs WebSocket](https://support.chainstack.com/hc/en-us/articles/900002187586-Ethereum-node-connection-HTTP-vs-WebSocket). ### HTTP Use the `JsonRpcProvider` object to connect to your node endpoint and get the latest block number: ```javascript Key Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` ```javascript Password Protected const { ethers } = require("ethers"); var urlInfo = { url: 'YOUR_CHAINSTACK_ENDPOINT', user: 'USERNAME', password: 'PASSWORD' }; var provider = new ethers.providers.JsonRpcProvider(urlInfo, NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password * USERNAME — your node access username (for password-protected endpoints) * PASSWORD — your node access password (for password-protected endpoints) * NETWORK\_ID — Scroll network ID: * Testnet: `534351` See also [node access details](/docs/manage-your-node#view-node-access-and-credentials). ### WebSocket Use the `WebSocketProvider` object to connect to your node WSS endpoint and get the latest block number: ```javascript Javascript const { ethers } = require("ethers"); const provider = new ethers.providers.WebSocketProvider('YOUR_CHAINSTACK_ENDPOINT', NETWORK_ID); provider.getBlockNumber().then(console.log); ``` where * YOUR\_CHAINSTACK\_ENDPOINT — your node WSS endpoint protected either with the key or password * NETWORK\_ID — Scroll network ID: * Testnet: `534351` See also [node access details](/docs/manage-your-node#view-node-access-and-credentials).
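Whichever library you connect with, it's worth confirming that the endpoint's chain ID matches the Scroll network you expect before sending transactions. A minimal check sketch, assuming the 0x-prefixed hex string that `eth_chainId` returns (the mapping key is illustrative, not an official identifier):

```python
# Illustrative mapping: the Scroll testnet network ID from this page.
SCROLL_NETWORK_IDS = {
    "scroll-sepolia-testnet": 534351,
}

def matches_network(chain_id_hex, network):
    # eth_chainId returns a 0x-prefixed hex quantity, e.g. "0x8274f"
    return int(chain_id_hex, 16) == SCROLL_NETWORK_IDS[network]
```

This catches the common mistake of pointing a testnet script at a mainnet endpoint (or vice versa) before any funds move.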
## Brownie

Install [Brownie](https://eth-brownie.readthedocs.io/en/stable/install.html).

Use the `brownie networks add` command with the node endpoint. For example, Scroll Sepolia Testnet:

```shell Shell
brownie networks add Scroll ID name="NETWORK_NAME" host=YOUR_CHAINSTACK_ENDPOINT chainid=NETWORK_ID
```

where

* ID — any name that you will use as the network tag to run a deployment. For example, `chainstack-testnet`.
* NETWORK\_NAME — any name that you want to identify the network by in the list of networks. For example, **Testnet (Chainstack)**.
* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS or WSS endpoint protected either with the key or password
* NETWORK\_ID — Scroll network ID:
  * Testnet: `534351`

Example to run the deployment script:

```shell Shell
brownie run deploy.py --network chainstack-testnet
```

## Foundry

Install [Foundry](https://github.com/gakonst/foundry/). Use `--rpc-url` to run the operation through your Chainstack node.

### Forge

Use [forge](https://github.com/gakonst/foundry/tree/master/forge) to develop, test, and deploy your smart contracts.

To deploy a contract:

```shell Shell
forge create CONTRACT_NAME --contracts CONTRACT_PATH --private-key YOUR_PRIVATE_KEY --rpc-url YOUR_CHAINSTACK_ENDPOINT
```

where

* CONTRACT\_NAME — name of the contract in the Solidity source code
* CONTRACT\_PATH — path to your smart contract
* YOUR\_PRIVATE\_KEY — the private key to your funded account that you will use to deploy the contract
* YOUR\_CHAINSTACK\_ENDPOINT — your node HTTPS endpoint protected either with the key or password

### Cast

Use [cast](https://github.com/gakonst/foundry/tree/master/cast) to interact with the network and the deployed contracts.
To get the latest block number: ```shell Shell cast block-number --rpc-url YOUR_CHAINSTACK_ENDPOINT ``` where YOUR\_CHAINSTACK\_ENDPOINT is your node HTTPS endpoint protected either with the key or password # Scroll: Deploy Uniswap V3 on Scroll Source: https://docs.chainstack.com/docs/scroll-tutorial-deploy-the-uniswap-v3-smart-contracts-on-scroll **TLDR:** * Shows how to deploy the entire Uniswap v3 suite onto the Scroll Sepolia Testnet in a few steps * Guides through setting up the v3 deploy repository, configuring a Scroll RPC endpoint, and specifying WETH and admin settings * Explains core Uniswap v3 contracts (Router, Factory, NFT positions, migrator, etc.) and how they compose the AMM * Emphasizes the Scroll environment’s hierarchical ZKP optimizations, making this deployment straightforward and efficient ## What is Scroll? [Scroll](https://scroll.io/) network is a new scaling solution for Ethereum that uses zero-knowledge proofs to make transactions fast, secure, and cheap. It's fully compatible with the EVM (Ethereum Virtual Machine), so developers can easily port their DApps to Scroll without any changes. Chainstack recently announced support for Scroll’s Sepolia Testnet, which makes it even easier for developers to get started. In this tutorial, you’ll learn how to deploy the full Uniswap v3 AMM suite on Scroll in just a few easy steps. ## Why use Scroll? [Scroll's hierarchical ZKP](https://docs.scroll.io/en/technology/) system is optimized for both provers and verifiers, allowing it to achieve faster transaction finality and lower gas fees than other layer-2 scaling solutions. The first layer of Scroll's ZKP system is optimized for provers, using custom circuit optimization and hardware-friendly proving algorithms. This allows provers to generate proofs much faster, reducing the overall transaction latency. The second layer of Scroll's ZKP system is optimized for verifiers, using succinct proofs and verification algorithms compatible with the EVM. 
This allows verifiers to verify proofs much faster, reducing the overall transaction cost. As a result of its hierarchical ZKP system, Scroll is able to achieve much faster transaction finality and lower gas fees than other layer-2 scaling solutions. This makes it a great choice for applications that require high throughput and low latency, such as DeFi and NFT trading. ## Prerequisites * [Chainstack account](https://console.chainstack.com/) to deploy a Scroll Sepolia node * [node.js](https://nodejs.org/en/) as the JavaScript framework * The [yarn](https://yarnpkg.com/) package manager * TypeScript V4.2.3 ## Overview This tutorial shows you how to deploy the Uniswap V3 smart contracts on the Scroll Sepolia Testnet. To get from zero to a deployed instance of Uniswap V3, do the following: With Chainstack, create a public chain project. With Chainstack, join the Scroll Sepolia Testnet. With Chainstack, access your Scroll Sepolia node endpoint. Fund your account through the Chainstack Faucet. Deploy the Uniswap v3 suite. ## Step-by-step ### Create a public chain project See [Create a project](/docs/manage-your-project#create-a-project). ### Join the Scroll Sepolia Testnet See [Join a public network](/docs/manage-your-networks#join-a-public-network). ### Get your Scroll Sepolia node endpoint See [View node access and credentials](/docs/manage-your-node#view-node-access-and-credentials). ### Fund your wallet Before diving into the project, top up your wallet with testnet Scroll ETH. The best option is to get some Sepolia ETH from the Chainstack Faucet and then bridge them to Scroll Sepolia: 1. Get Sepolia ETH from the [Chainstack Sepolia Faucet](https://faucet.chainstack.com). 2. Bridge to Scroll Sepolia using the [Scroll Bridge](https://sepolia--scroll-io.netlify.app/bridge). To deploy the Uniswap v3 Suite you will need about 0.5 ETH in your wallet with a gas price of 11 gwei. Make sure you have enough funds in your wallet before you proceed. 
### Deploy the Uniswap v3 suite In this step, we will deploy the full Uniswap v3 AMM suite on Scroll network. And we will leverage the [v3 deploy](https://github.com/Uniswap/deploy-v3) official repository to automate the process. The repository contains a CLI script for deploying the latest Uniswap v3 smart contracts to any EVM-compatible network. This repository collects almost all of the contracts that make up Uniswap v3 and provides scripts to deploy them easily in a single command. Uniswap is one of the most popular decentralized finance (DeFi) solutions in the Web3 space. Uniswap v3 is the third major version of Uniswap, and it allows users to swap ERC-20 tokens without a centralized intermediary. Since its launch in May 2021, Uniswap v3 has quickly become the most popular [DeX](/docs/web3-language-and-acronyms#dex) in the world, with over \$100 billion in [total value locked](/docs/smart-contracts-glossary#total-value-locked). #### Set up the CLI Clone the repository: ```shell Shell git clone https://github.com/Uniswap/deploy-v3.git ``` Make sure you install TypeScript V4.2.3: ```shell Shell yarn add typescript@4.2.3 --dev ``` Move into the repository's directory: ```shell Shell cd deploy-v3 ``` Install the dependencies and build: ```shell Shell yarn && yarn build ``` The console should have a similar output: ``` $ ncc build index.ts -o dist -m ncc: Version 0.36.1 ncc: Compiling file index.js into CJS ncc: Using typescript@4.2.3 (local user-provided) 1480kB dist/index.js 1480kB [3043ms] - ncc 0.36.1 $ cat shebang.txt dist/index.js > dist/index.cmd.js && mv dist/index.cmd.js dist/index.js ✨ Done in 3.76s. 
``` #### Deploy the smart contracts For the next step, you will need the following: * The private key of the account deploying the smart contracts, starting with `0x` * The Chainstack Scroll Sepolia HTTPS endpoint * The address of the account that will manage the smart contracts In the console, run the following command, adding your private key, endpoint, and address: ```shell Shell yarn start \ -pk YOUR_PRIVATE_KEY \ -j YOUR_CHAINSTACK_ENDPOINT \ -w9 0xa1EA0B2354F5A344110af2b6AD68e75545009a03 \ -ncl ETH -o YOUR_ADDRESS ``` Here is a list of what the flags mean: * `-pk` — your deployment account's private key. * `-j` — the JSON-RPC endpoint. * `-w9` — the address of the WETH9 contract. On the Scroll Testnet, this contract is deployed at `0xa1EA0B2354F5A344110af2b6AD68e75545009a03`. * `-ncl` — the native currency label. This is the symbol for the native token, which is `ETH`. * `-o` — the owner's address of the deployed administrator contract. Below is the full list of options: ```shell Shell > npx @uniswap/deploy-v3 --help Usage: npx @uniswap/deploy-v3 [options] Options: -pk, --private-key Private key used to deploy all contracts -j, --json-rpc JSON RPC URL where the program should be deployed -w9, --weth9-address
Address of the WETH9 contract on this chain -ncl, --native-currency-label Native currency label, e.g. ETH -o, --owner-address
Contract address that will own the deployed artifacts after the script runs -s, --state Path to the JSON file containing the migrations state (optional) (default: "./state.json") -v2, --v2-core-factory-address
The V2 core factory address used in the swap router (optional) -g, --gas-price The gas price to pay in GWEI for each transaction (optional) -c, --confirmations How many confirmations to wait for after each transaction (optional) (default: "2") -V, --version output the version number -h, --help display help for command ``` Running this command will deploy the contracts one by one; the console will display each deployment like the following: ``` Step 1 complete [ { message: 'Contract UniswapV3Factory deployed', address: '0x0287f57A1a17a725428689dfD9E65ECA01d82510', hash: '0x9832d37fc03460326a4a17a77bf81ad0f7c4e75a4a5d495230bc49cb93f7d1d6' } ] Step 2 complete [ { message: 'UniswapV3Factory added a new fee tier 1 bps with tick spacing 1', hash: '0x8f9e1cf9fc09efbcad4e56885e041d1bbe9bae53432f798d53122fc3ceb9477b' } ] . . . ``` You can also find the details of the contracts in the `state.json` file. Congratulations, you deployed all of the Uniswap v3 smart contracts on the Scroll Sepolia Testnet. Use the [block explorer](https://sepolia-blockscout.scroll.io/) to see the transactions. ## Understanding the smart contracts ### `UniswapV3Factory` [The UniswapV3Factory contract](https://docs.uniswap.org/protocol/reference/core/UniswapV3Factory) is the central hub for all Uniswap v3 pools. It allows users to create new pools, query information about existing pools, and add or remove liquidity from pools. The UniswapV3Factory contract is also responsible for managing the protocol fees for all Uniswap v3 pools. ### `UniswapInterfaceMulticall` [UniswapInterfaceMulticall](https://docs.uniswap.org/contracts/v3/reference/periphery/base/Multicall) is a helper contract allowing users to group multiple function calls into one call. This can be useful for reducing the number of transactions that need to be sent to the blockchain and ensuring that all calls are executed in the same block. 
### `ProxyAdmin` and `TransparentUpgradeableProxy` A proxy gateway contract is a smart contract that allows for the easy upgrade of other smart contracts. This is essential for any project that wants to evolve over time without having to fork the blockchain. ### `TickLens` The [TickLens](https://docs.uniswap.org/protocol/reference/periphery/lens/TickLens) contract is a peripheral contract in the Uniswap v3 protocol. It provides functions for querying information about the ticks in a Uniswap v3 pool. The `TickLens` contract can be used to: * Find out how many ticks are in a pool. * Determine which ticks are initialized. * Identify which ticks have liquidity. ### `NonfungibleTokenPositionDescriptor` and `NonfungiblePositionManager` The [NonfungibleTokenPositionDescriptor](https://docs.uniswap.org/contracts/v3/reference/periphery/NonfungibleTokenPositionDescriptor) contract stores the ID of NFTs and their respective `NonFungiblePositionManger` contract address. The [NonfungiblePositionManager](https://docs.uniswap.org/contracts/v3/reference/periphery/NonfungiblePositionManager) manages the positions and liquidity for ERC-721 non-fungible tokens. ### `V3Migrator` The [V3Migrator](https://docs.uniswap.org/contracts/v3/reference/periphery/V3Migrator) contract is a tool that helps developers migrate their Uniswap v2 pools to Uniswap v3. ### `UniswapV3Staker` The [UniswapV3Staker](https://docs.uniswap.org/protocol/reference/periphery/staker/UniswapV3Staker) contract is a peripheral contract in the Uniswap v3 protocol. It allows users to stake their ERC-20 tokens in Uniswap v3 pools in order to earn rewards. The rewards are paid out as the pool's native token. The `UniswapV3Staker` contract is critical to the Uniswap v3 protocol because it incentivizes liquidity provision. By allowing users to earn rewards for staking their tokens, the `UniswapV3Staker` contract helps to ensure that there is always enough liquidity in Uniswap v3 pools to facilitate swaps. 
This makes the protocol more decentralized and efficient. ### `QuoterV2` and `SwapRouter02` The `Quoter` and `SwapRouter` contracts are essential for users who want to exchange tokens on Uniswap. [QuoterV2](https://docs.uniswap.org/protocol/reference/periphery/lens/QuoterV2) is a peripheral contract in the Uniswap v3 protocol. It allows users to get price quotes for swaps without executing the swap. `QuoterV2` uses the Uniswap v3 core, periphery contracts, and the `PoolTicksCounter` library. It includes functions to quote exact input or output amounts and handles reverts by counting initialized ticks crossed. [SwapRouter02](https://docs.uniswap.org/contracts/v3/reference/periphery/SwapRouter) executes swaps across different asset pools. It is a stateless contract, meaning it holds no token balances. This makes it more efficient and secure than other types of contracts. The `SwapRouter` contract considers the price of tokens, the liquidity in different pools, and the fees for swaps to calculate the best price for a swap. It then executes the swap, transferring tokens between pools and updating users' balances. ## Conclusion In conclusion, deploying Uniswap v3 on Scroll is a relatively straightforward process. Following the steps outlined in this tutorial, you can create a Uniswap v3 pool on Scroll in just a few minutes. Once your pool is deployed, you can start providing liquidity and earning fees. ### About the author Developer Advocate @ Chainstack BUIDLs on Ethereum, zkEVMs, The Graph protocol, and IPFS [](https://twitter.com/wuzhongzhu) [](https://www.linkedin.com/in/wuzhong-zhu-44563589/) [](https://github.com/wuzhong-zhu) # Statistics Source: https://docs.chainstack.com/docs/see-statistics Check the requests stats for your nodes and subgraphs on your [Statistics](https://console.chainstack.com/statistics) page. 
# Sending Trader Node Warp transactions with web3.js, ethers.js, web3.py, and ethClient.go

Source: https://docs.chainstack.com/docs/sending-warp-transaction-with-web3js-ethersjs-web3py-and-ethclientgo

**TLDR**

* Chainstack Trader nodes use a bloXroute-powered BDN behind the scenes to send your transactions to validators at high speed – a Warp transaction.
* Improves transaction propagation vs. normal node endpoints, so your transaction is seen sooner and has better chances in tight mempool conditions.
* No extra sign-up is needed for bloXroute; you keep using standard Ethereum JSON-RPC calls.
* Examples are given for web3.js, ethers.js, web3.py, and Go’s ethclient.

## Main article

### EVM transactions

This guide focuses on the EVM [Trader nodes](/docs/warp-transactions): Ethereum, BNB Smart Chain, Polygon.

## Introduction

A [Warp](/docs/warp-transactions) transaction is an integration of Chainstack and the high-speed [bloXroute](https://bloxroute.com/) network. Warp transactions are designed to reduce transaction times and ensure efficient delivery to their destination.

Simply put, when a transaction is sent through a Trader node with a Warp transactions endpoint, it is placed on an "expressway" for blockchain transaction propagation. This means that compared to sending transactions through a standard EVM node, a Warp transaction can reach validators much more quickly, significantly reducing transaction finalization time.

This article describes how to implement Warp transactions with Chainstack and popular Web3 libraries.

## What is a Warp transaction, and what is bloXroute?

When a transaction is sent to a normal EVM node, it is propagated to all nodes within the blockchain network. It moves from node to node until a validator validates it or eventually reaches all nodes. Depending on network conditions, this process usually takes a few hundred milliseconds to a few seconds.
Check out how to test transaction propagation by reading [Monitoring transaction propagation from node to mempool in EVM networks with Python](/docs/monitoring-transaction-propagation-from-node-to-mempool-in-evm-networks-with-python).

A bloXroute distribution network (BDN), described in more detail in the [BDN Architecture](https://docs.bloxroute.com/bdn-architecture) documentation, is a special network with a direct connection to a set of validators. The core of the BDN is a high-capacity, low-latency global network of servers optimized to propagate transactions and blocks quickly. In general, a transaction sent to the BDN will be delivered to a validator in an expedited manner.

According to their [website](https://bloxroute.com/why-bloxroute/), bloXroute's BDN provides several benefits, including:

* Faster transaction propagation: bloXroute's BDN can propagate transactions up to 400-1,000 ms faster than the traditional peer-to-peer (P2P) network. This can be a significant advantage for applications that require real-time transaction processing, such as trading and DeFi.
* Improved network stability: bloXroute's BDN is more stable than the P2P network. This is because the servers are connected using a dedicated network, reducing the risk of congestion and packet loss.
* Increased scalability: bloXroute's BDN is designed to scale to meet the needs of future blockchain applications. The network is dynamic and can be easily scaled up or down as needed.

### What about Trader nodes with Warp transactions?

A Trader node with Warp transactions enabled is a hybrid of a normal EVM node and a BDN, combining the best of both worlds. In most circumstances, it behaves just like a normal EVM node. However, when it receives a transaction from the user, it immediately forwards the transaction to the closest [bloXroute cloud API](https://docs.bloxroute.com/introduction/cloud-api-ips), which serves as the gateway to the BDN.
The transaction does not follow the normal propagation process.

There are a few benefits of using Trader nodes with Warp transactions:

* Users can use a Trader node with Warp transactions to avoid the need for separate signup for the [bloXroute service](https://bloxroute.com/pricing/). This is particularly convenient for users who send a low number of transactions each month.
* bloXroute uses a special set of APIs for authentication and sending transactions, which may not be compatible with the customer's existing implementation. In contrast, a Chainstack Trader node with Warp transactions leverages the standard Ethereum JSON-RPC API, making it compatible with most mainstream tools and libraries, such as MetaMask, web3.py, and web3.js.
* When monitoring transactions within a network, using a Trader node with Warp transactions may be a better option since it is directly connected to other peers in the network.

## How to access a Trader node with Warp transactions on Chainstack

The Trader node with Warp transactions feature is available on the [paid plans](https://chainstack.com/pricing/) for the following protocols:

* Ethereum
* Solana
* BNB Smart Chain

To deploy a Trader node, select **Trader node** > **Warp transactions** in the deployment wizard. This endpoint is the gateway to your application.

## How to execute a Warp transaction

A Warp transaction is a type of transaction that is dispatched to your node using the [eth\_sendRawTransaction](/reference/ethereum_node_api/execute_transactions/eth_sendRawTransaction) RPC method. This method is a standard Ethereum procedure for transmitting transactions.

In this section, we will dive deeper into the process of sending Warp transactions using some of the most widely used Web3 libraries. We will provide examples for web3.js, ethers.js, web3.py, and ethClient.go.

### Executing a Warp transaction with web3.js

web3.js is a library that allows you to interact with a local or remote Ethereum node using HTTP, IPC, or WebSocket.
Here's how you can send a Warp transaction using web3.js:

```javascript Javascript
const Web3 = require("web3");

const url = "YOUR_CHAINSTACK_TRADER_NODE_ENDPOINT";
const publicKey = "YOUR_ADDRESS";
const privateKey = "YOUR_PRIVATE_KEY";

const web3 = new Web3(url);

async function main() {
  const transaction = {
    from: publicKey,
    to: publicKey,
    value: 100,
    gas: 30000,
    gasPrice: 1000000000,
  };

  const signedTransaction = await web3.eth.accounts.signTransaction(transaction, privateKey);
  // sendSignedTransaction resolves to the transaction receipt once the transaction is mined
  const transactionReceipt = await web3.eth.sendSignedTransaction(signedTransaction.rawTransaction);
  console.log(transactionReceipt);
}

main();
```

## Executing a Warp transaction with ethers.js

ethers.js is a complete Ethereum library and wallet implementation prioritizing compactness and simplicity.

Here's how you can send a Warp transaction using ethers.js:

```javascript Javascript
const { ethers } = require("ethers");

const url = "YOUR_CHAINSTACK_TRADER_NODE_ENDPOINT";
const publicKey = "YOUR_ADDRESS";
const privateKey = "YOUR_PRIVATE_KEY";

async function main() {
  const provider = new ethers.JsonRpcProvider(url);

  const balance = await provider.getBalance(publicKey);
  console.log("balance is " + balance);

  const nonce = await provider.getTransactionCount(publicKey);
  console.log("nonce is " + nonce);

  const wallet = new ethers.Wallet(privateKey, provider);

  const transaction = {
    to: publicKey,
    value: 100,
    maxFeePerGas: 6000000000000000,
  };

  console.log("sending tx");
  wallet.sendTransaction(transaction)
    .then((txObj) => {
      console.log('txHash', txObj.hash);
    });
}

main();
```

## Executing a Warp transaction with web3.py

web3.py is a Python library for interacting with Ethereum. It's commonly used for scripting, back-end development, and server-side tasks.
Here's how you can send a Warp transaction using web3.py:

```python Python
from web3 import Web3

url = "YOUR_CHAINSTACK_TRADER_NODE_ENDPOINT"
public_key = "YOUR_ADDRESS"
private_key = "YOUR_PRIVATE_KEY"

w3 = Web3(Web3.HTTPProvider(url))

transaction = {
    "from": public_key,
    "to": public_key,
    "value": 100,
    "gas": 300000,
    "gasPrice": 1000000000,
    "nonce": w3.eth.get_transaction_count(public_key),
}

signed_transaction = w3.eth.account.sign_transaction(transaction, private_key)
transaction_hash = w3.eth.send_raw_transaction(signed_transaction.rawTransaction)
transaction_receipt = w3.eth.wait_for_transaction_receipt(transaction_hash)
print(transaction_receipt)
```

## Executing a Warp transaction with ethClient.go

`ethclient` is the Go client package from go-ethereum that allows you to interact with the Ethereum blockchain.

Here's how you can send a Warp transaction using ethclient:

```go go
package main

import (
	"context"
	"fmt"
	"math/big"

	"github.com/ethereum/go-ethereum/core/types"
	"github.com/ethereum/go-ethereum/crypto"
	"github.com/ethereum/go-ethereum/ethclient"
)

func main() {
	client, err := ethclient.Dial("YOUR_CHAINSTACK_TRADER_NODE_ENDPOINT")
	if err != nil {
		panic(err)
	}

	// Private key without the 0x prefix
	privateKey, err := crypto.HexToECDSA("YOUR_PRIVATE_KEY")
	if err != nil {
		panic(err)
	}
	fromAddress := crypto.PubkeyToAddress(privateKey.PublicKey)

	nonce, err := client.PendingNonceAt(context.Background(), fromAddress)
	if err != nil {
		panic(err)
	}

	chainID, err := client.NetworkID(context.Background())
	if err != nil {
		panic(err)
	}

	// Send a small amount of wei back to the sender's own address
	tx := types.NewTransaction(nonce, fromAddress, big.NewInt(1000000000000), 21000, big.NewInt(1000000000), nil)

	signedTx, err := types.SignTx(tx, types.NewEIP155Signer(chainID), privateKey)
	if err != nil {
		panic(err)
	}

	if err := client.SendTransaction(context.Background(), signedTx); err != nil {
		panic(err)
	}
	fmt.Println(signedTx.Hash().Hex())
}
```

## Conclusion

This article has provided an in-depth look at the integration between Chainstack and bloXroute, the mechanics of Warp transactions, and how to execute Warp transactions using the most popular Web3 libraries.

Feel free to ping me on Twitter or Telegram if you have any questions.
Developer Advocate @ Chainstack BUIDLs on Ethereum, zkEVMs, The Graph protocol, and IPFS [](https://twitter.com/wuzhongzhu) [](https://www.linkedin.com/in/wuzhong-zhu-44563589/) [](https://github.com/wuzhong-zhu) # Smart contracts Source: https://docs.chainstack.com/docs/smart-contracts-glossary A smart contract is a program that runs on a blockchain, such as Ethereum, Polygon, Arbitrum, BNB Chain, Solana, etc. It is deployed at a specific address on the blockchain. Smart contracts enable self-executing agreements with predefined rules and conditions, eliminating the need for intermediaries. ## Keccak256 Keccak256 is a cryptographic hash function that generates a unique, fixed-size string of bytes for each unique input it receives. This feature makes it useful for ensuring data integrity, as any change in the input data leads to a different hash output. It's virtually impossible to derive the original input from the hash output, making it a one-way function. In Ethereum: 1. Keccak256 is used to generate Ethereum addresses from public keys. 2. It's used to verify the integrity of transactions. 3. It's used for efficient and secure data storage and retrieval in Merkle Trees. 4. It's used to identify functions in Ethereum's contract ABI. Keccak256 is crucial for the security and functionality of Ethereum and similar blockchain systems. ## Account An account in the context of Ethereum refers to an entity with an ether (ETH) balance that can send transactions on the Ethereum network. There are two types of accounts: externally owned accounts (EOAs), which are controlled by private keys, and contract accounts, which are deployed smart contracts. ## Address In the context of blockchain, an address refers to the hashed form of a public key that belongs to a specific user or contract. It is used to identify participants on the blockchain network. In Ethereum, addresses are typically represented as hexadecimal strings and are used to send and receive transactions. 
## Address checksumming A checksummed address is a standard Ethereum address with certain characters capitalized to include a checksum validation. Checksumming is a way of including error-detection codes in an Ethereum address. The goal of checksumming is to prevent errors when an address is typed manually. An Ethereum address is a 40 character hexadecimal string derived from the last 20 bytes of the Keccak256 hash of the public key. To create a checksummed address, the address (excluding the '0x' prefix) is hashed using Keccak256, which outputs a 64 character hexadecimal string. This string is used to determine the capitalization of each alphanumeric character in the original address: if the Nth digit of the Keccak hash is 8 or more, the Nth character in the original address is capitalized, if it is a letter. Here's a simplified explanation of how it works: 1. You have an original Ethereum address: `0xae2fc483527b8ef99eb5d9b44875f005ba1fae13` (all lowercase). 2. You hash the lowercase hexadecimal address using Keccak256, which produces a new hash. 3. You go through each character of the original address: * If the corresponding digit in the hash is 8 or greater, you capitalize that character. * If the corresponding digit in the hash is less than 8, the character remains lowercase. 4. The resulting Ethereum address with this specific capitalization is the checksummed address. The capitalization adds an extra layer of validation: if the address is mistyped, the chances are high that the capitalization will not match the correct one, and the address can be recognized as invalid. It's important to note that not all tools and services recognize checksummed addresses, and they may treat the addresses as case-insensitive. However, using checksummed addresses can significantly reduce the risk of funds being sent to incorrect addresses due to typing errors. 
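The capitalization rule above can be sketched in a few lines of JavaScript. Note that the keccak256 hash is supplied as a precomputed input, since JavaScript has no built-in keccak256 (in practice you would compute it with a library such as ethers.js), and the short address and hash values in the usage line are made up purely to illustrate the rule:

```javascript
// EIP-55-style capitalization rule: given the lowercase address (no 0x prefix)
// and the hex keccak256 hash of that lowercase address, capitalize every
// hex letter whose corresponding hash digit is 8 or more.
function applyChecksum(addressNo0x, keccakHex) {
  const lower = addressNo0x.toLowerCase();
  let out = "";
  for (let i = 0; i < lower.length; i++) {
    const isLetter = /[a-f]/.test(lower[i]);
    const nibble = parseInt(keccakHex[i], 16);
    out += isLetter && nibble >= 8 ? lower[i].toUpperCase() : lower[i];
  }
  return "0x" + out;
}

// Made-up 4-character example: only 'a' has a hash digit >= 8, so only it is capitalized
console.log(applyChecksum("ab1f", "90f2"));
```

For the real algorithm, hash the lowercase address (without `0x`) with keccak256 and pass the hex digest in as `keccakHex`.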
## Chainlink VRF Chainlink Verifiable Random Function (VRF) is a provably-fair and verifiable source of randomness designed for smart contracts. It's a solution to one of the major challenges in blockchain development: generating a secure and truly random number. Here's how it works: **Request**: A smart contract requests randomness from the Chainlink VRF. This request includes a seed value, which can be any number - often, the current block's hash or timestamp is used. **Generate**: Chainlink VRF generates a random number and a cryptographic proof. The random number is generated based on the seed and the private key of the oracle node. **Respond**: The random number and the proof are sent back to the smart contract. **Verify**: The smart contract verifies the proof. It checks that the Oracle node used its known public key and the seed from the original request to generate the random number. If the proof is valid, the random number is accepted. This process ensures that the random number is truly random and that the oracle node cannot tamper with the result. It provides a reliable way for smart contracts to obtain secure and provably fair random numbers. Chainlink VRF is often used in blockchain games, gambling apps, and anywhere else a random input is needed in a smart contract. ## Token A token represents an asset or value on a blockchain. Tokens can be fungible or non-fungible. Fungible tokens, such as ERC-20 tokens, are identical and mutually interchangeable, while non-fungible tokens (NFTs), such as ERC-721 tokens, are unique and have distinct characteristics. ## ERC-20 ERC-20 is a widely adopted standard for creating and issuing tokens on the Ethereum blockchain. It defines a set of rules and functions that allow for the implementation of fungible tokens. Many popular cryptocurrencies, such as USDT, Apecoin, Dogecoin, and Shiba Inu are ERC-20 tokens. 
## ERC-721 ERC-721 is a standard for creating and managing non-fungible tokens (NFTs) on the Ethereum blockchain. Unlike ERC-20 tokens, each ERC-721 token is unique and has its own distinct properties and ownership. ERC-721 tokens are commonly used to represent digital collectibles, in-game assets, or unique items. ## Renounced ownership of smart contracts Renouncing ownership of a smart contract means the original developer or owner relinquishes control over the contract and the associated assets. This is often done to enhance trust and decentralization within the Web3 ecosystem. When ownership is renounced, the contract becomes immutable, and nobody can change how it operates. ## Smart contract ABI ABI stands for Application Binary Interface and is a standardized way to interact with smart contracts on the Ethereum network. It defines the functions and data structures of a smart contract in a format that can be understood and used by external entities. The ABI specifies the encoding and decoding rules for data exchange between contracts and external applications. ## Smart contract audit A smart contract audit is a comprehensive review and analysis of a protocol's smart contract code to identify potential security vulnerabilities or flaws. The audit aims to ensure that the smart contract functions as intended, without any loopholes or weaknesses that could compromise the security or user funds. Audits are typically conducted by specialized firms or security experts. ## Solidity Solidity is the most popular programming language used by Web3 developers to write smart contracts for the Ethereum blockchain. It is a statically-typed, high-level language specifically designed for creating smart contracts. Solidity enables developers to define the behavior and logic of contracts and is supported by a range of development tools and frameworks. 
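To make the ABI format described above concrete, here is a hypothetical single entry of the kind the Solidity compiler emits for an ERC-20-style `transfer` function. A real contract's ABI is an array of such entries generated at compile time, so treat the exact values here as illustrative:

```javascript
// Hypothetical ABI entry for a transfer(address,uint256) function.
// A full contract ABI is an array of entries like this one, plus
// entries for events, the constructor, and any custom errors.
const transferAbi = {
  name: "transfer",
  type: "function",
  stateMutability: "nonpayable",
  inputs: [
    { name: "to", type: "address" },
    { name: "amount", type: "uint256" },
  ],
  outputs: [{ name: "", type: "bool" }],
};

// Libraries such as ethers.js read the input types to encode a call
// like contract.transfer(to, amount) into the transaction data payload.
console.log(transferAbi.inputs.map((i) => i.type).join(","));
```

The function's canonical signature, `transfer(address,uint256)`, is what gets hashed with Keccak256 to derive the 4-byte function selector placed at the start of the call data.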
## Total value locked

Total value locked (TVL) is a metric used to assess the total value of assets locked or invested in a specific decentralized finance (DeFi) platform or smart contract. It represents the sum of all the assets (tokens or cryptocurrencies) users have deposited or staked within the protocol. TVL is often used as an indicator of a protocol's popularity and the amount of trust users place in it.

## Transactions

Transactions in the context of blockchain are cryptographically signed instructions initiated by accounts. A transaction represents a state-changing operation on the Ethereum network, such as transferring ETH from one account to another, invoking a smart contract function, or deploying a new smart contract. Transactions contain relevant information, such as sender and recipient addresses, transaction value, gas price, and data payload.

# Solana Agave 2.0 upgrade reference

Source: https://docs.chainstack.com/docs/solana-agave-20-upgrade-reference

A quick reference on the Solana node client transition to Agave and the deprecated methods

**TLDR**

* Agave replaces the original Solana node client and deprecates several outdated calls by October 21, 2024.
* Most newly recommended methods (e.g., `getSignatureStatuses`) are already standard, so your existing workflows likely remain unaffected.
* The code examples illustrate how to switch from `confirmTransaction` to `getSignatureStatuses` in both JavaScript and Python.

## In brief

[Agave](https://github.com/anza-xyz/agave) is the Rust-based Solana node client that is also a fork of the original Solana Labs client. The primary purpose of Agave replacing the original Solana Labs client is to make the network multi-client. The multi-client part mostly means the introduction of another node client — the C-based Firedancer.

Nodes are expected to upgrade to Agave v2.0 by **October 21, 2024**. For you as a developer, this means there are a few methods that will be deprecated with other methods replacing them.
The methods being deprecated have been outdated for quite some time, and all of the replacement methods (except for [isBlockhashValid](/reference/solana_node_api/isBlockhashValid)) aren't new, so chances are you won't be significantly affected.

## Methods: deprecated and replaced

| Deprecated call | Replacement call |
| --- | --- |
| confirmTransaction | [getSignatureStatuses](/reference/solana_node_api/getSignatureStatuses) |
| getSignatureStatus | [getSignatureStatuses](/reference/solana_node_api/getSignatureStatuses) |
| getSignatureConfirmation | [getSignatureStatuses](/reference/solana_node_api/getSignatureStatuses) |
| getConfirmedSignaturesForAddress | [getSignaturesForAddress](/reference/solana_node_api/getSignaturesForAddress) |
| getConfirmedBlock | [getBlock](/reference/solana_node_api/getBlocks) |
| getConfirmedBlocks | [getBlocks](/reference/solana_node_api/getBlocks) |
| getConfirmedBlocksWithLimit | [getBlocksWithLimit](/reference/solana_node_api/getBlocksWithLimit) |
| getConfirmedTransaction | [getTransaction](/reference/solana_node_api/getTransaction) |
| getConfirmedSignaturesForAddress2 | [getSignaturesForAddress](/reference/solana_node_api/getSignaturesForAddress) |
| getRecentBlockhash | [getLatestBlockhash](/reference/solana_node_api/getLatestBlockhash) |
| getFees | [getFeeForMessage](/reference/solana_node_api/getFeeForMessage) |
| getFeeCalculatorForBlockhash | [isBlockhashValid](/reference/solana_node_api/isBlockhashValid) or [getFeeForMessage](/reference/solana_node_api/getFeeForMessage) |
| getFeeRateGovernor | [getFeeForMessage](/reference/solana_node_api/getFeeForMessage) |
| getSnapshotSlot | [getHighestSnapshotSlot](/reference/solana_node_api/getHighestSnapshotSlot) |
| getStakeActivation | [getAccountInfo](/reference/solana_node_api/getAccountInfo) ([alternative approach](https://solana.stackexchange.com/questions/15710/the-alternative-method-to-get-the-stake-account-status-since-getstakeactivation)) |

## Replacing `confirmTransaction` with `getSignatureStatuses`

```javascript @solana/web3.js
import { Connection, Transaction, SendOptions } from '@solana/web3.js';

async function sendAndConfirmTransaction(
  connection: Connection,
  transaction: Transaction,
  timeout: number = 30000, // 30 seconds default timeout
  options?: SendOptions
): Promise<string> {
  // Send the transaction
  const signature = await connection.sendTransaction(
    transaction,
    options?.signers || [],
    { ...options, skipPreflight: options?.skipPreflight || false }
  );

  // Create a promise that will reject after the timeout
  const timeoutPromise = new Promise<string>((_, reject) => {
    setTimeout(() => {
      reject(new Error(`Transaction confirmation timeout after ${timeout}ms`));
    }, timeout);
  });

  // Create the confirmation promise
  const confirmationPromise = (async () => {
    let done = false;
    while (!done) {
      // Get the status of the transaction
      const response = await connection.getSignatureStatuses([signature]);
      const status = response.value[0];
      if (status) {
        if (status.err) {
          throw new Error(`Transaction failed: ${status.err.toString()}`);
        }
        // Check if we have enough confirmations
        if (status.confirmationStatus === 'finalized') {
          done = true;
          return signature;
        }
      }
      // Wait a bit before checking again
      await new Promise(resolve => setTimeout(resolve, 1000));
    }
  })();

  // Race between timeout and confirmation
  try {
    return await Promise.race([confirmationPromise, timeoutPromise]);
  } catch (error) {
    // If it's a timeout error, we should still return the signature
    if (error.message.includes('timeout')) {
      return signature;
    }
    throw error;
  }
}

// Example usage
async function example() {
  const connection = new
    Connection('CHAINSTACK_NODE');
  const transaction = new Transaction();
  // ... add your transaction instructions here ...

  try {
    const signature = await sendAndConfirmTransaction(connection, transaction, 60000, {
      skipPreflight: false,
      // ... other options
    });
    console.log('Transaction confirmed:', signature);
  } catch (error) {
    console.error('Transaction failed:', error);
  }
}
```

```python solana.py
from solana.rpc.api import Client
from solana.transaction import Transaction
from solana.rpc.commitment import Commitment
from typing import Optional, List, Union
import time
import asyncio
from concurrent.futures import TimeoutError


class TransactionConfirmationError(Exception):
    """Custom exception for transaction confirmation errors"""
    pass


async def send_and_confirm_transaction(
    client: Client,
    transaction: Transaction,
    signers: List,
    timeout: int = 30,  # timeout in seconds
    commitment: Union[str, Commitment] = "finalized"
) -> str:
    """
    Send and confirm a transaction using getSignatureStatuses

    Args:
        client: Solana client instance
        transaction: Transaction to send
        signers: List of signers for the transaction
        timeout: Maximum time to wait for confirmation in seconds
        commitment: Commitment level to use

    Returns:
        Transaction signature

    Raises:
        TransactionConfirmationError: If transaction fails or times out
    """
    # Send the transaction
    try:
        response = client.send_transaction(
            transaction,
            *signers,
            opts={
                "skip_preflight": False,
                "preflight_commitment": commitment
            }
        )
        signature = response["result"]
    except Exception as e:
        raise TransactionConfirmationError(f"Failed to send transaction: {str(e)}")

    # Start time for timeout tracking
    start_time = time.time()

    async def confirm_transaction() -> str:
        while True:
            try:
                # Check if we've exceeded timeout
                if time.time() - start_time > timeout:
                    raise TimeoutError(f"Transaction confirmation timeout after {timeout} seconds")

                # Get transaction status
                response = client.get_signature_statuses([signature])
                if response["result"]["value"][0] is None:
                    # Transaction not yet processed, wait and retry
                    await asyncio.sleep(1)
                    continue

                status = response["result"]["value"][0]

                # Check if transaction failed
                if status.get("err"):
                    raise TransactionConfirmationError(
                        f"Transaction failed: {status['err']}"
                    )

                # Check confirmation status
                conf_status = status.get("confirmationStatus")
                if conf_status == "finalized":
                    return signature

                # Wait before checking again
                await asyncio.sleep(1)
            except TimeoutError:
                # Return signature even on timeout
                return signature
            except Exception as e:
                if not isinstance(e, TimeoutError):
                    raise TransactionConfirmationError(f"Confirmation failed: {str(e)}")

    try:
        return await confirm_transaction()
    except Exception as e:
        raise TransactionConfirmationError(str(e))


# Example usage
async def example():
    # Initialize client
    client = Client("CHAINSTACK_NODE")

    # Create and populate your transaction
    transaction = Transaction()
    # ... add your transaction instructions here ...

    try:
        signature = await send_and_confirm_transaction(
            client=client,
            transaction=transaction,
            signers=[],  # Add your signers here
            timeout=60,  # 60 seconds timeout
            commitment="finalized"
        )
        print(f"Transaction confirmed: {signature}")
    except TransactionConfirmationError as e:
        print(f"Transaction failed: {str(e)}")


# Run the example
if __name__ == "__main__":
    asyncio.run(example())
```

# Solana: Analyzing adjacent transactions for priority fees

Source: https://docs.chainstack.com/docs/solana-analyzing-adjacent-transactions-for-priority-fees

**TLDR**

* This Python script inspects the fee usage in blocks adjacent to your own target transaction on Solana, focusing on the same on-chain program.
* It decodes compute budget instructions (priority fees) to show both your transaction's fee and those of other transactions in nearby blocks.
* By comparing priority fees, you can tune your own transaction fees to optimize speed or cost, and troubleshoot performance.
* Useful for pump.fun or any scenario where outbidding other participants at the block level is crucial.

## Main article

This guide shows you how to analyze transaction priority fees for transactions adjacent to your target transaction on Solana. This is particularly useful when developing trading strategies or troubleshooting transaction execution on programs like pump.fun. The script in this tutorial looks for transactions adjacent to the one you specify (usually your own), checks all other transactions involving the same program, and prints the priority fees in the same block and the adjacent blocks, so you can easily see how your particular transaction stacks up and how you can tune it to win more races.

## Prerequisites

### Get your own node endpoint today

[Start for free](https://console.chainstack.com/) and get your app to production levels immediately. No credit card required. You can sign up with your GitHub, X, Google, or Microsoft account.

* Python 3.7+
* Packages: `pip install solana base58`

## Understanding Solana fees

The recommended reading list:

* [How to use Priority Fees to unlock faster transactions](/docs/solana-how-to-priority-fees-faster-transactions)
* [Estimate Priority Fees with getRecentPrioritizationFees](/docs/solana-estimate-priority-fees-getrecentprioritizationfees)
* [Solana Trader nodes](/docs/solana-trader-nodes)
* Or the official Solana docs: [Fees on Solana](https://solana.com/docs/core/fees).

## Implementation

If you end up in the specific scenario outlined at the beginning of the article, here's what you need to get the results:

* A program address that you are interacting with, to get the races simply won or tactically won (like trying to get your transaction executed at a certain place in the race instead of trying to be first), or simply to see how much of an edge you need to be first.
You can also estimate this with [How to use Priority Fees to unlock faster transactions](/docs/solana-how-to-priority-fees-faster-transactions) and [Estimate Priority Fees with getRecentPrioritizationFees](/docs/solana-estimate-priority-fees-getrecentprioritizationfees) as mentioned above.

* A transaction signature — this can be the signature of your own transaction, or of a transaction of an account that you get through your own [humint](https://en.wikipedia.org/wiki/Human_intelligence_\(intelligence_gathering\)) ways.

The script itself:

```python Python
from solana.rpc.api import Client
from solana.rpc.commitment import Commitment
from solana.transaction import Signature
import base58
import json
from typing import List, Optional, Tuple

PROGRAM_ID = "6EF8rrecthR5Dkzon8Nwu78hRvfCKubJ14M5uBEwF6P"
COMPUTE_BUDGET_ID = "ComputeBudget111111111111111111111111111111"


def get_transaction_slot(client: Client, signature_str: str) -> Optional[int]:
    try:
        signature_bytes = base58.b58decode(signature_str)
        signature = Signature(signature_bytes)
        response = client.get_transaction(
            signature,
            max_supported_transaction_version=0
        )
        if hasattr(response, 'value') and response.value:
            return response.value.slot
        return None
    except Exception as e:
        print(f"Error getting transaction slot: {e}")
        return None


def calculate_transaction_fees(tx) -> Tuple[float, float]:
    total_fee_lamports = tx.meta.fee
    total_fee_sol = total_fee_lamports / 1_000_000_000
    cu_price_micro_lamports = None
    cu_limit = None
    for ix in tx.transaction.message.instructions:
        program_id = tx.transaction.message.account_keys[ix.program_id_index]
        if str(program_id) == COMPUTE_BUDGET_ID:
            data = base58.b58decode(ix.data)
            # Discriminator 3: SetComputeUnitPrice, u64 micro-lamports per compute unit
            if len(data) >= 9 and data[0] == 3:
                cu_price_micro_lamports = int.from_bytes(data[1:9], 'little')
            # Discriminator 2: SetComputeUnitLimit, u32 compute units
            elif len(data) >= 5 and data[0] == 2:
                cu_limit = int.from_bytes(data[1:5], 'little')
    if cu_price_micro_lamports is not None and cu_limit is not None:
        priority_fee_lamports = (cu_price_micro_lamports * cu_limit) // 1_000_000
        priority_fee_sol = priority_fee_lamports / 1_000_000_000
    else:
        priority_fee_sol = 0
    return total_fee_sol, priority_fee_sol


def get_block_transactions(client: Client, slot: int, target_signature: str) -> List[dict]:
    try:
        response = client.get_block(slot, max_supported_transaction_version=0)
        if not response.value:
            return []
        transactions = []
        for tx in response.value.transactions:
            if any(str(account) == PROGRAM_ID for account in tx.transaction.message.account_keys):
                signature = str(tx.transaction.signatures[0])
                total_fee, priority_fee = calculate_transaction_fees(tx)
                writable_accounts = []
                for idx, account in enumerate(tx.transaction.message.account_keys):
                    if idx < tx.transaction.message.header.num_required_signatures:
                        writable_accounts.append(str(account))
                compute_units = tx.meta.compute_units_consumed if hasattr(tx.meta, 'compute_units_consumed') else None
                tx_info = {
                    'signature': signature,
                    'total_fee': total_fee,
                    'priority_fee': priority_fee,
                    'block': slot,
                    'writable_accounts': writable_accounts,
                    'compute_units': compute_units
                }
                if signature == target_signature:
                    transactions.insert(0, tx_info)
                else:
                    transactions.append(tx_info)
        return transactions
    except Exception as e:
        print(f"Error getting block transactions for slot {slot}: {e}")
        return []


def analyze_slots(rpc_url: str, signature: str, slots_back: int, slots_forward: int):
    client = Client(rpc_url)
    current_slot = get_transaction_slot(client, signature)
    if not current_slot:
        print("Could not find transaction slot")
        return
    print(f"\nAnalyzing slots {current_slot-slots_back} to {current_slot+slots_forward}")
    print("=" * 80)
    for slot in range(current_slot - slots_back, current_slot + slots_forward + 1):
        transactions = get_block_transactions(client, slot, signature)
        print(f"\nSlot {slot}:")
        print("-" * 40)
        if transactions:
            target_tx = None
            other_txs = []
            for tx in transactions:
                if tx['signature'] == signature:
                    target_tx = tx
                else:
                    other_txs.append(tx)
            other_txs.sort(key=lambda x: x['priority_fee'], reverse=True)
            if target_tx:
                print("\n" + "⭐️" * 40)
                print("🎯 TARGET TRANSACTION FOUND 🎯")
                print("⭐️" * 40)
                print(f"\nSignature: {target_tx['signature']}")
                print(f"Block: {target_tx['block']}")
                print(f"{'Total fee:':<15} {target_tx['total_fee']:>12.8f} SOL")
                print(f"{'Priority fee:':<15} {target_tx['priority_fee']:>12.8f} SOL")
                print(f"{'Base fee:':<15} {(target_tx['total_fee'] - target_tx['priority_fee']):>12.8f} SOL")
                print(f"Compute Units: {target_tx['compute_units']}")
                print("\nWritable Accounts:")
                for account in target_tx['writable_accounts']:
                    print(f"  {account}")
                print("⭐️" * 40 + "\n")
            if other_txs:
                print(f"Other transactions in slot {slot} (sorted by priority fee):")
                print("-" * 40)
                for tx in other_txs:
                    print(f"\nSignature: {tx['signature']}")
                    print(f"Block: {tx['block']}")
                    print(f"Fee: {tx['total_fee']:.8f} SOL (Priority: {tx['priority_fee']:.8f} SOL)")
                    print(f"Compute Units: {tx['compute_units']}")
                    print("Writable Accounts:")
                    for account in tx['writable_accounts']:
                        print(f"  {account}")
                    print("-" * 40)
        else:
            print("No relevant transactions")


def main():
    RPC_URL = "CHAINSTACK_NODE_RPC"
    SIGNATURE = "TRANSACTION_SIGNATURE"
    SLOTS_BACK = 1  # Number of slots to check back
    SLOTS_FORWARD = 1  # Number of slots to check forward
    analyze_slots(RPC_URL, SIGNATURE, SLOTS_BACK, SLOTS_FORWARD)


if __name__ == "__main__":
    main()
```

In the script:

* PROGRAM\_ID — set to the [pump.fun executable](https://solscan.io/account/6EF8rrecthR5Dkzon8Nwu78hRvfCKubJ14M5uBEwF6P). Set it to any executable on-chain program you interact with.
* COMPUTE\_BUDGET\_ID — set to the [Solana system program](https://solscan.io/account/ComputeBudget111111111111111111111111111111) responsible for compute units and fees. We use it in the script to find and decode the priority fees.
* RPC\_URL — your Chainstack [Solana node endpoint](https://console.chainstack.com/).
* SIGNATURE — the transaction signature (hash) around which you want to analyze similar transactions.
* SLOTS\_BACK — the number of slots to travel back from the slot where your transaction landed (was included in the block).
* SLOTS\_FORWARD — the number of slots to travel forward to see who used what fees after your transaction landed.

To use the script, just run `python check_adjacent.py` or whatever you name it.

Example output:

```shell Shell
Slot 309220412:
----------------------------------------

⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️
🎯 TARGET TRANSACTION FOUND 🎯
⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️

Signature: 3Bd4wJ3pZrPYbhnHpUbZFXfunD2A1dciNx8umhhTEg4KLYmP1nERNbKEbHP9j6K8CcJgdekRi3PhUqVjdsHcsMSA
Block: 309220412
Total fee:       0.00200500 SOL
Priority fee:    0.00200000 SOL
Base fee:        0.00000500 SOL
Compute Units: 57900

Writable Accounts:
  8SDfmRCMwn4qNvBpnvMTX7hunGHXWEJ5odhazNVcFyBw
⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️⭐️

Other transactions in slot 309220412 (sorted by priority fee):
----------------------------------------
```

## Use cases

This tool is particularly useful for:

* Analyzing competitor trading strategies
* Optimizing your own priority fees
* Troubleshooting failed transactions
* Understanding market competition in specific slots
* Developing automated trading systems

## Conclusion

Understanding transaction priority fees and their impact on execution is crucial for successful trading on Solana. This tool helps you analyze the competitive landscape and optimize your trading strategy by providing insights into how other traders are setting their priority fees.

For more Solana development tools and guides, see [Mastering Solana](/docs/solana-development).
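As a sanity check on figures like the ones above, the compute-budget decoding and fee arithmetic the script performs can be reproduced in isolation. The instruction values below are hypothetical, chosen only to make the math easy to follow:

```python
# Hypothetical SetComputeUnitPrice instruction data:
# discriminator byte 3 followed by a u64 little-endian price in micro-lamports per CU
cu_price_micro_lamports = 10_000
data = bytes([3]) + cu_price_micro_lamports.to_bytes(8, "little")

# Decode it the same way calculate_transaction_fees does
assert data[0] == 3
decoded_price = int.from_bytes(data[1:9], "little")

# Hypothetical compute unit limit from a SetComputeUnitLimit instruction
cu_limit = 200_000

# priority fee in lamports = price (micro-lamports/CU) * limit (CU) / 1e6
priority_fee_lamports = (decoded_price * cu_limit) // 1_000_000
priority_fee_sol = priority_fee_lamports / 1_000_000_000

print(priority_fee_lamports)            # prints 2000
print(f"{priority_fee_sol:.6f} SOL")    # prints 0.000002 SOL
```

If the priority fee the script reports for a competing transaction looks off, running the raw instruction bytes through this arithmetic by hand is a quick way to confirm the decoding.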
# Solana Archive nodes: The backbone of Solana’s data availability and developer tooling

Source: https://docs.chainstack.com/docs/solana-archive-nodes-the-backbone-of-solanas-data-availability-and-developer-tooling

For methods available specifically for Solana archive nodes, see [Solana archive methods availability](/docs/limits#solana-archive-methods-availability).

The Solana ecosystem is rich and diverse in projects, dapps, and developer tooling, and the need for robust and scalable data management solutions is as crucial as ever. Solana archive nodes, heavy as they are, play a pivotal role in providing the ecosystem with a decentralized, high-performance, and extensible infrastructure for storing and accessing historical blockchain data.

### Get your own Solana archive node today

[Start for free](https://console.chainstack.com/) and get your app to production levels immediately. No credit card required. You can sign up with your GitHub, X, Google, or Microsoft account.

## The importance of Solana archive nodes

Solana historical nodes are critical for the overall Web3 ecosystem. From an end-user perspective, these nodes serve as an immutable record of the Solana blockchain's history, allowing users to access and verify past transaction data, providing transparency and accountability. The Solana explorer, a web-based tool that allows users to browse the Solana blockchain, is powered by these historical nodes. From a developer's standpoint, Solana historical nodes are crucial for building robust and reliable applications on top of the Solana blockchain.
Developers can leverage the historical data stored in these nodes to implement advanced analytics and reporting features, facilitate data-driven decision making, ensure data integrity and auditability, and enable cross-chain interoperability.

This comprehensive data storage serves several critical functions:

1. Data availability: Solana archive nodes ensure that historical blockchain data is readily available and accessible, enabling users and applications to access past transaction records, account balances, and other on-chain information as needed. You can ingest the Solana archive data (including other protocols) with Chainstack nodes [tailored for the task](https://chainstack.com/raw-web3-data-for-your-warehouse/).
2. Performance: Solana archive nodes, similar to the regular full nodes, are designed to provide fast and efficient access to historical data.
3. End-user benefits: For end-users, Solana archive nodes enable access to historical data and transaction records, which can be useful for tasks such as auditing, research, and analytical purposes.
4. Developer benefits: Developers building applications on Solana can leverage the archive node system to access historical data and build more sophisticated features and functionality. This can include things like on-chain analytics, data visualization, and historical data-driven applications.

A few examples:

1. Explorers & analytics platforms — Companies like Solana FM, Solana Beach, and Solscan leverage Solana archive nodes to provide comprehensive transaction history, account balances, and advanced analytics for the Solana ecosystem.
2. DeFi apps — DeFi platforms that operate on the Solana blockchain, such as Serum, Raydium, and Saber, rely on archive nodes to retrieve historical trading data, calculate yields, and power their financial services.
3.
Non-Fungible Token (NFT) marketplaces: NFT platforms built on Solana, including Magic Eden and Solanart, use archive nodes to track the ownership history, trading volumes, and other metadata for NFTs on the Solana network.
4. Gaming and metaverse: Solana-based gaming and metaverse projects, like Aurory and Star Atlas, leverage archive nodes to store and retrieve player activity and use it for analytics.

## Key architectural components

The foundation of a Solana archive node is its custom-built ledger database designed for append-only writes of massive data volumes. Rather than using a traditional database like PostgreSQL, Solana leverages a highly optimized fork of Facebook's RocksDB to achieve the throughput and scalability needed.

Archive nodes store the full ledger history, which spans multiple petabytes and is constantly growing. To keep storage costs down, older data is progressively moved to cheaper "cold storage" tiers like Amazon S3 Glacier. Operators use sophisticated data tiering and recovery mechanisms to strike the right balance between cost and access latency.

On top of the ledger storage, archive nodes maintain several index structures mapping block heights and signatures to the physical storage location. This allows any historical transaction or block to be quickly looked up and retrieved by its signature or height.

With petabytes of data to store, archive nodes need robust redundancy and replication to protect against data loss. Ledger data is typically stored with 3x replication, spread across multiple disks, machines, and even geographic regions. Solana's [Turbine block propagation protocol](https://docs.solanalabs.com/consensus/turbine-block-propagation) allows streaming large amounts of data to replicas with minimal redundancy.

To allow scaling storage beyond a single machine, the archive is horizontally sharded across multiple nodes. Solana's architecture provides a framework for seamlessly sharding the state.
New nodes joining the cluster can quickly sync by downloading a snapshot of the full state at a given point.

## Retrieving historical data

Developers can access historical data through Solana's JSON RPC APIs exposed by archive nodes. Here's an example of fetching an old transaction using the JavaScript `@solana/web3.js` library:

```javascript Javascript
const solanaWeb3 = require("@solana/web3.js");

const connection = new solanaWeb3.Connection('CHAINSTACK_ARCHIVE_NODE');

(async () => {
  const oldTx = await connection.getConfirmedTransaction('txSignature');
  console.log(oldTx);
})();
```

The `getConfirmedTransaction` method allows specifying a transaction signature to look up. Archive nodes use their indexes to quickly locate and return the requested transaction, even if it occurred years ago.

You can also query the historical state of an account at a particular block height or timestamp:

```javascript Javascript
const accountInfo = await connection.getAccountInfoAndContext(
  'accountPubkey',
  'slotNumber',
);
console.log(accountInfo);
```

This will return the state and context of the account as it existed at the specified slot height. Under the hood, the archive node finds the nearest ledger checkpoint preceding the requested slot and replays transactions to derive the account state.

## Conclusion

Solana archive nodes are a critical component of the Solana blockchain infrastructure, ensuring the long-term scalability, integrity, and utility of the network. By providing secure, decentralized, and performant access to historical blockchain data, these nodes empower end-users and developers alike to unlock new insights, build innovative applications, and drive the continued growth of the Solana ecosystem. As Solana continues to gain traction and adoption, the importance of its archive node system will only increase.
Developers and users can rest assured that the Solana blockchain is backed by a robust and scalable data management infrastructure, ready to support the demands of the Web3 future.

# Solana: Creating a trading and sniping pump.fun bot

Source: https://docs.chainstack.com/docs/solana-creating-a-pumpfun-bot

A fully coded Python bot directly interacting with the pump.fun programs & accounts, not relying on any 3rd party APIs