Ethereum Quick Start - Uniswap (Complex)
Uniswap is one of the leading decentralised exchanges (DEXs) in web3 and relies on indexers to serve data to its UI so users can interact with it. By systematically organizing tokens, liquidity pools, transactions, and other essential information, indexers like SubQuery provide users with a quick and efficient means to search, find, and analyze data within Uniswap.
The objective of this article is to offer a detailed, step-by-step guide on setting up a SubQuery indexer for Uniswap v3 protocol. We will comprehensively cover the necessary configurations and delve into the intricacies of the underlying logic. It's an excellent example of how to do indexing for a complex DEX like Uniswap.
Setting Up the Indexer
In this Uniswap indexing project, our main focus is on configuring the indexer to exclusively capture logs generated by three specific types of Uniswap smart contracts:
- UniswapV3Factory (contract address: 0x1F98431c8aD98523631AE4a59f267346ea31F984): This contract is responsible for creating all the pool smart contracts.
- Smart Contracts for Individual Pools: These contracts represent the individual liquidity pools.
- NonfungiblePositionManager (contract address: 0xC36442b4a4522E871399CD717aBDD847Ab11FE88): This contract is instrumental in producing liquidity positions, which are implemented as NFTs. This functionality enables additional use cases, including the transfer of liquidity positions.
To gain a deeper understanding of how these core mechanisms work, you can refer to the official Uniswap documentation.
Important
We suggest starting with the Ethereum Gravatar example. The Ethereum Uniswap project is a lot more complicated and introduces some more advanced concepts.
In the earlier Quickstart section, you should have taken note of three crucial files. To initiate the setup of a project from scratch, you can follow the steps outlined in the initialisation description.
As a prerequisite, you will need to generate types from the ABI files of each smart contract. Additionally, you can kickstart your project by using the EVM Scaffolding approach (detailed here). You'll find all the relevant events to be scaffolded in the documentation for each type of smart contract.
Note
The code snippets provided below have been simplified for clarity. You can find the full and detailed code here to see all the intricate details.
UniswapV3Factory
The core role of the factory contract is to generate liquidity pool smart contracts. Each pool comprises a pair of two tokens, uniting to create an asset pair, and is associated with a specific fee rate. It's important to emphasize that multiple pools can exist with the same asset pair, distinguished solely by their unique swap fees.
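Since a pool's identity is the combination of the ordered token pair and the fee tier, this can be sketched as a simple lookup key. The `poolKey` helper below is hypothetical and purely illustrative (the real pool identifier is the pool contract address emitted in the PoolCreated event):

```typescript
// A pool is uniquely identified by (token0, token1, fee). Uniswap orders
// the pair so that token0 < token1, which is mirrored here. Multiple fee
// tiers (e.g. 500, 3000, 10000) can exist for the same asset pair.
function poolKey(tokenA: string, tokenB: string, fee: number): string {
  const [token0, token1] =
    tokenA.toLowerCase() < tokenB.toLowerCase()
      ? [tokenA, tokenB]
      : [tokenB, tokenA];
  return `${token0}-${token1}-${fee}`;
}
```

The same pair with a different fee yields a distinct key, matching the fact that several pools can trade the same assets.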
Your Project Manifest File
The Project Manifest file is an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data.
For EVM chains, there are three types of mapping handlers (and you can have more than one in each project):
- BlockHandlers: On each and every block, run a mapping function
- TransactionHandlers: On each and every transaction that matches optional filter criteria, run a mapping function
- LogHandlers: On each and every log that matches optional filter criteria, run a mapping function
In simple terms, there's only one event that requires configuration, and that's the PoolCreated event. After adding this event to the manifest file, it will be represented as follows:
{
dataSources: [
{
kind: EthereumDatasourceKind.Runtime,
startBlock: 12369621,
options: {
// Must be a key of assets
abi: "Factory",
address: "0x1F98431c8aD98523631AE4a59f267346ea31F984",
},
assets: new Map([
["Factory", { file: "./abis/factory.json" }],
["ERC20", { file: "./abis/ERC20.json" }],
["ERC20SymbolBytes", { file: "./abis/ERC20SymbolBytes.json" }],
["ERC20NameBytes", { file: "./abis/ERC20NameBytes.json" }],
["Pool", { file: "./abis/pool.json" }],
]),
mapping: {
file: "./dist/index.js",
handlers: [
{
kind: EthereumHandlerKind.Event,
handler: "handlePoolCreated",
filter: {
topics: [
"PoolCreated(address indexed token0, address indexed token1, uint24 indexed fee, int24 tickSpacing, address pool)",
],
},
},
],
},
},
],
}
Note
Check out our Manifest File documentation to get more information about the Project Manifest (project.ts) file.
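Note that the topics filter above uses a full human-readable signature, including parameter names and indexed keywords; SubQuery normalizes such signatures down to the canonical form before matching logs. A simplified sketch of that normalization is shown below (the SDK's actual implementation may differ):

```typescript
// Reduce a human-readable event signature to its canonical form by
// keeping only the parameter types, e.g.
// "PoolCreated(address indexed token0, ...)" -> "PoolCreated(address,...)".
function normalizeEventSignature(sig: string): string {
  const match = sig.match(/^\s*(\w+)\s*\((.*)\)\s*$/);
  if (!match) throw new Error(`Invalid signature: ${sig}`);
  const [, name, params] = match;
  const types = params
    .split(",")
    // the first token that is not the "indexed" keyword is the type
    .map((p) => p.trim().split(/\s+/).filter((t) => t !== "indexed")[0])
    .filter((t) => t !== undefined && t.length > 0);
  return `${name}(${types.join(",")})`;
}
```

This is why both "address indexed token0" and "indexed address" styles, which appear later in this guide, resolve to the same filter.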
Update Your GraphQL Schema File
The schema.graphql file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
Now, let's consider the entities that we can extract from the factory smart contract for subsequent querying. The most obvious ones include:
- Factory: This entity represents the factory smart contract responsible for creating the Pool smart contracts. As of the publication of this page, there is currently only one active factory smart contract in use.
- Token: This entity identifies the token entity, as pools always involve two tokens.
- Pool: This entity represents the XYK pools, which serve as the primary trading execution mechanism on UniswapV3.
For these entities, the following attributes can be derived from data indexed from raw blockchain logs:
type Factory @entity {
# factory address
id: ID!
# amount of pools created
poolCount: BigInt!
# amount of transactions all time
txCount: BigInt!
...
# current owner of the factory
owner: ID!
}
type Token @entity {
# token address
id: ID!
# token symbol
symbol: String!
# token name
name: String!
# token decimals
decimals: BigInt!
# token total supply
totalSupply: BigInt!
...
# pools token is in that are white listed for USD pricing
# Should be Pool
# whitelistPools: [Pool!]!
# derived fields
tokenDayData: [TokenDayData!]! @derivedFrom(field: "token")
}
type Pool @entity {
# pool address
id: ID!
# creation
createdAtTimestamp: BigInt!
# block pool was created at
createdAtBlockNumber: BigInt!
# token0
token0: Token
# token0: [Token!] @derivedFrom(field: "id")
token1: Token
# token1: [Token!] @derivedFrom(field: "id")
# current tick
tick: BigInt
# current observation index
observationIndex: BigInt!
# all time token0 swapped
volumeToken0: Float!
...
mints: [Mint!]! @derivedFrom(field: "pool")
burns: [Burn!]! @derivedFrom(field: "pool")
swaps: [Swap!]! @derivedFrom(field: "pool")
collects: [Collect!]! @derivedFrom(field: "pool")
ticks: [Tick!]! @derivedFrom(field: "pool")
}
Note
The attributes mentioned above represent only a subset of the available attributes. For a complete list and detailed documentation, please refer to the final code.
As you explore these attributes, you may notice the relationship between the Pool and Token entities. Additionally, you'll find numerous derived attributes like mints or swaps.
Note
Importantly, these relationships can not only establish one-to-many connections but also extend to include many-to-many associations. To delve deeper into entity relationships, you can refer to this section. If you prefer a more example-based approach, our dedicated Hero Course Module can provide further insights.
SubQuery simplifies and ensures type-safety when working with GraphQL entities, smart contracts, events, transactions, and logs. The SubQuery CLI will generate types based on your project's GraphQL schema and any contract ABIs included in the data sources.
yarn codegen
npm run-script codegen
This action will generate a new directory (or update the existing one) named src/types. Inside this directory, you will find automatically generated entity classes corresponding to each type defined in your schema.graphql. These classes facilitate type-safe operations for loading, reading, and writing entity fields. You can learn more about this process in the GraphQL Schema section.
It will also generate a class for every contract event, offering convenient access to event parameters, as well as information about the block and transaction from which the event originated. You can find detailed information on how this is achieved in the EVM Codegen from ABIs section. All of these types are stored in the src/types/abi-interfaces and src/types/contracts directories.
You can conveniently import all these types:
// Import entity types generated from the GraphQL schema
import {
Pool,
Token,
Factory,
...
} from "../types";
import { EthereumLog } from "@subql/types-ethereum";
import { PoolCreatedEvent } from "../types/contracts/Factory";
Add a Mapping Function
Mapping functions define how blockchain data is transformed into the optimised GraphQL entities that we previously defined in the schema.graphql
file.
Note
For more information on mapping functions, please refer to our Mappings documentation.
Pool, Factory, and Token are models that were generated in a prior step. PoolCreatedEvent and EthereumLog are TypeScript models generated by the SubQuery SDK to facilitate event handling.
As a reminder from the configuration step outlined in the Manifest File, we have a single handler called handlePoolCreated
. Now, let's proceed with its implementation:
export async function handlePoolCreated(
event: EthereumLog<PoolCreatedEvent["args"]>
): Promise<void> {
let factory = await Factory.get(FACTORY_ADDRESS);
if (factory === undefined) {
factory = Factory.create({
id: FACTORY_ADDRESS,
poolCount: ZERO_BI,
totalVolumeETH: 0,
...
});
}
let [token0, token1] = await Promise.all([
Token.get(event.args.token0),
Token.get(event.args.token1),
]);
// fetch token info if null
if (token0 === undefined) {
const [symbol, name, totalSupply, decimals] = await Promise.all([
fetchTokenSymbol(event.args.token0),
fetchTokenName(event.args.token0),
fetchTokenTotalSupply(event.args.token0).then((r) => r.toBigInt()),
fetchTokenDecimals(event.args.token0),
]);
// bail if we couldn't figure out the decimals
if (!decimals) {
return;
}
token0 = Token.create({
id: event.args.token0,
symbol,
name,
totalSupply,
...
});
}
if (token1 === undefined) {
const [symbol, name, totalSupply, decimals] = await Promise.all([
fetchTokenSymbol(event.args.token1),
fetchTokenName(event.args.token1),
fetchTokenTotalSupply(event.args.token1).then((r) => r.toBigInt()),
fetchTokenDecimals(event.args.token1),
]);
// bail if we couldn't figure out the decimals
if (!decimals) {
return;
}
token1 = Token.create({
id: event.args.token1,
symbol,
name,
totalSupply,
...
});
}
factory.poolCount = factory.poolCount + ONE_BI;
const pool = Pool.create({
id: event.args.pool,
token0Id: token0.id,
token1Id: token1.id,
...
});
await Promise.all([
token0.save(),
token1.save(), // create the tracked contract based on the template
pool.save(),
factory.save(),
]);
}
Explaining the code provided above, the handlePoolCreated function accepts an Ethereum event object as its input. This function serves the purpose of capturing essential information when a new pool is created on the blockchain. Here's a breakdown of its key steps:
- Factory Object Retrieval: Initially, the function tries to retrieve a Factory object. If a Factory object is not found or is undefined, it proceeds to create a new Factory object with default initial values.
- Token Information Retrieval: Following that, the function fetches information about the two tokens, token0 and token1.
Note
Throughout this mapping and those that follow, numerous utility functions are employed to process the data. In this specific example, these utility functions are stored in the utils
directory. If you're interested in understanding how they work, you can refer to the final code.
- Data Persistence: To ensure the collected data persists, the function saves the modifications made to the Token, Pool, and Factory objects. This typically entails storing the data in a database or data store.
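The token lookups in this handler rely on utility functions such as fetchTokenSymbol. One wrinkle they handle: some older ERC-20s (MKR, for example) return symbol() as bytes32 rather than string, which is why the ERC20SymbolBytes ABI is included in the manifest. Below is a minimal, illustrative sketch of the decode step such a fallback performs; the helper name is an assumption, not the project's actual code:

```typescript
// Decode a right-zero-padded bytes32 value (as a hex string) into ASCII,
// the fallback path for tokens whose symbol()/name() return bytes32.
function decodeBytes32String(hex: string): string {
  const bytes = hex.replace(/^0x/, "");
  let out = "";
  for (let i = 0; i < bytes.length; i += 2) {
    const code = parseInt(bytes.slice(i, i + 2), 16);
    if (code === 0) break; // stop at the zero padding
    out += String.fromCharCode(code);
  }
  return out;
}
```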
🎉 At this point, you have successfully crafted the handling logic for the factory smart contract and populated queryable entities like Token, Pool, and Factory. This means you can now proceed to the building process to test the indexer's functionality up to this point.
Pool Smart Contracts
As we discussed in the introduction of Setting Up the Indexer, a new contract is created by the factory contract for each newly created pool.
Your Project Manifest File
The Project Manifest file is an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data.
For EVM chains, there are three types of mapping handlers (and you can have more than one in each project):
- BlockHandlers: On each and every block, run a mapping function
- TransactionHandlers: On each and every transaction that matches optional filter criteria, run a mapping function
- LogHandlers: On each and every log that matches optional filter criteria, run a mapping function
The factory contract generates a fresh contract instance for each new pool; therefore, we use dynamic data sources to create an indexer for each new contract:
{
templates: [
{
kind: EthereumDatasourceKind.Runtime,
name: "Pool",
options: {
abi: "Pool",
},
assets: new Map([
["Pool", { file: "./abis/pool.json" }],
["ERC20", { file: "./abis/ERC20.json" }],
["Factory", { file: "./abis/factory.json" }],
]),
mapping: {
file: "./dist/index.js",
handlers: [
{
kind: EthereumHandlerKind.Event,
handler: "handleInitialize",
filter: {
topics: ["Initialize (uint160,int24)"],
},
},
{
kind: EthereumHandlerKind.Event,
handler: "handleSwap",
filter: {
topics: [
"Swap (address sender, address recipient, int256 amount0, int256 amount1, uint160 sqrtPriceX96, uint128 liquidity, int24 tick)",
],
},
},
{
kind: EthereumHandlerKind.Event,
handler: "handleMint",
filter: {
topics: [
"Mint(address sender, address owner, int24 tickLower, int24 tickUpper, uint128 amount, uint256 amount0, uint256 amount1)",
],
},
},
{
kind: EthereumHandlerKind.Event,
handler: "handleBurn",
filter: {
topics: [
"Burn(indexed address,indexed int24,indexed int24,uint128,uint256,uint256)",
],
},
},
{
kind: EthereumHandlerKind.Event,
handler: "handleFlash",
filter: {
topics: [
"Flash(indexed address,indexed address,uint256,uint256,uint256,uint256)",
],
},
},
],
},
},
],
}
Update Your GraphQL Schema File
The schema.graphql file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
Numerous entities can be derived from each newly created pool smart contract. To highlight some of the most crucial ones, you'll need to extend the schema.graphql
file with the following entities:
type Mint @entity {
# transaction hash + "#" + index in mints Transaction array
id: ID!
# which txn the mint was included in
transaction: Transaction!
# time of txn
timestamp: BigInt!
# pool position is within
pool: Pool!
# allow indexing by tokens
token0: Token!
# allow indexing by tokens
token1: Token!
...
# order within the txn
logIndex: BigInt
}
type Burn @entity {
# transaction hash + "#" + index in mints Transaction array
id: ID!
# txn burn was included in
transaction: Transaction!
# pool position is within
pool: Pool!
# allow indexing by tokens
token0: Token!
# allow indexing by tokens
token1: Token!
...
# position within the transactions
logIndex: BigInt
}
type Swap @entity {
# transaction hash + "#" + index in swaps Transaction array
id: ID!
# pointer to transaction
transaction: Transaction!
# timestamp of transaction
timestamp: BigInt!
# pool swap occured within
pool: Pool!
# allow indexing by tokens
token0: Token!
# allow indexing by tokens
token1: Token!
...
# index within the txn
logIndex: BigInt
}
type Collect @entity {
# transaction hash + "#" + index in collect Transaction array
id: ID!
# pointer to txn
transaction: Transaction!
# timestamp of event
timestamp: BigInt!
# pool collect occured within
pool: Pool!
...
# index within the txn
logIndex: BigInt
}
type Flash @entity {
# transaction hash + "-" + index in collect Transaction array
id: ID!
# pointer to txn
transaction: Transaction!
# timestamp of event
timestamp: BigInt!
# pool collect occured within
pool: Pool!
...
# index within the txn
logIndex: BigInt
}
type Transaction @entity {
# txn hash
id: ID!
# block txn was included in
blockNumber: BigInt!
# timestamp txn was confirmed
timestamp: BigInt!
# gas used during txn execution
gasUsed: BigInt!
gasPrice: BigInt!
# derived values
mints: [Mint]! @derivedFrom(field: "transaction")
burns: [Burn]! @derivedFrom(field: "transaction")
swaps: [Swap]! @derivedFrom(field: "transaction")
flashed: [Flash]! @derivedFrom(field: "transaction")
collects: [Collect]! @derivedFrom(field: "transaction")
}
Similar to the previously imported entities, we observe various relationships here. In this case, each new entity references both the Token and Pool entities, establishing a one-to-one relationship. Additionally, each new entity references a Transaction entity, which is the only one among the newly added entities not derived from logs. Instead, it's derived from the transaction in which the event occurred, showcasing the capabilities of the SubQuery SDK.
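To illustrate how a Transaction record is filled from the context that every handled log carries, here is a simplified stand-in for the project's loadTransaction utility. The field shapes are assumptions for the sketch (plain numbers are used for brevity; the real entity stores BigInt values and is created via the generated Transaction class):

```typescript
// Shape mirroring the Transaction entity defined in schema.graphql
// (numbers are used here for brevity; the entity stores BigInt values).
interface TransactionRecord {
  id: string; // txn hash
  blockNumber: number;
  timestamp: number;
  gasUsed: number;
  gasPrice: number;
}

// Build a Transaction record from the block/transaction context that
// every handled log carries with it.
function transactionFromLog(log: {
  transactionHash: string;
  blockNumber: number;
  block: { timestamp: number };
  transaction: { gasUsed: number; gasPrice: number };
}): TransactionRecord {
  return {
    id: log.transactionHash,
    blockNumber: log.blockNumber,
    timestamp: log.block.timestamp,
    gasUsed: log.transaction.gasUsed,
    gasPrice: log.transaction.gasPrice,
  };
}
```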
SubQuery simplifies and ensures type-safety when working with GraphQL entities, smart contracts, events, transactions, and logs. The SubQuery CLI will generate types based on your project's GraphQL schema and any contract ABIs included in the data sources.
yarn codegen
npm run-script codegen
This action will generate a new directory (or update the existing one) named src/types. Inside this directory, you will find automatically generated entity classes corresponding to each type defined in your schema.graphql. These classes facilitate type-safe operations for loading, reading, and writing entity fields. You can learn more about this process in the GraphQL Schema section.
It will also generate a class for every contract event, offering convenient access to event parameters, as well as information about the block and transaction from which the event originated. You can find detailed information on how this is achieved in the EVM Codegen from ABIs section. All of these types are stored in the src/types/abi-interfaces and src/types/contracts directories.
You can conveniently import all these types:
import { Burn, Mint, Swap } from "../types";
import {
InitializeEvent,
MintEvent,
BurnEvent,
SwapEvent,
FlashEvent,
} from "../types/contracts/Pool";
Add a Mapping Function
Mapping functions define how blockchain data is transformed into the optimised GraphQL entities that we previously defined in the schema.graphql
file.
In this scenario, the mapping process involves two substeps:
Enabling Handling of Newly Created Smart Contracts
To ensure that all the events mentioned above are handled for any newly created pool smart contract, you'll need to make a minor update to the factory contract mapping in the factory.ts file. Once you've executed subql codegen, you can include the following import in that file:
import { createPoolDatasource } from "../types";
Then, within handlePoolCreated, you will have to add the following:
await createPoolDatasource({
address: event.args.pool,
});
After adding the above code to the factory smart contract's handler, you can be sure that the listed events of all the pool smart contracts it produces will be handled.
Writing Pool Smart Contracts Handlers
The mapping functions for the new entities are much longer; therefore, in this example, we will only partially incorporate the mapping of the handleSwap handler:
export async function handleSwap(
event: EthereumLog<SwapEvent["args"]>
): Promise<void> {
const poolContract = Pool__factory.connect(event.address, api);
const [
factory,
pool,
transaction,
ethPrice,
feeGrowthGlobal0X128,
feeGrowthGlobal1X128,
] = await Promise.all([
Factory.get(FACTORY_ADDRESS),
Pool.get(event.address),
loadTransaction(event),
getEthPriceInUSD(),
poolContract.feeGrowthGlobal0X128(),
poolContract.feeGrowthGlobal1X128(),
]);
const [token0, token1] = await Promise.all([
Token.get(pool.token0Id),
Token.get(pool.token1Id),
]);
const oldTick = pool.tick;
...
// global updates
factory.txCount = factory.txCount + ONE_BI; //BigNumber.from(factory.txCount).add(ONE_BI).toBigInt()
factory.totalVolumeETH =
factory.totalVolumeETH + amountTotalETHTracked.toNumber(); //BigNumber.from(factory.totalVolumeETH).add(amountTotalETHTracked).toNumber()
// updated pool rates
const prices = sqrtPriceX96ToTokenPrices(pool.sqrtPrice, token0, token1);
pool.token0Price = prices[0];
pool.token1Price = prices[1];
// create Swap event
// const transaction = await loadTransaction(event)
const swap = Swap.create({
id: transaction.id + "#" + pool.txCount.toString(),
transactionId: transaction.id,
timestamp: transaction.timestamp,
poolId: pool.id,
token0Id: pool.token0Id,
token1Id: pool.token1Id,
sender: event.args.sender,
origin: event.transaction.from,
recipient: event.args.recipient,
amount0: amount0.toNumber(),
amount1: amount1.toNumber(),
amountUSD: amountTotalUSDTracked.toNumber(),
tick: BigInt(event.args.tick),
sqrtPriceX96: event.args.sqrtPriceX96.toBigInt(),
logIndex: BigInt(event.logIndex),
});
await Promise.all([
swap.save(),
factory.save(),
pool.save(),
token0.save(),
token1.save(),
]);
}
To provide a quick overview of the code above: the function is named handleSwap and accepts an Ethereum event object (event) as its parameter. It then proceeds to execute several asynchronous operations concurrently using Promise.all(). These operations involve fetching data related to the factory, pool, transaction, Ethereum price in USD, and fee growth data from the pool contract. Similarly to the previous step, this code retrieves information about two tokens (token0 and token1) based on their IDs stored in the pool object.
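Note how the Swap entity's id is built: the schema comments describe it as transaction hash + "#" + an index, and the handler uses the pool's transaction count as that index. The scheme can be sketched as follows (the helper name is hypothetical, for illustration only):

```typescript
// Compose a unique id for per-event entities (Swap, Mint, Burn, ...)
// so that several events within one transaction do not collide.
function eventEntityId(txHash: string, index: number): string {
  return `${txHash}#${index}`;
}
```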
The code also performs updates to specific global statistics associated with the factory. This includes incrementing the transaction count (txCount) and the total volume of ETH traded based on the data acquired during the swap event.
Note
For simplicity's sake, we won't delve into a comprehensive explanation of the global statistics metrics. However, in the final code, you'll find entities with names like PoolDayData, PoolHourData, TickHourData, TickDayData, and TokenHourData, and their names provide self-explanatory context.
Furthermore, the code calculates and updates token prices within the pool using the square root price (sqrtPrice) of the pool and the tokens involved in the swap. A new Swap entity is generated to record the details of the swap transaction.
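The price derivation follows from Uniswap v3's representation: sqrtPriceX96 is the square root of the raw token1/token0 price scaled by 2^96, so squaring it and adjusting for token decimals recovers a human-readable ratio. Below is a simplified floating-point sketch; the project's sqrtPriceX96ToTokenPrices uses big-number math to avoid precision loss, and which ratio maps to token0Price versus token1Price follows the convention in the final code:

```typescript
const Q96 = Math.pow(2, 96);

// Convert a pool's sqrtPriceX96 into the two decimal-adjusted prices.
// ratio = (sqrtPriceX96 / 2^96)^2 is the raw token1/token0 price.
function sqrtPriceX96ToPrices(
  sqrtPriceX96: number,
  decimals0: number,
  decimals1: number
): [number, number] {
  const ratio = Math.pow(sqrtPriceX96 / Q96, 2);
  const price1 = ratio * Math.pow(10, decimals0 - decimals1);
  const price0 = price1 === 0 ? 0 : 1 / price1;
  return [price0, price1];
}
```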
Finally, the function saves the updated data for the swap, factory, pool, token0, and token1 objects to ensure that the changes persist in a database or data store.
🎉 At this stage, you've crafted handling logic for both the factory smart contract and all the pool smart contracts it creates. Additionally, you've populated the project with more entities like Swap, Burn, and Mint, making them queryable. Once again, you can proceed to the building process to test how the indexer operates up to this point.
NonfungiblePositionManager
As you may already know, swaps in UniswapV3 are executed within the context of pools. To enable swaps, these pools must be liquid, and users provide liquidity to each specific pool. Each liquidity provision results in a Liquidity Position, essentially an NFT. This design enables a broader range of DeFi use cases. And the contract responsible for managing these provisions is known as the NonfungiblePositionManager.
Your Project Manifest File
The Project Manifest file is an entry point to your project. It defines most of the details on how SubQuery will index and transform the chain data.
For EVM chains, there are three types of mapping handlers (and you can have more than one in each project):
- BlockHandlers: On each and every block, run a mapping function
- TransactionHandlers: On each and every transaction that matches optional filter criteria, run a mapping function
- LogHandlers: On each and every log that matches optional filter criteria, run a mapping function
For the NonfungiblePositionManager smart contract, we want to introduce the following updates to the manifest file:
{
dataSources: [
{
kind: EthereumDatasourceKind.Runtime,
startBlock: 12369651,
options: {
// Must be a key of assets
abi: "NonfungiblePositionManager",
address: "0xC36442b4a4522E871399CD717aBDD847Ab11FE88",
},
assets: new Map([
[
"NonfungiblePositionManager",
{ file: "./abis/NonfungiblePositionManager.json" },
],
["Pool", { file: "./abis/pool.json" }],
["ERC20", { file: "./abis/ERC20.json" }],
["Factory", { file: "./abis/factory.json" }],
]),
mapping: {
file: "./dist/index.js",
handlers: [
{
kind: EthereumHandlerKind.Event,
handler: "handleIncreaseLiquidity",
filter: {
topics: [
"IncreaseLiquidity (uint256 tokenId, uint128 liquidity, uint256 amount0, uint256 amount1)",
],
},
},
{
kind: EthereumHandlerKind.Event,
handler: "handleDecreaseLiquidity",
filter: {
topics: [
"DecreaseLiquidity (uint256 tokenId, uint128 liquidity, uint256 amount0, uint256 amount1)",
],
},
},
{
kind: EthereumHandlerKind.Event,
handler: "handleCollect",
filter: {
topics: [
"Collect (uint256 tokenId, address recipient, uint256 amount0, uint256 amount1)",
],
},
},
{
kind: EthereumHandlerKind.Event,
handler: "handleTransfer",
filter: {
topics: ["Transfer (address from, address to, uint256 tokenId)"],
},
},
],
},
},
],
}
The configuration process closely resembles what we've seen earlier. However, we now have a completely new smart contract that we'll be handling events from. This entails different ABI, address, and start block values. Naturally, it also introduces new events, which are listed under the handlers object.
Update Your GraphQL Schema File
The schema.graphql file determines the shape of your data from SubQuery due to the mechanism of the GraphQL query language. Hence, updating the GraphQL Schema file is the perfect place to start. It allows you to define your end goal right at the start.
From this smart contract, the only new entity we'll emphasize is the Position:
type Position @entity {
# Positions created through NonfungiblePositionManager
# NFT token id
id: ID!
# owner of the NFT
owner: String!
# pool position is within
pool: Pool!
# allow indexing by tokens
token0: Token!
# allow indexing by tokens
token1: Token!
...
# tx in which the position was initialized
transaction: Transaction!
# vars needed for fee computation
feeGrowthInside0LastX128: BigInt!
feeGrowthInside1LastX128: BigInt!
}
SubQuery simplifies and ensures type-safety when working with GraphQL entities, smart contracts, events, transactions, and logs. The SubQuery CLI will generate types based on your project's GraphQL schema and any contract ABIs included in the data sources.
yarn codegen
npm run-script codegen
This action will generate a new directory (or update the existing one) named src/types. Inside this directory, you will find automatically generated entity classes corresponding to each type defined in your schema.graphql. These classes facilitate type-safe operations for loading, reading, and writing entity fields. You can learn more about this process in the GraphQL Schema section.
It will also generate a class for every contract event, offering convenient access to event parameters, as well as information about the block and transaction from which the event originated. You can find detailed information on how this is achieved in the EVM Codegen from ABIs section. All of these types are stored in the src/types/abi-interfaces and src/types/contracts directories.
You can conveniently import all these types:
import { Position } from "../types";
import {
IncreaseLiquidityEvent,
DecreaseLiquidityEvent,
CollectEvent,
TransferEvent,
} from "../types/contracts/NonfungiblePositionManager";
Add a Mapping Function
Mapping functions define how blockchain data is transformed into the optimised GraphQL entities that we previously defined in the schema.graphql
file.
For this contract, we will craft the mappings in a file named position-manager.ts. Once again, this separation provides context and clarity.
The mapping functions in this file can be extensive. In this example, we will provide a partial implementation of the mapping for the handleIncreaseLiquidity handler:
export async function handleIncreaseLiquidity(
event: EthereumLog<IncreaseLiquidityEvent["args"]>,
): Promise<void> {
const position = await getPosition(event, event.args.tokenId);
const [token0, token1] = await Promise.all([
Token.get(position.token0Id),
Token.get(position.token1Id),
]);
await updateFeeVars(position, event, event.args.tokenId);
await position.save();
await savePositionSnapshot(position, event);
}
async function getPosition(
event: EthereumLog,
tokenId: BigNumber,
): Promise<Position | null> {
let position = await Position.get(tokenId.toString());
if (position === undefined) {
const contract = NonfungiblePositionManager__factory.connect(
event.address,
api,
);
let positionResult;
try {
positionResult = await contract.positions(tokenId);
} catch (e) {
logger.warn(
`Contract ${event.address}, could not get position with tokenId ${tokenId}`,
);
return null;
}
const [poolAddress, transaction] = await Promise.all([
factoryContract.getPool(
positionResult[2],
positionResult[3],
positionResult[4],
),
loadTransaction(event),
]);
position = Position.create({
id: tokenId.toString(),
owner: ADDRESS_ZERO,
poolId: poolAddress,
token0Id: positionResult[2],
token1Id: positionResult[3],
tickLowerId: `${poolAddress}#${positionResult[5].toString()}`,
tickUpperId: `${poolAddress}#${positionResult[6].toString()}`,
liquidity: ZERO_BI,
depositedToken0: 0, //ZERO_BD.toNumber(),
depositedToken1: 0, //ZERO_BD.toNumber(),
withdrawnToken0: 0, //ZERO_BD.toNumber(),
withdrawnToken1: 0, //ZERO_BD.toNumber(),
collectedFeesToken0: 0, //ZERO_BD.toNumber(),
collectedFeesToken1: 0, //ZERO_BD.toNumber(),
transactionId: transaction.id,
feeGrowthInside0LastX128: positionResult[8].toBigInt(),
feeGrowthInside1LastX128: positionResult[9].toBigInt(),
});
}
return position;
}
To briefly clarify the code provided above: the handler function handleIncreaseLiquidity is responsible for managing an increase-in-liquidity event on the blockchain. Initially, it invokes the getPosition function to either retrieve an existing liquidity position associated with the event's tokenId or create a new one if necessary. Any modifications made to the Position object during this process are saved to ensure persistence.
🎉 In conclusion, we have successfully incorporated all the desired entities that can be retrieved from various smart contracts. For each of these entities, we've created mapping handlers to structure and store the data in a queryable format.
Note
Check the final code repository here to observe the integration of all previously mentioned configurations into a unified codebase.
Build Your Project
Next, build your work to run your new SubQuery project. Run the build command from the project's root directory as given here:
yarn build
npm run-script build
Important
Whenever you make changes to your mapping functions, you must rebuild your project.
Now, you are ready to run your first SubQuery project. Let’s check out the process of running your project in detail.
Whenever you create a new SubQuery project, you should first run it locally on your computer to test it, and using Docker is the easiest and quickest way to do this.
Run Your Project Locally with Docker
The docker-compose.yml
file defines all the configurations that control how a SubQuery node runs. For a new project, which you have just initialised, you won't need to change anything.
However, visit the Running SubQuery Locally page to get more information on the file and the settings.
Run the following command under the project directory:
yarn start:docker
npm run-script start:docker
Note
It may take a few minutes to download the required images and start the various nodes and Postgres databases.
Query your Project
Next, let's query our project. Follow these three simple steps to query your SubQuery project:

1. Open your browser and head to http://localhost:3000.
2. You will see a GraphQL playground in the browser, with the schemas ready to query. Find the Docs tab on the right side of the playground, which opens a documentation drawer. This documentation is automatically generated and helps you find the entities and methods you can query.
3. Try the following queries to understand how it works for your new SubQuery starter project. Don't forget to learn more about the GraphQL Query language.
Pools
Request
query {
pools(first: 5) {
nodes {
token0Id
token1Id
token0Price
token1Price
txCount
volumeToken0
volumeToken1
}
}
}
Response
{
"data": {
"pools": {
"nodes": [
{
"token0Id": "0x4a220E6096B25EADb88358cb44068A3248254675",
"token1Id": "0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2",
"token0Price": 0,
"token1Price": 0,
"txCount": "88",
"volumeToken0": 7035,
"volumeToken1": 61
},
{
"token0Id": "0x7Ef7AdaE450e33B4187fe224cAb1C45d37f7c411",
"token1Id": "0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2",
"token0Price": 0,
"token1Price": 0,
"txCount": "1",
"volumeToken0": 0,
"volumeToken1": 0
},
{
"token0Id": "0x1337DEF16F9B486fAEd0293eb623Dc8395dFE46a",
"token1Id": "0x6B175474E89094C44Da98b954EedeAC495271d0F",
"token0Price": 1,
"token1Price": 1,
"txCount": "34",
"volumeToken0": 15588,
"volumeToken1": 17059
}
]
}
}
}
Swaps
Request
{
swaps(first: 3) {
nodes {
token0Id
token1Id
amount0
amount1
id
amountUSD
timestamp
transactionId
}
}
}
Response
{
"data": {
"swaps": {
"nodes": [
{
"token0Id": "0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48",
"token1Id": "0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2",
"amount0": -19973,
"amount1": 5,
"id": "0xa670c80538614ce0a2bd3f8071b3c9b51ae4a7a72d9c6405212118895ebe741b#2144",
"amountUSD": 0,
"timestamp": "1620402695",
"transactionId": "0xa670c80538614ce0a2bd3f8071b3c9b51ae4a7a72d9c6405212118895ebe741b"
},
{
"token0Id": "0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2",
"token1Id": "0xf65B5C5104c4faFD4b709d9D60a185eAE063276c",
"amount0": 0,
"amount1": 2000,
"id": "0x44be483f706bb88213a065acdbd9bcbafc3eee68fd95f7bcb7e88808e777bf87#376",
"amountUSD": 0,
"timestamp": "1620269842",
"transactionId": "0x44be483f706bb88213a065acdbd9bcbafc3eee68fd95f7bcb7e88808e777bf87"
},
{
"token0Id": "0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48",
"token1Id": "0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2",
"amount0": -1043,
"amount1": 0,
"id": "0x36f0f7de5a8ba5a2b6556917d329fb9631b70efd4ada284d504b52cf74fc4d98#2916",
"amountUSD": 0,
"timestamp": "1620433130",
"transactionId": "0x36f0f7de5a8ba5a2b6556917d329fb9631b70efd4ada284d504b52cf74fc4d98"
}
]
}
}
}
Positions
Request
{
positions(first: 3) {
nodes {
id
liquidity
tickLowerId
tickUpperId
token0Id
token1Id
transactionId
}
}
}
Response
{
"data": {
"positions": {
"nodes": [
{
"id": "5096",
"liquidity": "1406248200435775",
"tickLowerId": "0xe6868579CA50EF3F0d02d003E6D3e45240efCB35#-129150",
"tickUpperId": "0xe6868579CA50EF3F0d02d003E6D3e45240efCB35#-129140",
"token0Id": "0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2",
"token1Id": "0xD46bA6D942050d489DBd938a2C909A5d5039A161",
"transactionId": "0xb097dc1b4eeafd7e770ca5f8de66ffb8117a64dee3e00dd66ae2ee8acf1deb30"
},
{
"id": "1567",
"liquidity": "0",
"tickLowerId": "0x0000000000000000000000000000000000000000#193380",
"tickUpperId": "0x0000000000000000000000000000000000000000#196260",
"token0Id": "0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48",
"token1Id": "0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2",
"transactionId": "0x112f8d32c821a10d91d333738b7b5f5db8ce19ed7c7d7bea5fd1d5f20f39816e"
},
{
"id": "6480",
"liquidity": "105107889867915697479",
"tickLowerId": "0x0000000000000000000000000000000000000000#-42600",
"tickUpperId": "0x0000000000000000000000000000000000000000#-40200",
"token0Id": "0xB6Ca7399B4F9CA56FC27cBfF44F4d2e4Eef1fc81",
"token1Id": "0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2",
"transactionId": "0x714cb4b06338185106289ad291cc787cacabda030747d7870f9830b91e503347"
}
]
}
}
}
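Notice that the tickLowerId and tickUpperId values in the response combine the pool address and the tick index, joined by a # separator. The small helper below splits such an id back into its parts; the "<pool>#<tick>" layout is inferred from the sample data above, and the helper is not part of the project's code.

```typescript
// Split a tick entity id such as "0xe686...CB35#-129150" into its parts.
// The "<pool>#<tick>" layout is inferred from the sample response above.
function parseTickId(id: string): { pool: string; tick: number } {
  const sep = id.indexOf("#");
  if (sep === -1) throw new Error(`unexpected tick id: ${id}`);
  return {
    pool: id.slice(0, sep),          // pool contract address
    tick: Number(id.slice(sep + 1)), // signed tick index
  };
}
```

Composite ids like this are a common pattern in indexer schemas: they keep entity ids unique while still encoding the parent relationship directly in the key.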
uniswapDayData (statistical data)
Request
{
uniswapDayData(last: 3) {
nodes {
volumeUSD
volumeETH
txCount
date
feesUSD
}
}
}
Response
{
"data": {
"uniswapDayData": {
"nodes": [
{
"volumeUSD": 0,
"volumeETH": 0,
"txCount": "174",
"date": 1620237271,
"feesUSD": 0
},
{
"volumeUSD": 0,
"volumeETH": 0,
"txCount": "12741",
"date": 1620291964,
"feesUSD": 0
},
{
"volumeUSD": 0,
"volumeETH": 0,
"txCount": "10396",
"date": 1620282149,
"feesUSD": 0
}
]
}
}
}
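Beyond the playground, the same endpoint accepts ordinary GraphQL-over-HTTP POST requests, so any application can consume the indexed data. The sketch below builds such a request for the pools query shown earlier; the endpoint URL matches the local Docker setup described above, and buildRequest is a hypothetical helper for illustration, not part of SubQuery.

```typescript
// Build the POST request body that the GraphQL playground sends under
// the hood. ENDPOINT matches the local setup described above.
const ENDPOINT = "http://localhost:3000";

const poolsQuery = `
  query {
    pools(first: 5) {
      nodes { token0Id token1Id txCount }
    }
  }`;

// Hypothetical helper: wrap a query string in a fetch-compatible request.
function buildRequest(query: string): {
  method: string;
  headers: Record<string, string>;
  body: string;
} {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query }),
  };
}

// With the Docker stack running, you could send it like this:
// const res = await fetch(ENDPOINT, buildRequest(poolsQuery));
// const { data } = await res.json();
```

This is the same request shape any GraphQL client library produces, so you can equally point Apollo Client or a similar tool at the endpoint.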
What's next?
Congratulations! You now have a locally running SubQuery project that accepts GraphQL API requests for serving data.
Tip
Find out how to build a performant SubQuery project and avoid common mistakes in Project Optimisation.
Click here to learn what should be your next step in your SubQuery journey.